feat: Add support for MLX #5872
base: master
Conversation
…stributed as MLXClient enum to mirror Ollama client functionality, leveraging MLX API endpoints and types.
@albidev is attempting to deploy a commit to the GitButler Team on Vercel. A member of the Team first needs to authorize it.
Hi @albidev, I'll take a read through when I'm back at work. Big thanks for the contribution!
Thanks, @Caleb-T-Owens! Let me know if you need anything when you're back.
@albidev could you let me know how I can run an mlx model? I've made a small patch to the settings page:

diff --git a/apps/desktop/src/routes/settings/ai/+page.svelte b/apps/desktop/src/routes/settings/ai/+page.svelte
index 4a2bef730..16115bab3 100644
--- a/apps/desktop/src/routes/settings/ai/+page.svelte
+++ b/apps/desktop/src/routes/settings/ai/+page.svelte
@@ -16,6 +16,7 @@
import SectionCard from '@gitbutler/ui/SectionCard.svelte';
import Spacer from '@gitbutler/ui/Spacer.svelte';
import Textbox from '@gitbutler/ui/Textbox.svelte';
+ import { platform } from '@tauri-apps/plugin-os';
import { onMount, tick } from 'svelte';
import { run } from 'svelte/legacy';
@@ -172,6 +173,8 @@
run(() => {
if (form) form.modelKind.value = modelKind;
});
+
+ const mlxEnabled = platform() === 'macos';
</script>
<SettingsPage title="AI options">
@@ -321,7 +324,7 @@
<SectionCard
roundedTop={false}
- roundedBottom={modelKind !== ModelKind.Ollama}
+ roundedBottom={modelKind !== ModelKind.Ollama && !mlxEnabled}
orientation="row"
labelFor="ollama"
bottomBorder={modelKind !== ModelKind.Ollama}
@@ -334,7 +337,7 @@
{/snippet}
</SectionCard>
{#if modelKind === ModelKind.Ollama}
- <SectionCard roundedTop={false} orientation="row" topDivider>
+ <SectionCard roundedTop={false} orientation="row" topDivider roundedBottom={!mlxEnabled}>
<div class="inputs-group">
<Textbox
label="Endpoint"
@@ -347,31 +350,38 @@
</SectionCard>
{/if}
- <SectionCard
- roundedBottom={modelKind !== ModelKind.MLX}
- orientation="row"
- labelFor="mlx"
- bottomBorder={modelKind !== ModelKind.MLX}
- >
- {#snippet title()}
- MLX
- {/snippet}
- {#snippet actions()}
- <RadioButton name="modelKind" id="custom" value={ModelKind.MLX} />
- {/snippet}
- </SectionCard>
- {#if modelKind === ModelKind.MLX}
- <SectionCard roundedTop={false} orientation="row" topDivider>
- <div class="inputs-group">
- <Textbox
- label="Endpoint"
- bind:value={mlxEndpoint}
- placeholder="http://localhost:8080"
- />
-
- <Textbox label="Model" bind:value={mlxModel} placeholder="mlx-community/Llama-3.2-3B-Instruct-4bit" />
- </div>
+ {#if mlxEnabled}
+ <SectionCard
+ roundedTop={false}
+ roundedBottom={modelKind !== ModelKind.MLX}
+ orientation="row"
+ labelFor="mlx"
+ bottomBorder={modelKind !== ModelKind.MLX}
+ >
+ {#snippet title()}
+ MLX
+ {/snippet}
+ {#snippet actions()}
+ <RadioButton name="modelKind" id="mlx" value={ModelKind.MLX} />
+ {/snippet}
</SectionCard>
+ {#if modelKind === ModelKind.MLX}
+ <SectionCard roundedTop={false} orientation="row" topDivider>
+ <div class="inputs-group">
+ <Textbox
+ label="Endpoint"
+ bind:value={mlxEndpoint}
+ placeholder="http://localhost:8080"
+ />
+
+ <Textbox
+ label="Model"
+ bind:value={mlxModel}
+ placeholder="mlx-community/Llama-3.2-3B-Instruct-4bit"
+ />
+ </div>
+ </SectionCard>
+ {/if}
{/if}
</form>
@Caleb-T-Owens, in order to run the mlx server you need to install mlx first and then run:
Useful mlx server documentation. Let me know if you need further clarification 😊
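The command itself did not survive the page extraction. As a hedged sketch only: one common way to serve an MLX model behind an HTTP endpoint is the `mlx_lm.server` module from the `mlx-lm` package (assumption: this is the server the comment refers to; the model name and port below simply mirror the placeholders from the settings-page patch):

```shell
# Assumption: the server in question is the one shipped with mlx-lm.
# MLX requires an Apple silicon Mac.
pip install mlx-lm

# Start the OpenAI-compatible HTTP server on the port used as the
# settings-page placeholder (http://localhost:8080).
python -m mlx_lm.server \
  --model mlx-community/Llama-3.2-3B-Instruct-4bit \
  --port 8080
```

With the server running, the "Endpoint" and "Model" fields in the patched settings page can be left at their placeholder values.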
@albidev Am I right in understanding that this is compatible with the OpenAI API interface? We've had another PR recently that adds a generic OpenAI option. If that's the case, I'll compare the two PRs and either change this to be generic or merge the other.
@Caleb-T-Owens Thanks for pointing this out! I think the MLX API is already compatible with the OpenAI APIs. That said, I can still adapt this PR to be more generic if needed, to better align with your expectations or other use cases. Can you link the PR you're referring to?
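To illustrate the compatibility point above: if the MLX server speaks the OpenAI chat-completions protocol, the same request shape works for either backend. A minimal sketch (the endpoint and model values are taken from the settings-page placeholders; the actual send is commented out because it needs a running server):

```python
import json
import urllib.request

# Placeholders from the settings-page patch; adjust to your local setup.
MLX_ENDPOINT = "http://localhost:8080"
MLX_MODEL = "mlx-community/Llama-3.2-3B-Instruct-4bit"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style /v1/chat/completions payload."""
    return {
        "model": MLX_MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

payload = build_chat_request("Write a commit message for this diff.")
print(json.dumps(payload))

# Sending it (requires a running MLX or OpenAI-compatible server):
# req = urllib.request.Request(
#     f"{MLX_ENDPOINT}/v1/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because only the endpoint and model name differ between backends, a generic OpenAI option could subsume this client with no change to the request shape.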
@albidev Here is the other PR: khanra17#1 |
☕️ Reasoning
Integrating MLX support into the gitbutler repository expands the range of supported AI engines. This addition lets users leverage MLX's features, improving the repository's versatility in AI-related tasks.
🧢 Changes
📌 Todos