
feat: Add support for MLX #5872

Open · albidev wants to merge 5 commits into master
Conversation

@albidev commented Dec 27, 2024

☕️ Reasoning

Integrating MLX support into the GitButler repository expands the range of supported AI engines. This addition lets users leverage MLX's local inference on Apple silicon, improving the app's versatility in AI-related tasks.

🧢 Changes

  • Added MLX integration to support additional AI engines.
  • Updated configuration files to include MLX settings.
  • Implemented necessary code changes to ensure compatibility with MLX.

📌 Todos

  • Update documentation to reflect the new MLX support

vercel bot commented Dec 27, 2024

The latest updates on your projects:
gitbutler-components — ✅ Ready; preview updated Jan 8, 2025, 7:52am (UTC)


vercel bot commented Dec 27, 2024

@albidev is attempting to deploy a commit to the GitButler Team on Vercel.

A member of the Team first needs to authorize it.

@Caleb-T-Owens (Contributor)

Hi @albidev, I'll take a read through when I'm back at work. Big thanks for the contribution!

@albidev (Author) commented Jan 7, 2025

Thanks, @Caleb-T-Owens! Let me know if you need anything when you're back.

@Caleb-T-Owens (Contributor)

@albidev could you let me know how I can run an MLX model?

I've made a small patch to the settings page:

diff --git a/apps/desktop/src/routes/settings/ai/+page.svelte b/apps/desktop/src/routes/settings/ai/+page.svelte
index 4a2bef730..16115bab3 100644
--- a/apps/desktop/src/routes/settings/ai/+page.svelte
+++ b/apps/desktop/src/routes/settings/ai/+page.svelte
@@ -16,6 +16,7 @@
        import SectionCard from '@gitbutler/ui/SectionCard.svelte';
        import Spacer from '@gitbutler/ui/Spacer.svelte';
        import Textbox from '@gitbutler/ui/Textbox.svelte';
+       import { platform } from '@tauri-apps/plugin-os';
        import { onMount, tick } from 'svelte';
        import { run } from 'svelte/legacy';

@@ -172,6 +173,8 @@
        run(() => {
                if (form) form.modelKind.value = modelKind;
        });
+
+       const mlxEnabled = platform() === 'macos';
 </script>

 <SettingsPage title="AI options">
@@ -321,7 +324,7 @@

                <SectionCard
                        roundedTop={false}
-                       roundedBottom={modelKind !== ModelKind.Ollama}
+                       roundedBottom={modelKind !== ModelKind.Ollama && !mlxEnabled}
                        orientation="row"
                        labelFor="ollama"
                        bottomBorder={modelKind !== ModelKind.Ollama}
@@ -334,7 +337,7 @@
                        {/snippet}
                </SectionCard>
                {#if modelKind === ModelKind.Ollama}
-                       <SectionCard roundedTop={false} orientation="row" topDivider>
+                       <SectionCard roundedTop={false} orientation="row" topDivider roundedBottom={!mlxEnabled}>
                                <div class="inputs-group">
                                        <Textbox
                                                label="Endpoint"
@@ -347,31 +350,38 @@
                        </SectionCard>
                {/if}

-               <SectionCard
-                       roundedBottom={modelKind !== ModelKind.MLX}
-                       orientation="row"
-                       labelFor="mlx"
-                       bottomBorder={modelKind !== ModelKind.MLX}
-               >
-                       {#snippet title()}
-                               MLX
-                       {/snippet}
-                       {#snippet actions()}
-                               <RadioButton name="modelKind" id="custom" value={ModelKind.MLX} />
-                       {/snippet}
-               </SectionCard>
-               {#if modelKind === ModelKind.MLX}
-                       <SectionCard roundedTop={false} orientation="row" topDivider>
-                               <div class="inputs-group">
-                                       <Textbox
-                                               label="Endpoint"
-                                               bind:value={mlxEndpoint}
-                                               placeholder="http://localhost:8080"
-                                       />
-
-                                       <Textbox label="Model" bind:value={mlxModel} placeholder="mlx-community/Llama-3.2-3B-Instruct-4bit" />
-                               </div>
+               {#if mlxEnabled}
+                       <SectionCard
+                               roundedTop={false}
+                               roundedBottom={modelKind !== ModelKind.MLX}
+                               orientation="row"
+                               labelFor="mlx"
+                               bottomBorder={modelKind !== ModelKind.MLX}
+                       >
+                               {#snippet title()}
+                                       MLX
+                               {/snippet}
+                               {#snippet actions()}
+                                       <RadioButton name="modelKind" id="mlx" value={ModelKind.MLX} />
+                               {/snippet}
                        </SectionCard>
+                       {#if modelKind === ModelKind.MLX}
+                               <SectionCard roundedTop={false} orientation="row" topDivider>
+                                       <div class="inputs-group">
+                                               <Textbox
+                                                       label="Endpoint"
+                                                       bind:value={mlxEndpoint}
+                                                       placeholder="http://localhost:8080"
+                                               />
+
+                                               <Textbox
+                                                       label="Model"
+                                                       bind:value={mlxModel}
+                                                       placeholder="mlx-community/Llama-3.2-3B-Instruct-4bit"
+                                               />
+                                       </div>
+                               </SectionCard>
+                       {/if}
                {/if}
        </form>

@albidev (Author) commented Jan 7, 2025

@Caleb-T-Owens, to run an MLX server you need to install the mlx-lm package first and then run:

mlx_lm.server --model mlx-community/Llama-3.2-3B-Instruct-4bit

Useful MLX server documentation: https://github.com/ml-explore/mlx-examples/blob/main/llms/mlx_lm/SERVER.md

Let me know if you need further clarification 😊
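To make the "how do I run this" answer above concrete: since the thread states that mlx_lm.server exposes an OpenAI-compatible HTTP API (see the linked SERVER.md), a request against a locally running server can be sketched with just the Python standard library. The endpoint path and payload shape below follow the OpenAI chat-completions convention; this is an untested sketch assuming the server was started with the command above:

```python
import json
import urllib.request

def build_chat_request(endpoint: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for a local MLX server."""
    url = endpoint.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 100,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Actually sending this requires mlx_lm.server running locally (Apple silicon):
# with urllib.request.urlopen(build_chat_request(
#         "http://localhost:8080",
#         "mlx-community/Llama-3.2-3B-Instruct-4bit",
#         "Write a commit message for this diff")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```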

@Caleb-T-Owens (Contributor)

@albidev Am I right in understanding that this is compatible with the OpenAI API interface? We've had another PR recently that adds a generic OpenAI option. If that's the case, I'll compare the two PRs and either change this one to be generic or merge the other.
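A hypothetical sketch of why a generic OpenAI option could subsume a dedicated MLX integration: if the MLX server is OpenAI-compatible, the only provider-specific pieces are the base URL and model name, so a single request builder covers both providers. The names and model strings here are illustrative, not taken from either PR:

```python
def chat_request(base_url: str, model: str, prompt: str) -> tuple[str, dict]:
    """One OpenAI-style request shape, parameterized only by provider config."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return url, payload

# The same code path serves both providers; only the configuration differs:
hosted = chat_request("https://api.openai.com", "gpt-4o-mini", "hi")
local = chat_request("http://localhost:8080",
                     "mlx-community/Llama-3.2-3B-Instruct-4bit", "hi")
```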

@albidev (Author) commented Jan 9, 2025

@Caleb-T-Owens Thanks for pointing this out! I think the MLX API is already compatible with the OpenAI API. That said, I can still adapt this PR to be more generic if needed, to better align with your expectations or other use cases.

Can you link the PR you're referring to?

@Caleb-T-Owens (Contributor)

@albidev Here is the other PR: khanra17#1
