
Support configuring the LLM/endpoint #2660

Closed
krlvi opened this issue Feb 14, 2024 · 14 comments · May be fixed by khanra17/gitbutler#1
Labels
enhancement (An improvement to an existing feature), planned (An enhancement that is on the roadmap)

Comments

@krlvi
Member

krlvi commented Feb 14, 2024

No description provided.

@mtsgrd added the enhancement label Feb 14, 2024
@technovangelist

This would be awesome. You will need 3 things:

  • url
  • api key
  • model name

So, add a text box for the base of the URL. For Ollama, I would put in http://localhost:11434/. Some apps want the v1 at the end; some just add it in the actual call. Then you would need the API key. If hosting OpenAI on your own Azure instance, you need this along with the URL. Then comes choosing the model. Ollama doesn't yet have the OpenAI-compatible model list, so having an open text box would be great. I think that is all that is needed.
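The "some apps want the v1 at the end, some just add it in the call" point can be handled with a tiny normalization step. A minimal sketch (the function name and behavior here are my own suggestion, not anything from the GitButler codebase):

```python
def normalize_base_url(base_url: str) -> str:
    """Append the OpenAI-compatible /v1 suffix if the user left it off,
    so both "http://localhost:11434/" and ".../v1" work as input."""
    url = base_url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url
```

With this, users can paste either form of the Ollama URL into the text box and the client still hits the right endpoint.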

I can be available to try it out, or it's easy enough to download. I live on an island near Seattle and we sometimes lose internet, or I lose it on the ferry. Having an offline solution that also happens to have a better security and privacy story would be great.

@Leopere

Leopere commented Feb 29, 2024

Before you read this comment, I'm very excited and grateful for the existence of this thing, but...

I will probably wait to consider this tool until they decide this is a feature. I don't think it's helpful until then for me at least. It does look hella promising, though. The benefit of Git and FOSS in the first place is decentralization. No sense in siloing your stuff for a couple of features.

@krlvi
Member Author

krlvi commented Feb 29, 2024

Hey, thanks for this feedback, @Leopere. This is definitely a feature we want in the tool; it's just a matter of a little time to get it implemented and integrated. I'll post updates on this ticket.

@Leopere

Leopere commented Feb 29, 2024

> This is definitely a feature we want in the tool, it's just a matter of a little bit of time to have this implemented and integrated.

I'm actually over the moon about this it looks like a super fun tool.

@krlvi
Member Author

krlvi commented Mar 18, 2024

Thanks to @Caleb-T-Owens this will be supported in the next release 🚀
#3084

When that's merged we can do some testing with Ollama as a custom endpoint :)

@Caleb-T-Owens
Contributor

Hi! So, you won't yet be able to provide a custom endpoint, only choose between Anthropic and OpenAI with some more configuration options.

Adding support for custom endpoints would be relatively easy, and it would probably make sense to align the data structures we expect from a custom API with Ollama's.
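For context on what "aligning with Ollama's data structures" would mean, here is a sketch of the request shape Ollama's chat endpoint expects. The helper function is my own illustration; the payload fields (`model`, `messages`, `stream`) are from Ollama's POST /api/chat API:

```python
import json

def ollama_chat_request(model: str, prompt: str) -> str:
    # Shape matches Ollama's POST /api/chat endpoint.
    # "stream": False asks for a single JSON response rather
    # than a stream of newline-delimited chunks.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return json.dumps(payload)
```

Since this is close to OpenAI's chat-completions shape, a custom-endpoint client that emits it would cover both Ollama and most OpenAI-compatible proxies.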

@Leopere

Leopere commented Mar 20, 2024

> Hi! So, you won't yet be able to provide a custom endpoint, only choose between Anthropic and OpenAI with some more configuration options.
>
> Adding support for custom endpoints would be relatively easy, and it would probably make sense to align the data structures we expect from a custom API with Ollama's.

It looks likely possible from what I glanced at in that PR.

@Caleb-T-Owens
Contributor

@Leopere Adding an AIClient to consume your own endpoint is quite simple, but there are some prerequisite changes that need to be made for the weaker open source models to be anywhere near as effective.

@bioshazard

bioshazard commented Mar 27, 2024

Just add support for overriding the OpenAI base URL and let the user deal with the quality of the result. I would also hope you expose a way for me to customize the prompt so I can tweak it to work well with my model of choice. I would point it at a wrapper I made around AWS Bedrock for enterprise use.
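The "override the base URL" approach boils down to building the usual chat-completions request against a user-supplied host. A minimal sketch that builds (but does not send) such a request; the function name and the commit-message system prompt are hypothetical examples, while the `/chat/completions` path and headers follow the OpenAI-compatible convention:

```python
import json
from urllib import request

def build_chat_completion(base_url: str, api_key: str,
                          model: str, diff: str) -> request.Request:
    """Build a request against any OpenAI-compatible server,
    e.g. Ollama at http://localhost:11434/v1 or a Bedrock proxy."""
    body = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Write a concise commit message for this diff."},
            {"role": "user", "content": diff},
        ],
    }
    return request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Swapping providers then reduces to changing `base_url`, `api_key`, and `model`, which is exactly the three fields listed earlier in this thread.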

@Leopere

Leopere commented Mar 27, 2024

Delete this

@Caleb-T-Owens
Contributor

Hi @bioshazard,

> Just add support for overriding the OpenAI base url and let the user deal with the quality of the result

There is an emphasis on delivering high-quality features that should be a joy to use rather than something that requires you to wrestle with it. When I was doing my initial testing with the 70b Llama models via Ollama, with our default prompts I was getting responses varying from tutorials on writing "guess the number" to randomly generated patches 😆. Certainly not plug and play, or a joy to use!

The ability to use local models is something that I personally, and the team at GitButler, are interested in having, so please have patience as we work to deliver it in the best way possible.

> I would hope as well that you would expose a way for me to customize the prompt so I can tweak it to work well with my model of choice.

100%, this is a planned feature and something we're quite excited to add. It's one of the features I've noted as a dependency of adding custom endpoints, because when I was doing my spike into this, I needed to make various model-specific tweaks to help tune the responses.

Here is the issue I made for tracking those prerequisites (#3255)

@bioshazard

bioshazard commented Mar 28, 2024

Thanks for the detailed response. Fair enough. I'll probably point it at a 7b OrcaMistral model.

@Byron added the planned label Apr 21, 2024
@krlvi
Member Author

krlvi commented Jun 1, 2024

Hello! I am happy to share that support for local endpoints is now part of v0.12.0.

Release notes: https://discord.com/channels/1060193121130000425/1183737922785116161/1246475064006807575

Huge thanks to @estib-vega for this <3

@krlvi krlvi closed this as completed Jun 1, 2024
@Leopere

Leopere commented Jun 2, 2024

Oh cool, I'll revisit this soon ❤️❤️
