Docs - Add example using LiteLLM Proxy to call Mistral AI Models #75
base: main
Conversation
Hi @sophiamyang, can you review this PR? Happy to make any changes necessary.
Review thread on third_party/LiteLLM/README.md (outdated)

[Use with Langchain, LlamaIndex, Instructor, etc.](https://docs.litellm.ai/docs/proxy/user_keys)

```python
import openai
```
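For context, the OpenAI-SDK style that the excerpt above leads into would typically call the proxy along the lines of the sketch below. This is a minimal sketch, assuming a LiteLLM Proxy running locally on port 4000; the base URL and model alias are illustrative, not taken from the PR diff.

```python
import openai

# Assumption: a LiteLLM Proxy is running at http://0.0.0.0:4000 and exposes
# a model alias "mistral-small"; neither detail comes from this PR.
client = openai.OpenAI(
    api_key="anything",              # the proxy may accept any key unless auth is configured
    base_url="http://0.0.0.0:4000",
)

response = client.chat.completions.create(
    model="mistral-small",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```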
Could you use the Mistral SDK instead of OpenAI, please?
Updated @sophiamyang - the examples now use the Mistral SDK.
Following up on this @sophiamyang - any other changes?
Hi ishaan, I don't think this will work. Our current Python SDK client does not have the same methods and behavior as OpenAI's SDK; it seems you are using chat.completions.create and other methods that do not exist in the current SDK. Possible to update it? And thank you for the notebook! 🙏
Hi @ishaan-jaff, I think your code with the Mistral client still has issues. Could you help update the code? You can see our docs here.
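For reference, the Mistral Python SDK of that period (mistralai 0.x) exposed client.chat(...) rather than OpenAI-style chat.completions.create(...). A minimal sketch of pointing that client at a LiteLLM Proxy might look like the following; the endpoint URL and model alias are assumptions, not taken from this thread.

```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# Assumption: a LiteLLM Proxy is reachable at http://0.0.0.0:4000 and exposes
# a "mistral-small" alias; adjust both to your setup.
client = MistralClient(
    api_key="anything",             # key handling depends on the proxy's auth config
    endpoint="http://0.0.0.0:4000",
)

response = client.chat(
    model="mistral-small",
    messages=[ChatMessage(role="user", content="Say hello in one sentence.")],
)
print(response.choices[0].message.content)
```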
Acknowledging this - will add it to my backlog @sophiamyang
Doc - Add example of using Mistral models with LiteLLM Proxy
Hi, I'm the maintainer of LiteLLM. This PR shows how to use LiteLLM Proxy to call Mistral AI models.
Why use LiteLLM Proxy?
Use LiteLLM Proxy for:
- Works for the Mistral AI API + Codestral API + Bedrock (see the sketch after this list)
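The point of routing through the proxy is that a single OpenAI-compatible endpoint can front all three backends, with only the model name changing per request. A minimal sketch under an assumed configuration; the base URL and the three model aliases are illustrative, not taken from this PR.

```python
import openai

# Assumed setup (shell), e.g. `litellm --config config.yaml` with the three
# backends registered under the hypothetical aliases used below.
client = openai.OpenAI(api_key="anything", base_url="http://0.0.0.0:4000")

for model in ["mistral-small", "codestral-latest", "bedrock-mistral-large"]:
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Reply with one word: ready?"}],
    )
    print(model, "->", resp.choices[0].message.content)
```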