This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


Support Codestral Mamba #465

Closed
robert-zaremba opened this issue Nov 5, 2024 · 1 comment
Labels
enhancement New feature or request

Comments


robert-zaremba commented Nov 5, 2024

Codestral, while producing great output quality, is really slow.

Codestral Mamba promises to improve speed without sacrificing result quality: https://mistral.ai/news/codestral-mamba/

Is there any way we can have support for Codestral Mamba? llama.cpp doesn't support it (and neither does ollama), so I'm checking here whether there is another viable solution that could be implemented to support Codestral Mamba.

@robert-zaremba robert-zaremba added the enhancement New feature or request label Nov 5, 2024
@karthink
Owner

karthink commented Nov 5, 2024

First you'd need to deploy the model somehow; neither Emacs nor gptel can help with that. Once it's running, gptel can support it, provided the deployment offers an HTTP API endpoint.
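As a minimal sketch of what that could look like: gptel provides `gptel-make-openai` for registering any OpenAI-compatible HTTP endpoint as a backend. The host, port, and model name below are illustrative assumptions about a hypothetical local deployment (e.g. a llama.cpp-style server), not a confirmed setup for Codestral Mamba.

```
;; Sketch: register a locally deployed OpenAI-compatible server with gptel.
;; "localhost:8080" and the model name "codestral-mamba" are assumptions
;; about a hypothetical deployment, not tested values.
(gptel-make-openai "local-codestral"   ; arbitrary backend name
  :host "localhost:8080"               ; wherever the server is listening
  :protocol "http"                     ; local deployments are usually plain HTTP
  :endpoint "/v1/chat/completions"     ; standard OpenAI-compatible route
  :stream t                            ; enable streaming responses
  :models '("codestral-mamba"))        ; model id the server exposes
```

After evaluating this, the backend should appear in gptel's model menu (`gptel-menu`) like any other provider.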

Repository owner locked and limited conversation to collaborators Jan 2, 2025
@karthink karthink converted this issue into discussion #538 Jan 2, 2025

