
Google manifold doesn't allow using LearnLM #367

Open
bogdanr opened this issue Dec 12, 2024 · 1 comment

Comments


bogdanr commented Dec 12, 2024

I have open-webui 0.4.8 with the pipelines that were available at the time of the 0.4.8 release, all installed with Docker.
Gemini 2.0 Flash, which was released just yesterday, works fine, but trying to use LearnLM produces this error:

Error: Invalid model name format: learnlm-1.5-pro-experimental

The Docker logs don't show anything I find relevant:

open-webui  | INFO:     188.25.145.71:0 - "POST /api/v1/memories/query HTTP/1.1" 200 OK
open-webui  | INFO  [open_webui.apps.openai.main] get_all_models()
pipelines   | INFO:     172.18.0.1:42978 - "GET /models HTTP/1.1" 200 OK
open-webui  | INFO  [open_webui.apps.ollama.main] get_all_models()
pipelines   | INFO:     172.18.0.1:42994 - "POST /google_genai.learnlm-1.5-pro-experimental/filter/inlet HTTP/1.1" 200 OK
pipelines   | google_genai.learnlm-1.5-pro-experimental
pipelines   | google_genai.learnlm-1.5-pro-experimental
pipelines   | INFO:     172.18.0.1:43008 - "POST /chat/completions HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/chat/completions HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640 HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
pipelines   | INFO:     172.18.0.1:43022 - "POST /google_genai.learnlm-1.5-pro-experimental/filter/outlet HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/chat/completed HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640 HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/task/title/completions HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640 HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640/tags HTTP/1.1" 200 OK
open-webui  | Count of chats for tag 'science': 6
open-webui  | Count of chats for tag 'technology': 47
open-webui  | Count of chats for tag 'general': 12
open-webui  | INFO:     188.25.145.71:0 - "DELETE /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640/tags/all HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/all/tags HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "POST /api/task/tags/completions HTTP/1.1" 200 OK
open-webui  | [] physics
open-webui  | INFO:     188.25.145.71:0 - "POST /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640/tags HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/d28a614a-1ea5-418f-9564-9fe88496e640 HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/all/tags HTTP/1.1" 200 OK
open-webui  | INFO:     188.25.145.71:0 - "GET /api/v1/chats/?page=1 HTTP/1.1" 200 OK
@tjbck tjbck transferred this issue from open-webui/open-webui Dec 12, 2024
Contributor

rotemdan commented Dec 14, 2024

I noticed that as well.

It's caused by the model ID check at lines 101-102 of google_manifold_pipeline.py:

            if not model_id.startswith("gemini-"):
                return f"Error: Invalid model name format: {model_id}"

If the check is commented out, the model works:

(screenshot: LearnLM responds successfully with the check disabled)

I can make a pull request to disable the check, I guess. It's also possible to allow learnlm- as an additional model ID prefix, but that approach would require manually adding every future Google model prefix as well.
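For illustration, the prefix-allowlist approach could look something like the sketch below. This is not the pipeline's actual code, and the list of prefixes is an assumption; it only shows how the existing `startswith` check could be generalized so that adding a new family of models is a one-line change:

```python
# Hypothetical sketch of a prefix allowlist, generalizing the hard-coded
# "gemini-" check. The prefix list itself is an assumption, not the
# pipeline's real configuration.
KNOWN_MODEL_PREFIXES = ("gemini-", "learnlm-")

def is_valid_model_id(model_id: str) -> bool:
    """Return True if the model ID starts with any known Google model prefix."""
    # str.startswith accepts a tuple, so this checks all prefixes at once.
    return model_id.startswith(KNOWN_MODEL_PREFIXES)
```

With this shape, `is_valid_model_id("learnlm-1.5-pro-experimental")` passes while clearly invalid IDs are still rejected; the trade-off rotemdan notes remains, since each future prefix family still has to be appended to the tuple.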
