Description
Hi there,
I am running OpenUI via Pinokio (https://pinokio.computer/item?uri=https://github.com/pinokiofactory/openui).
As the LLM backend, I'm using Ollama in its current version, 0.4.6.
When I try your tool, I get the error message "Error! 404 Error code: 404 - {'error': {'message': 'model "undefined" not found, try pulling it first', 'type': 'api_error', 'param': None, 'code': None}}" after sending a prompt.
When I tried to select a different model, I noticed that the select box in the settings window does not show any model names, only empty entries:
No matter which of those entries I choose, the error persists.
If I quit Ollama and then try to resolve the installed models, the selection is empty:
So the model resolution from Ollama seems to work at least partially (6 entries appear, matching the 6 currently installed models).
My guess is that OpenUI is not correctly parsing the response of the Ollama model list request, which in turn leads to the error above.
Do you have any idea how to solve this problem?
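To illustrate what I mean: Ollama lists installed models at GET /api/tags, and each entry carries its name under the "name" key (per the Ollama API docs). The sketch below uses a made-up sample payload in that shape; if a client reads a wrong or missing key, every option comes back empty, which would match the blank select box and the model "undefined" error I'm seeing:

```python
import json

# Sample payload shaped like Ollama's GET /api/tags response
# (field names per the Ollama API docs; the values are made up).
sample = json.loads("""
{
  "models": [
    {"name": "llama3.2:latest", "size": 2019393189},
    {"name": "mistral:latest", "size": 4113301824}
  ]
}
""")

# Reading the documented "name" key yields the model names.
names = [m.get("name") for m in sample["models"]]
print(names)  # ['llama3.2:latest', 'mistral:latest']

# Reading a wrong key (hypothetical "model_name") yields one None
# per installed model -- i.e. the right *number* of entries, all blank.
wrong = [m.get("model_name") for m in sample["models"]]
print(wrong)  # [None, None]
```

That would explain why the count of entries is correct while the labels are empty.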
Thx :)