🐛 bug: does not work with LocalAI backend #143
Comments
@johnsmzr Could you post the section of the Mattermost config.json related to the AI plugin? Search for mattermost-ai; it should have a "config" section directly underneath it (not the one that just says enabled).
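For anyone hunting for that section, here is a minimal sketch of where the plugin block typically sits in a Mattermost config.json; the actual keys inside "config" depend on the plugin version and are elided here:

  "PluginSettings": {
    "Plugins": {
      "mattermost-ai": {
        "config": {
          ...
        }
      }
    }
  }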
@crspeller Thank you for the reply!
I see nothing in the Mattermost AI chat. Yes, no response at all.
update: I think this is the config you want:
@TheMasterFX The LocalAI backend loaded the model and generated the complete answer. Calling the LocalAI API:
Whole response:
I’m seeing the exact same issue with LocalAI in my setup. Same versions as mentioned above.
@johnsmzr Not seeing the same behavior when I try it. I don't see anything wrong with your configuration.
@crspeller
I built LocalAI as a binary and run it locally on a MacBook Pro M3.
I updated the mattermost-ai plugin to 0.6.2, which contains major fixes, and tested again. Now there are two types of errors. Part of the LocalAI debug info:
Error from the Mattermost server log:
I hope this information is helpful.
I encountered a similar issue, but I'm using a third-party relay API.
{
  "caller": "app/plugin_api.go:1000",
  "level": "info",
  "msg": "LLM Call",
  "plugin_id": "mattermost-ai",
  "prompt": "\n--- Conversation ---\n--- User ---\nWrite a short title for the following request. Include only the title and nothing else, no quotations. Request:\nBrainstorm ideas about \n--- Tools ---\n\n--- Context ---\nTime: \nServerName: \nCompanyName: \nPromptParameters:\n",
  "timestamp": "2024-03-29 10:16:37.594 +08:00"
}
Same issue with Ollama as a backend:
2024-09-14 01:57:07 {"timestamp":"2024-09-13 23:57:07.812 Z","level":"info","msg":"LLM Call","caller":"app/plugin_api.go:973","plugin_id":"mattermost-ai","prompt":"\n--- Conversation ---\n--- System ---\nYou are a helpful assistant called \"Copilot\" that responds on a Mattermost chat server called Mattermost owned by .\n\nCurrent time and date in the user's location is Sat, 14 Sep 2024 01:57:07 CEST\n\nThe following is the personal information of the user. This information is given with every request to you. You can use this information to taylor the request to the specific user however most of the time it will not be relavent. Only acknowledge the information when the request is directly related to the information provided. Never repeat it as written.\nThe user making the request username is 'admin'.\n--- User ---\nhallo\n--- Tools ---\n\n--- Context ---\nTime: Sat, 14 Sep 2024 01:57:07 CEST\nServerName: Mattermost\nCompanyName: \nRequestingUser: admin\nChannel: 47b77hm4i7yijcs6asfk4watge__r4msihq8ypf3jj34nnoffx9jxh\nPost: hr9jjay8ppncmgh1c89uh6rmkw\nPromptParameters:\n"}
2024-09-14 01:57:07 {"timestamp":"2024-09-13 23:57:07.815 Z","level":"error","msg":"Streaming result to post failed partway","caller":"app/plugin_api.go:976","plugin_id":"mattermost-ai","error":"error, status code: 404, message: json: cannot unmarshal number into Go value of type openai.ErrorResponse"}
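The "json: cannot unmarshal number into Go value of type openai.ErrorResponse" part of that error is telling: on a non-200 status, the plugin's OpenAI client tries to decode the response body into an OpenAI-style error object, but here the body's top-level value is a JSON number, so decoding fails. A hedged illustration of the mismatch (both bodies below are made up for illustration):

What the client expects on an error status:
  {
    "error": {
      "message": "model 'foo' not found",
      "type": "invalid_request_error",
      "code": "model_not_found"
    }
  }

What a non-OpenAI-compatible endpoint might return instead, e.g. from a 404 on a wrong URL path:
  404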
I have the same issue here, even with the 1.0.0 version of the mattermost-ai plugin. I was able to solve the…
Hi @novo-github, thanks for sharing your error. It looks like the request is reaching your model in Ollama but the model does not support tool usage or function calling. As a result, it's getting confused by the prompt, which uses some functions to fetch data about the user when the query is made. At the bottom of the model's configuration panel, try setting Disable Tools to true and see if that allows the model to respond. Alternatively, you may want to use a local model that is tuned to support function calling.
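For context, when tools are enabled the plugin's request to the OpenAI-compatible endpoint carries a "tools" array roughly like the sketch below (the model name, function name, and fields are hypothetical placeholders, not the plugin's actual definitions); a model without function-calling support can get confused by it:

  {
    "model": "llama3",
    "messages": [{"role": "user", "content": "hallo"}],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "lookup_user_profile",
          "description": "Fetch profile data about the requesting user",
          "parameters": {"type": "object", "properties": {}}
        }
      }
    ]
  }

With Disable Tools set to true, the "tools" array is omitted from the request and a plain chat model can respond normally.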
@azigler "Disable tools to true" did the trick! Works without a hitch with Ollama deployed locally.. Thank you!
I see in the debug logs that when disable tools is set of False, the system prompt for the query is a little flawed. I don't have the logs with me now, but when I get it, I can update them here. |
That's right -- thanks @novo-github! I opened a PR to make sure these instructions end up in the docs: mattermost/docs#7413 |
Description
The AI plugin currently does not work with the LocalAI backend. It cannot correctly read the response from the LocalAI API.
LocalAI version: 2.9.0 (latest)
AI Plugin version: 0.6.0 (latest)
Steps to reproduce
Call the LocalAI API:
Response:
As you can see, the response ("Hello ...") is generated in the "content" field, but the plugin does not read it.
There is no response in Mattermost and also no error in the Mattermost server logs (System Console).
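For reference, an OpenAI-compatible /v1/chat/completions response puts the generated text at choices[0].message.content, roughly like this sketch (values are illustrative):

  {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "model": "gpt-3.5-turbo",
    "choices": [
      {
        "index": 0,
        "message": {"role": "assistant", "content": "Hello ..."},
        "finish_reason": "stop"
      }
    ]
  }

If the backend wraps or streams the content in a different shape, the plugin may fail to extract it even though the model generated a full answer.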