Ollama: Unexpected error during intent recognition #211
Same issue here.
Can you check the Home Assistant log for errors related to conflicting dependencies? My best guess is that another integration you are using requires the most recent version of
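One way to follow up on this suggestion is to print the dependency versions that are actually importable at runtime and compare them against what the integration pins. This is only a sketch: the package names below are examples chosen from this thread, and the exact distributions worth checking depend on your install.

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(pkg: str) -> str:
    """Return the installed version of a distribution, or a marker string."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

# Compare these against the versions the integration expects; a mismatch
# suggests another integration upgraded a shared dependency under it.
for pkg in ("webcolors", "homeassistant"):
    print(pkg, installed_version(pkg))
```

Running this inside the same Python environment Home Assistant uses (not a separate venv) is what makes the comparison meaningful.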
Same issue here with LLM model 'fixt/home-3b-v3:q4_k_m' (remote). Temporarily fixed by removing the additional attribute "rgb_color" (which exists by default).
Same issue, here is the traceback:
This has been fixed in the develop branch as shown here, but I don't know how to use that in HACS (I couldn't figure it out), so I've taken my fork and made a tag and release.
Having this issue as well. Any chance we can get an updated release pushed out to fix this? @simcop2387 I appreciate the fork. |
Home-Assistant: 2024.8.1
llama-conversation: 0.3.6
webcolors: 1.13
The exact same issue was marked as fixed in 0.3.3 but appears not to be: #165
What is odd is that I looked at the webcolors package bundled inside Home Assistant, and the constant definitely exists and is exported.
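That observation is consistent with a version conflict: webcolors 1.13 (pinned above) still exports its module-level name-to-hex mappings, while newer releases removed them, so the AttributeError appears only when another integration pulls in a newer webcolors. A defensive pattern for code caught between the two APIs is to feature-detect the attribute instead of assuming it exists. The sketch below uses hypothetical stand-in namespaces rather than webcolors itself, so it illustrates the pattern independent of which version is installed:

```python
from types import SimpleNamespace

# Stand-ins for two hypothetical releases of a colour library: one that
# still exports a module-level name->hex mapping, and one that removed it
# in favour of a function-based API.
old_release = SimpleNamespace(CSS3_NAMES_TO_HEX={"red": "#ff0000"})
new_release = SimpleNamespace(name_to_hex=lambda name: {"red": "#ff0000"}[name])

def safe_name_to_hex(module, name):
    """Look up a colour name without assuming which API the module exports."""
    mapping = getattr(module, "CSS3_NAMES_TO_HEX", None)
    if mapping is not None:           # legacy constant still present
        return mapping.get(name)
    return module.name_to_hex(name)   # fall back to the function-based API

print(safe_name_to_hex(old_release, "red"))  # -> #ff0000
print(safe_name_to_hex(new_release, "red"))  # -> #ff0000
```

This is essentially the shape of fix the develop branch applies: stop importing the constant directly and tolerate both APIs.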
Describe the bug
When performing a chat completion via the Assist Pipeline, the integration raises an AttributeError.
Expected behavior
The Assist Pipeline should be able to determine the intent.
Logs
If applicable, please upload any error or debug logs output by Home Assistant.