Help with local ollama setup? #79
jasdjensen
started this conversation in
General
Please see these comments for now:
I've tried to set up a connection to my local ollama. I can select a model using http://192.168.1.106:11434/v1, but can't get any further. I've tried putting in a dummy API key, but that doesn't help.
Connecting from the Docker host works (which confirms connectivity and that my ollama instance is running).
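The host-side test above doesn't prove the app container can reach Ollama, since the container has its own network namespace. A minimal sketch of a reachability probe, assuming only the endpoint URL from this thread, that can be run both on the host and inside the container (e.g. via `docker exec`):

```python
import urllib.error
import urllib.request


def probe(base_url: str, timeout: float = 3.0) -> str:
    """Return a one-line reachability report for an OpenAI-compatible base URL."""
    # /models is a harmless read-only endpoint under Ollama's /v1 API.
    url = base_url.rstrip("/") + "/models"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"reachable (HTTP {resp.status})"
    except (urllib.error.URLError, OSError) as exc:
        return f"unreachable: {exc}"


print(probe("http://192.168.1.106:11434/v1"))
```

If this prints "reachable" on the host but "unreachable" inside the container, the problem is Docker networking (bridge isolation, firewall rules on the host), not the app's settings.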
My log looks good:
Session validated for user: admin
Authentication successful, proceeding with request
Saving userData (0 TV recommendations, 0 movie recommendations)
Writing 1808 bytes to /app/server/data/user_data.json
Deleted existing /app/server/data/user_data.json
Successfully wrote new /app/server/data/user_data.json
✓ /app/server/data/user_data.json exists, size=1808 bytes, modified=Thu Mar 13 2025 04:11:33 GMT+0000 (Coordinated Universal Time)
✓ Read back file, content length: 1808 bytes
✓ Verified file contains properly encrypted data
✓ Decrypted data has 0 TV recommendations
✓ Decrypted data has 0 movie recommendations
I'm not saying there is a bug, but I'm having trouble: the web page keeps returning to the Settings / AI Service tab. I can select different models and save, but I can't get off that screen to run recommendations.
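For reference, this is roughly the request an OpenAI-compatible client sends to the endpoint above. A minimal sketch, assuming the base URL from this thread; the helper and the model name "llama3" are hypothetical, and Ollama's OpenAI-compatible API ignores the key value, so any placeholder works:

```python
import json
import urllib.request

BASE_URL = "http://192.168.1.106:11434/v1"  # endpoint from this discussion


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    # Ollama's OpenAI-compatible chat endpoint is /v1/chat/completions.
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        BASE_URL + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the key, but many clients insist on sending
            # one, which is why a dummy value is normally sufficient.
            "Authorization": "Bearer dummy",
        },
    )


req = build_chat_request("llama3", "Recommend a TV show.")
print(req.full_url)  # http://192.168.1.106:11434/v1/chat/completions
```

Since the dummy key should be accepted, a request that still fails usually points at the URL path or container networking rather than authentication.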