
Ollama connection with Basic authorization #2213

Open
xieu90 opened this issue Sep 1, 2024 · 6 comments
Labels: enhancement (New feature or request), feature request, Integration Request (Request for support of a new LLM, Embedder, or Vector database)

Comments


xieu90 commented Sep 1, 2024

Hi, usually Ollama runs on the local machine together with AnythingLLM, so the URL can be http://localhost:11434.
I have hosted Ollama on a server and secured it.
Now I can access it at https://ollama.someserver.de. Using Firefox, I can access it via https://username:[email protected].
Doing the same with AnythingLLM produces this error:
Could not respond to message.
Request cannot be constructed from a URL that includes credentials: https://username:[email protected]/api/chat

Is there any way to get this working with AnythingLLM?

@timothycarambat timothycarambat changed the title Feedback for “Ollama Embedder” Ollama connection with Basic authorization Sep 3, 2024
@timothycarambat timothycarambat transferred this issue from Mintplex-Labs/anythingllm-docs Sep 3, 2024
timothycarambat (Member) commented

We do not have the Ollama connector automatically set the header for Basic Authorization requests, since Ollama itself does not support authentication, so building that would be a use-case-specific integration.

I can understand the need to secure an Ollama instance running elsewhere, but all the LLM integrations we support that do have authorization use a Bearer token, not Basic.

@timothycarambat timothycarambat added enhancement New feature or request feature request Integration Request Request for support of a new LLM, Embedder, or Vector database labels Sep 3, 2024
xieu90 (Author) commented Sep 4, 2024

If I can inject the Bearer token then it might work. My other question: assume I manage to set up a Bearer token on the server so it can authenticate requests; how can I put the Bearer token into AnythingLLM so it gets sent to the server?
[screenshot of the Ollama connection settings]
In Postman or Bruno I sometimes see a header or body text area to put the Bearer token in, but in AnythingLLM, going by the screenshot, I have no idea how to add it (yet).
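For reference, if such a field existed, attaching a Bearer token is just one extra request header. A hypothetical sketch (the token value, server URL, and model name are placeholders from this thread, not real settings):

```javascript
// Hypothetical Bearer-authenticated request to a proxied Ollama chat endpoint.
// "specialtoken", the URL, and the model name are placeholders.
const headers = {
  "Content-Type": "application/json",
  Authorization: "Bearer specialtoken",
};
const body = JSON.stringify({
  model: "llama3",
  messages: [{ role: "user", content: "hi" }],
});
// fetch("https://ollama.someserver.de/api/chat", { method: "POST", headers, body });
console.log(headers.Authorization); // Bearer specialtoken
```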

@timothycarambat timothycarambat self-assigned this Sep 4, 2024
flefevre commented Sep 6, 2024

I have the same difficulty with the Ollama and LiteLLM connectors. I have deployed both, and I was trying to set up the AnythingLLM desktop version to link to those servers with
https://ollama-server.mylaborary.fr
and https://litellm-server.mylaboratory.fr

but it seems that AnythingLLM desktop does not allow using HTTPS servers?

timothycarambat (Member) commented

We do not limit HTTP/HTTPS; that is not the issue. However, if you are using LiteLLM to relay the connection, you can use the LiteLLM connector, which will use their input/output formatting. Regardless, your likely issue is port configuration: HTTPS uses 443, while Ollama runs on :11434. Unless you mapped 443 to forward to 11434, that is why the connection fails. This is a configuration problem unrelated to this issue.
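A minimal reverse-proxy sketch of that 443-to-11434 mapping, assuming nginx terminates TLS in front of Ollama (server name and certificate paths are placeholders, not from this thread):

```nginx
# Placeholder sketch: forward HTTPS traffic on 443 to Ollama on 11434.
server {
    listen 443 ssl;
    server_name ollama.someserver.de;

    ssl_certificate     /etc/ssl/certs/ollama.crt;   # placeholder path
    ssl_certificate_key /etc/ssl/private/ollama.key; # placeholder path

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
    }
}
```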

xieu90 (Author) commented Sep 11, 2024

Just for your info:

I tried to put the token/password as a URL parameter:
https://ollama.someserver.de/?token=specialtoken

If the token matches the one I configured on the server, it lets me through and I see the "Ollama is running" message in Firefox. If the token is missing or wrong, I get a 403 Forbidden error in Firefox.

So I went and put https://ollama.someserver.de/?token=specialtoken into the Ollama Base URL as in the previous screenshot. Later, in the chat, I said hi and got this:

Could not respond to message.
Ollama call failed with status code 403: <title>403 Forbidden</title> 403 Forbidden (nginx)
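A likely reason the query-string trick fails: when a client resolves the endpoint path /api/chat against a base URL, standard URL resolution drops the base URL's query string. Whether AnythingLLM builds its URLs exactly this way is an assumption; the resolution behavior itself is per the WHATWG URL standard:

```javascript
// Sketch: resolving an absolute path against a base URL discards the base's
// query string, so a ?token=... appended to the Base URL never reaches Ollama.
const base = "https://ollama.someserver.de/?token=specialtoken";
const endpoint = new URL("/api/chat", base).toString();
console.log(endpoint); // https://ollama.someserver.de/api/chat  (token is gone)
```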

flefevre commented Sep 11, 2024 via email
