
OLLAMA_API_BASE not taking effect when running `serve` without Docker, but inside a container #1690

Open
jonathanortega2023 opened this issue Dec 12, 2024 · 0 comments


Might be related to #987. I think this issue still persists for users who serve without Docker but run inside a container. I have R2R set up in a python3.12-slim image with OLLAMA_API_BASE set in the environment, yet it still could not reach the external Ollama instance.

```text
(r2r) root@569cd9229721:~# r2r serve --config-path=my_r2r.toml
Spinning up an R2R deployment...
Running on 0.0.0.0:7272, with docker=False
2024-12-12 04:28:38,303 - INFO - root - Initializing EmbeddingProvider with config app=AppConfig(project_name=None) extra_fields={} provider='ollama' base_model='ollama/mxbai-embed-large' base_dimension=1024 rerank_model=None rerank_url=None batch_size=32 prefixes=None add_title_as_prefix=True concurrent_request_limit=256 max_retries=8 initial_backoff=1 max_backoff=64.0 quantization_settings=VectorQuantizationSettings(quantization_type=<VectorQuantizationType.FP32: 'FP32'>) rerank_dimension=None rerank_transformer_type=None.
2024-12-12 04:28:38,304 - INFO - root - Using Ollama API base URL: http://ollama-api:11434
.........
Warning: Unable to connect to external Ollama instance. Error: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/version (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x76cffd170da0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Please ensure Ollama is running externally if you've excluded it from Docker and plan on running Local LLMs.
Do you want to continue without confirming an `Ollama` connection? [y/N]:
```
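
Note the mismatch: the embedding provider log reports the configured base URL (http://ollama-api:11434), while the connectivity check still probes localhost:11434. A quick probe from inside the container (an illustrative sketch, not something from R2R; the /api/version endpoint is the one shown in the warning above) makes the mismatch visible:

```python
import os
import requests

# The variable is visible inside the container:
print(os.environ.get("OLLAMA_API_BASE"))  # http://ollama-api:11434

# ...yet the startup check targets localhost, which refuses the connection:
requests.get("http://localhost:11434/api/version", timeout=5)  # raises ConnectionError
```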

I was able to serve successfully by altering lines 215-219 of
/r2r/lib/python3.12/site-packages/cli/utils/docker_utils.py.

The original signature hardcodes the check URL:

```python
def check_external_ollama(ollama_url="http://localhost:11434/api/version"):
```

After my change, the call site passes the environment variable through:

```python
    if model_provider == "ollama":
        check_external_ollama(os.environ.get("OLLAMA_API_BASE"))
```

and the signature falls back to localhost only when the variable is unset:

```python
def check_external_ollama(ollama_url=None):
    if ollama_url is None:
        ollama_url = "http://localhost:11434/api/version"
```
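
One caveat with the change as written: per the log above, OLLAMA_API_BASE holds a bare base URL (http://ollama-api:11434), while the localhost fallback includes the /api/version path, so the two code paths probe different endpoints. A self-contained sketch that normalizes this (assuming `requests`, which matches the urllib3 traceback above; the fallback chain here is my suggestion, not R2R's actual code) might look like:

```python
import os
import requests


def check_external_ollama(ollama_url=None):
    """Probe an external Ollama instance, honoring OLLAMA_API_BASE when set."""
    if ollama_url is None:
        # Fallback chain: explicit argument, then the environment
        # variable, then the localhost default.
        ollama_url = os.environ.get("OLLAMA_API_BASE", "http://localhost:11434")
    # OLLAMA_API_BASE holds a bare base URL (e.g. http://ollama-api:11434),
    # so append the version endpoint before probing, if it is not already there.
    if not ollama_url.rstrip("/").endswith("/api/version"):
        ollama_url = ollama_url.rstrip("/") + "/api/version"
    try:
        response = requests.get(ollama_url, timeout=5)
        response.raise_for_status()
        print(f"Connected to Ollama at {ollama_url}: {response.json()}")
    except requests.RequestException as exc:
        print(f"Warning: Unable to connect to external Ollama instance. Error: {exc}")


if __name__ == "__main__":
    check_external_ollama()
```

With OLLAMA_API_BASE=http://ollama-api:11434 exported in the container, this probes http://ollama-api:11434/api/version instead of localhost, which matches the behavior I expected from the env var in the first place.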