Description
In the output of docker logs librechat I get these errors:
2025-09-24 19:38:49 error: Invalid custom config file at /app/librechat.yaml:
{
"issues": [
{
"code": "invalid_type",
"expected": "array",
"received": "string",
"path": [
"endpoints",
"custom",
0,
"models",
"default"
],
"message": "Expected array, received string"
}
],
"name": "ZodError"
}
2025-09-24 19:38:49 warn: RAG API is either not running or not reachable at undefined, you may experience errors with file uploads.
My librechat.yaml is defined as:
version: "1.2.9"
endpoints:
  custom:
    - baseURL: "http://ollama:11434/v1"
      apiKey: "" # Optional, leave empty
      models:
        default: "deepseek-r1:8b-llama-distill-q4_K_M" # Default model
        options: # Optional: list of available models
          - "deepseek-r1:8b-llama-distill-q4_K_M"
          - "llama3.1:8b"
This is because I want to use local LLM models, but I can't find a way to get this working.
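If I read the ZodError correctly, the schema expects models.default to be an array of model names rather than a single string. A minimal sketch of what that shape might look like, reusing the model names from my config (the name field is my guess from LibreChat's custom-endpoint examples, and I have not verified that this resolves everything):

version: "1.2.9"
endpoints:
  custom:
    - name: "Ollama" # hypothetical label, not in my original config
      baseURL: "http://ollama:11434/v1"
      apiKey: ""
      models:
        default:
          - "deepseek-r1:8b-llama-distill-q4_K_M"
          - "llama3.1:8b"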
My .env is:
# ---------------------------
# Ports (redundant, but kept for consistency)
# ---------------------------
WEBUI_PORT=3000
OLLAMA_PORT=11434
LANGCHAIN_PORT=5005
# ---------------------------
# Persistent data
# ---------------------------
OLLAMA_DATA=/home/milco/KI-Analysen/legal-analysis/docker/ollama-data
LIBRECHAT_DATA=/home/milco/KI-Analysen/legal-analysis/docker/librechat-data
MONGODB_DATA=/home/milco/KI-Analysen/legal-analysis/docker/mongodb-data
# ---------------------------
# LibreChat config
# ---------------------------
WEBUI_SECRET_KEY=----my Key----
JWT_SECRET=----my Key----
JWT_REFRESH_SECRET=----my Key----
OLLAMA_HOST=http://ollama:11434
OPENAI_API_BASE_URL=http://ollama:11434/v1
SEARXNG_HOST=http://searxng:8080
MONGO_URI=mongodb://librechat_user:---my User---:27017/librechat?authSource=admin
SKIP_EMAIL_VERIFICATION=true
ALLOW_REGISTRATION=true
RAG_DEBUG=true
# ---------------------------
# RAG (optional, for later UI configuration)
# ---------------------------
RAG_ENABLED=true
RAG_API_HOST=http://ollama:11434/v1
RAG_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2
RAG_AUTO_SUMMARIZE=false
RAG_ENDPOINT=http://ollama:11434/v1
EMBEDDINGS_PROVIDER=ollama
OLLAMA_BASE_URL=http://host.docker.internal:11434
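Regarding the second log line: the warning says the RAG API is "not reachable at undefined". My .env sets RAG_API_HOST and RAG_ENDPOINT, but as far as I can tell LibreChat reads the RAG service address from RAG_API_URL, which I have not set. Something like the line below might be what it expects (the service name and port are only a guess from typical docker-compose setups; the RAG API is a separate service, not Ollama itself):

RAG_API_URL=http://rag_api:8000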
What am I doing wrong?