Project is broken on Linux #281
Comments
Why are you using host network? Follow the installation steps exactly, and then follow this: https://github.com/ItzCrazyKns/Perplexica?tab=readme-ov-file#ollama-connection-errors
Well, I tried following the standard installation process and then the connection-error guide for Linux. As I mentioned before, it failed and I could not get a connection to Ollama from Perplexica: no models were listed in the settings.

I use host network because this solution works. It is a known issue that connecting from Docker containers to Ollama on Linux fails, and it was solved before with host network, as described in Open WebUI: https://github.com/open-webui/open-webui?tab=readme-ov-file#open-webui-server-connection-error

Could you please give some hint regarding my backend log file? I think troubleshooting it will let me solve the problem. I suspect there is an issue with communication to SearXNG: https://github.com/user-attachments/files/16354344/backend_log.txt

Thanks
If you use host network, then inter-container communication (over the Docker network) wouldn't take place. So the address
If a container runs in host networking mode, it behaves like a service running on localhost: it has no access to Docker container networks and only sees what is running on localhost. So to connect to SearXNG, it has to use whatever port SearXNG is using on localhost; and if that port isn't exposed, it needs to be exposed.
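The setup described above can be sketched in compose form. A minimal, hypothetical excerpt, assuming SearXNG listens on port 8080 inside its container (the service and image names here are illustrative, not taken from the project's actual docker-compose.yaml):

```yaml
# Hypothetical excerpt: publish SearXNG's port on the host so that a
# backend running with --network=host can reach it via 127.0.0.1.
services:
  searxng:
    image: searxng/searxng
    ports:
      - "4000:8080"   # host port 4000 -> container port 8080
```

With a mapping like this, a host-network container would connect via http://127.0.0.1:4000 rather than via the compose service name `searxng`.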
The docker-compose.yaml also does not work OOTB. I have nothing running on port 4000 apart from what `docker compose up -d` starts.
Try sending a curl request to localhost:4000.
Sorry, my bad. It appeared some other service was on that port; after moving the port it now works OOTB.
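For anyone hitting the same symptom, here is a quick way to check whether a local port is already taken before pointing SearXNG at it (a bash sketch using the `/dev/tcp` feature; the port number is just an example):

```shell
#!/usr/bin/env bash
# Probe a local TCP port and report whether something is listening there.
# Uses bash's /dev/tcp feature, so run it with bash, not plain sh.
port=4000
if (exec 3<>"/dev/tcp/127.0.0.1/${port}") 2>/dev/null; then
  echo "port ${port} is already in use"
else
  echo "port ${port} is free"
fi
```

If the port is in use, either stop the occupying service or move SearXNG to a free port in its run command.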
Corrected this to 127.0.0.1:8080 and it worked, thanks!
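For reference, this fix lands in Perplexica's config.toml. A hedged sketch of the relevant section, assuming the `[API_ENDPOINTS]` layout the project's sample config used around this time (key names may differ in your version):

```toml
[API_ENDPOINTS]
# SearXNG as exposed on the host (container port 8080 in this setup)
SEARXNG = "http://127.0.0.1:8080"
# Ollama running as a systemd service on its default port
OLLAMA = "http://127.0.0.1:11434"
```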
Describe the bug
The last several updates made Perplexica unusable on Linux. I only get "Failed to connect to server. Please try again later" after running a search.
It was running with an earlier version, though. The backend log is attached:
backend_log.txt
To Reproduce
I run Perplexica as a Docker container on a local vanilla Fedora Linux machine. Ollama runs as a systemd service. SearXNG runs from a separate Docker container on port 4000; its config includes the json option.
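The json option mentioned above refers to SearXNG's output formats: Perplexica queries SearXNG with `format=json`, which SearXNG rejects unless JSON is enabled. In SearXNG's settings.yml the relevant fragment looks like this (a sketch of the standard option, not copied from the attached config):

```yaml
# settings.yml: allow JSON responses in addition to the default HTML
search:
  formats:
    - html
    - json
```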
A month ago I had to make several modifications to get Perplexica working even with older versions, due to a communication error from Docker to Ollama. Here are the configs:
app.dockerfile.txt
config.toml.txt
docker-compose.yaml.txt
The most important change was using host network.
Here are the docker run commands I use:
docker run --restart always --name perplexica-backend -e SEARXNG_API_URL=http://127.0.0.1:4000 --network=host -p 3001:3001 perplexica-perplexica-backend
docker run -d --restart always --name perplexica-frontend --network=host -p 3000:3000 perplexica-perplexica-frontend
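One caveat with the commands above: with `--network=host`, the `-p` flags are ignored, since the containers bind directly to host ports. A quick sanity check from the host (a sketch; adjust the ports to your setup):

```shell
#!/usr/bin/env bash
# Verify each service answers on its expected host port.
# Note: -f treats HTTP error statuses as failure; drop it if a service
# legitimately returns 404 on its root path.
for svc in "searxng 4000" "backend 3001" "frontend 3000"; do
  set -- $svc
  if curl -sf -o /dev/null "http://127.0.0.1:$2/"; then
    echo "$1 reachable on port $2"
  else
    echo "$1 NOT reachable on port $2"
  fi
done
```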
To make it clear, I again retried setting up Perplexica using the standard guidelines on GitHub, and I do not even get communication to Ollama. With my setup I can see Ollama models in the Perplexica settings, but the error "Failed to connect to server. Please try again later" persists.