OllamaDesk is a graphical interface for Ollama that lets you chat with freely available local LLMs. The backend server handles model management and request processing.
- Interactive chat interface
- Model selection
- Streamed responses (see the sketch after this list)
- Retry and stop functionality
- Settings for customization
- Copy and paste functionality
- Save answers to a text/markdown file
- Save chat history
- Dark/light mode
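
The streamed-response feature relies on the Ollama JavaScript client. The snippet below is a minimal sketch of how a streamed chat can be consumed with the `ollama` package; the model name and prompt are placeholders, not values used by OllamaDesk.

```js
// Minimal streaming sketch (ESM, e.g. save as stream-example.mjs).
// The model name and prompt below are placeholders.
import ollama from 'ollama';

const stream = await ollama.chat({
  model: 'llama3',
  messages: [{ role: 'user', content: 'Explain what OllamaDesk does in one sentence.' }],
  stream: true,
});

// Each chunk carries a partial message; printing chunks as they arrive
// produces the "typing" effect seen in the chat interface.
for await (const chunk of stream) {
  process.stdout.write(chunk.message.content);
}
```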
- Node.js (version 14 or higher)
- npm (Node Package Manager)
git clone https://github.com/skhelladi/OllamaDesk.git
cd OllamaDesk
npm install express sqlite3 ollama crypto-js node-cache nodemon ip
Follow the instructions on the Ollama website to install Ollama and download the models you want to use (for example, `ollama pull llama3`).
node server.mjs
The server starts and listens on port 3333 by default. You can access the application by navigating to http://localhost:3333 in your web browser.
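
As a rough picture of what the backend does, the sketch below shows an Express app that forwards a chat request to Ollama and listens on port 3333. It is an illustration only; the actual server.mjs, its routes, and its request shapes may differ (the /api/chat endpoint and body format here are assumptions).

```js
// Illustrative backend sketch only; the real server.mjs may differ.
import express from 'express';
import ollama from 'ollama';

const app = express();
app.use(express.json());

// Assumed endpoint: forwards { model, messages } to Ollama and returns the reply.
app.post('/api/chat', async (req, res) => {
  const { model, messages } = req.body;
  try {
    // Non-streaming call for brevity; OllamaDesk also supports streaming.
    const response = await ollama.chat({ model, messages });
    res.json(response.message);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

// 3333 is the default port mentioned above.
app.listen(3333, () => console.log('Listening on http://localhost:3333'));
```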
npm run dev
This starts the server with nodemon, which automatically restarts it when file changes are detected.
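
The `dev` script comes from package.json. A typical scripts section looks like the sketch below; the project's actual entries may differ.

```json
{
  "scripts": {
    "dev": "nodemon server.mjs"
  }
}
```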
- Send Message: Enter your message in the input field and click the send button to interact with the AI assistant.
- Stop Request: Click the stop button to stop the ongoing response generation.
- Retry: Click the retry button to resend the last user message and get a new response.
- Settings: Click the settings button to customize streaming, the temperature, and the model context size (see the sketch below for how these map onto Ollama requests).
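
These settings correspond to options accepted by the Ollama client. The sketch below shows how streaming, temperature, and context size might be passed through, and notes how a stop button can cancel generation; it is an assumption about the wiring, not the code OllamaDesk actually uses.

```js
// Sketch: how the settings dialog's values might map onto an Ollama request.
// The exact wiring inside OllamaDesk may differ; the model name is an example.
import ollama from 'ollama';

const settings = { stream: true, temperature: 0.7, contextSize: 4096 };

const stream = await ollama.chat({
  model: 'llama3',
  messages: [{ role: 'user', content: 'Hello!' }],
  stream: settings.stream,              // the "stream" setting
  options: {
    temperature: settings.temperature,  // sampling temperature
    num_ctx: settings.contextSize,      // context window size in tokens
  },
});

// Consume chunks exactly as in the streaming sketch above.
for await (const chunk of stream) {
  process.stdout.write(chunk.message.content);
}

// Stop button: ollama.abort() cancels any streamed requests still running.
// (An aborted stream rejects, so real code should catch that error.)
```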
This project is licensed under the GPL-3.0 License.
Unless you explicitly state otherwise, any contribution intentionally submitted by you for inclusion in this project shall be licensed as above, without any additional terms or conditions.
- Sofiane KHELLADI