Command suggestions for Zsh using Ollama. Uses local LLMs to suggest commands based on your input, history, and current directory.
- Suggests commands as you type
- Uses command history and directory contents for context
- Processes suggestions asynchronously
- Two modes: realtime (as you type) or manual (on demand)
- Configurable settings for model, suggestions, and context
Requirements:
- Zsh shell
- Ollama installed and running
- git for installation
- curl for API requests
- jq for JSON parsing
- ls, awk, tail, head, grep for directory and text processing
- zsh-async (included as submodule)
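One way to confirm these tools are on your PATH before installing (a hypothetical check, not part of the plugin):

```shell
# Report any missing dependency; prints nothing when all are available.
for cmd in git curl jq awk grep tail head; do
  command -v "$cmd" >/dev/null 2>&1 || echo "missing: $cmd"
done
```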
Installation (Oh My Zsh):
- Clone this repository into your Oh My Zsh custom plugins directory:
  git clone --recursive https://github.com/realies/zsh-ollama-suggest.git ${ZSH_CUSTOM:-~/.oh-my-zsh/custom}/plugins/zsh-ollama-suggest
- Add the plugin to your .zshrc:
  plugins=(... zsh-ollama-suggest)
Installation (manual):
- Clone the repository with submodules:
  git clone --recursive https://github.com/realies/zsh-ollama-suggest.git
- Source the plugin in your .zshrc:
  source /path/to/zsh-ollama-suggest/zsh-ollama-suggest.plugin.zsh
- Start typing a command (suggestions appear after 2 characters)
- Suggestions will appear below your input:
- Automatically in realtime mode
- On demand with Ctrl+X c in manual mode
- Navigation:
- Up/Down arrows to cycle through suggestions
- Ctrl+C to clear suggestions and reset
- Ctrl+X t to toggle between realtime/manual modes
- Ctrl+X c to manually trigger suggestions
The following variables can be set in your .zshrc before sourcing the plugin:
# Ollama model to use (default: llama3.2:3b)
typeset -g ZSH_OLLAMA_SUGGEST_MODEL='llama3.2:3b'
# Ollama server URL (default: http://localhost:11434)
typeset -g ZSH_OLLAMA_SUGGEST_URL='http://localhost:11434'
# Maximum number of suggestions to show (default: 5)
typeset -g ZSH_OLLAMA_SUGGEST_MAX_SUGGESTIONS=5
# Number of history entries to consider (default: 1000)
typeset -g ZSH_OLLAMA_SUGGEST_HISTORY_SIZE=1000
# Model temperature for suggestion diversity (default: 0.1)
typeset -g ZSH_OLLAMA_SUGGEST_TEMPERATURE=0.1
# Number of directory entries to show in context (default: 25)
typeset -g ZSH_OLLAMA_SUGGEST_DIR_LIST_SIZE=25
# Suggestion mode: 'realtime' or 'manual' (default: realtime)
typeset -g ZSH_OLLAMA_SUGGEST_MODE='realtime'
# Debug logging (default: 0)
typeset -g ZSH_OLLAMA_SUGGEST_DEBUG=0
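For example, a .zshrc excerpt that switches to manual mode and trims the suggestion count might look like this (the model name and install path are illustrative; use any model from `ollama list` and your actual clone location):

```shell
# Example ~/.zshrc excerpt: overrides must come before the plugin is sourced.
typeset -g ZSH_OLLAMA_SUGGEST_MODE='manual'         # only suggest on Ctrl+X c
typeset -g ZSH_OLLAMA_SUGGEST_MAX_SUGGESTIONS=3
typeset -g ZSH_OLLAMA_SUGGEST_MODEL='llama3.2:3b'   # any locally pulled model
source /path/to/zsh-ollama-suggest/zsh-ollama-suggest.plugin.zsh
```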
If you encounter issues:
- Enable debug mode by setting ZSH_OLLAMA_SUGGEST_DEBUG=1 in your .zshrc
- Debug logs are written to /tmp/zsh-ollama-suggest.log with millisecond timestamps
- Ensure Ollama is running and accessible at your configured URL
- Verify your model is downloaded (ollama list)
- Check the log file for detailed operation tracing
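To check connectivity by hand, Ollama's HTTP API lists installed models at /api/tags; piping through jq (already required by the plugin) extracts their names. The URL below assumes the default; substitute your ZSH_OLLAMA_SUGGEST_URL if you changed it:

```shell
# List model names known to the Ollama server (default URL assumed).
# /api/tags returns a JSON body of the shape {"models":[{"name":"..."},...]}.
curl -sf http://localhost:11434/api/tags | jq -r '.models[].name'
```

If this prints nothing or curl fails, the server is unreachable; if your configured model is absent from the list, pull it with `ollama pull <model>`.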
MIT License - see LICENSE for details.
- Ollama for the local LLM runtime
- zsh-async for async processing
- zsh-autosuggestions for inspiration