AI Gateway for Subscription-Based LLMs
Turn your Claude Pro, GitHub Copilot, and Gemini subscriptions into standard LLM APIs. No API keys needed.
- Multi-Provider — Claude, Copilot, Gemini, Codex, Qwen, Kiro, and more
- Multi-Format — OpenAI, Anthropic, Gemini, Ollama compatible endpoints
- Multi-Account — Load balance across accounts, auto-retry on quota limits
- Zero Config — OAuth login, no API keys required
```bash
# Install
curl -fsSL https://raw.githubusercontent.com/nghyane/llm-mux/main/install.sh | bash

# Login to a provider
llm-mux --antigravity-login   # Google Gemini
llm-mux --claude-login        # Claude Pro/Max
llm-mux --copilot-login       # GitHub Copilot

# Start server
llm-mux

# Test
curl http://localhost:8317/v1/models
```

Base URL: `http://localhost:8317`
API Key: unused (or any string)
```bash
# OpenAI format
curl http://localhost:8317/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.5-pro", "messages": [{"role": "user", "content": "Hello!"}]}'
```

Works with: Cursor, Aider, Claude Code, Cline, Continue, OpenCode, LangChain, Open WebUI, and any OpenAI/Anthropic/Gemini compatible tool.
📖 https://nghyane.github.io/llm-mux/
- Installation — Install, update, uninstall
- Providers — All providers and login commands
- Configuration — Config file reference
- Integrations — Editor and framework setup
- Docker — Container deployment
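For container deployment, a hypothetical Compose sketch is below — the image name, config path, and volume mount are assumptions, not confirmed by this README; see the Docker page in the docs for the actual values:

```yaml
# Hypothetical sketch: image name and config path are assumed.
services:
  llm-mux:
    image: ghcr.io/nghyane/llm-mux:latest   # assumed image name
    ports:
      - "8317:8317"                         # default gateway port
    volumes:
      - ./llm-mux-data:/root/.llm-mux       # assumed auth/config path
```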
MIT