
llm-mux

AI Gateway for Subscription-Based LLMs

Turn your Claude Pro, GitHub Copilot, and Gemini subscriptions into standard LLM APIs. No API keys needed.

Features

  • Multi-Provider — Claude, Copilot, Gemini, Codex, Qwen, Kiro, and more
  • Multi-Format — OpenAI, Anthropic, Gemini, Ollama compatible endpoints
  • Multi-Account — Load balance across accounts, auto-retry on quota limits
  • Zero Config — OAuth login, no API keys required

Quick Start

# Install
curl -fsSL https://raw.githubusercontent.com/nghyane/llm-mux/main/install.sh | bash

# Login to a provider
llm-mux --antigravity-login   # Google Gemini
llm-mux --claude-login        # Claude Pro/Max
llm-mux --copilot-login       # GitHub Copilot

# Start server
llm-mux

# Test
curl http://localhost:8317/v1/models

Usage

Base URL: http://localhost:8317
API Key:  unused (or any string)

# OpenAI format
curl http://localhost:8317/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.5-pro", "messages": [{"role": "user", "content": "Hello!"}]}'

Works with: Cursor, Aider, Claude Code, Cline, Continue, OpenCode, LangChain, Open WebUI, and any OpenAI/Anthropic/Gemini compatible tool.
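
For tools that speak the OpenAI protocol, pointing them at llm-mux usually comes down to overriding the base URL and supplying a dummy key. The variable names below are the ones the official OpenAI SDKs read; other tools may use different names (for example OPENAI_API_BASE), so treat this as a sketch and check each tool's configuration docs.

# Sketch: point an OpenAI-compatible tool at llm-mux
# (variable names vary per tool; these are what the official OpenAI SDKs read)
export OPENAI_BASE_URL="http://localhost:8317/v1"
export OPENAI_API_KEY="unused"   # any string works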

Documentation

📖 https://nghyane.github.io/llm-mux/

License

MIT