
feat(opencode): add auto loading models for litellm providers #13896

Open

farukcankaya wants to merge 1 commit into anomalyco:dev from farukcankaya:auto-load-available-models-for-litellm

Conversation

@farukcankaya

What does this PR do?

Fixes #13891

This PR adds an auto-loading models feature for LiteLLM proxy providers in OpenCode. When a provider is configured with both litellmProxy: true and autoload: true, OpenCode automatically fetches the available models from the LiteLLM proxy's /model/info endpoint (falling back to the /models endpoint). This eliminates the need to manually define every model in opencode.json when a LiteLLM proxy exposes many models.
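
For reference, a minimal sketch of what such a loading flow can look like. The /model/info and /models endpoints come from the description above; the function name fetchLiteLLMModels, the response shapes, and the error handling are illustrative assumptions, not OpenCode's actual implementation:

  // Hypothetical sketch; names and response shapes are assumed,
  // not taken from the OpenCode codebase.
  async function fetchLiteLLMModels(baseURL: string, apiKey: string): Promise<string[]> {
    const headers = { Authorization: `Bearer ${apiKey}` }

    // Primary source: LiteLLM's /model/info, which returns per-model metadata.
    const info = await fetch(`${baseURL}/model/info`, { headers })
    if (info.ok) {
      const body = (await info.json()) as { data: { model_name: string }[] }
      return body.data.map((m) => m.model_name)
    }

    // Fallback: the OpenAI-compatible /models listing.
    const models = await fetch(`${baseURL}/models`, { headers })
    if (!models.ok) throw new Error(`LiteLLM proxy returned HTTP ${models.status}`)
    const body = (await models.json()) as { data: { id: string }[] }
    return body.data.map((m) => m.id)
  }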

How did you verify your code works?

  • Unit tests
  • Manual testing, e.g. with the following opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "model": "InternalLiteLLMGateway/anthropic/claude-opus-4-5",
  "provider": {
    "InternalLiteLLMGateway": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "InternalLiteLLMGateway",
      "options": {
        "baseURL": "https://litellm.internal",
        "litellmProxy": true,
        "autoload": true,
        "apiKey": "sk-xxx",
        "headers": {
          "Authorization": "Bearer sk-xxx"
        }
      }
    }
  }
}
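
With a config like this, both flags are set, so autoload kicks in and the fetched models become available under the InternalLiteLLMGateway/ prefix. The gating this exercises would presumably look something like the sketch below (the option names come from the config above; the surrounding code and the fetchLiteLLMModels helper from the earlier sketch are assumptions):

  // Autoload only when both flags are set, per the PR description.
  if (options.litellmProxy && options.autoload) {
    const modelIDs = await fetchLiteLLMModels(options.baseURL, options.apiKey)
    // ...merge modelIDs into the provider's configured model list
  }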
Demo video: litellm-auto-model-loader.mp4



Development

Successfully merging this pull request may close these issues.

[FEATURE]: Auto-load available models from LiteLLM proxy with autoload option
