
Conversation

ishaan-jaff
Contributor

Title

Include model_id in Router Error Messages for Enhanced Debugging

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement - see details
    Note: An existing test (test_router_fallbacks_with_model_id) was identified as covering this change, but no new test was explicitly added during the session.
  • I have added a screenshot of my new test passing locally
    Note: No new test was added, so no screenshot is available.
  • My PR passes all unit tests on make test-unit
    Note: make test-unit was not run during the session.
  • My PR's scope is as isolated as possible; it only solves 1 specific problem

Type

🆕 New Feature
🐛 Bug Fix

Changes

This PR enhances the error messages generated by the LiteLLM Router's fallback mechanism to include the specific model_id when a model group fails.

Why this change?
Previously, error messages only displayed the model_group, making it difficult to pinpoint the exact failing deployment when a model group contains multiple deployments. Including the model_id makes debugging significantly easier, allowing developers to quickly identify and address issues with individual model deployments.

How it works:
The model_id is extracted from kwargs.get("model_info", {}).get("id") within the router's exception handling. If available, it is appended to the error message. This change is backward compatible; if no model_id is present, the error message format remains unchanged.
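
For illustration only, here is a minimal sketch of the pattern described above. The helper name and the surrounding code are hypothetical rather than the actual Router internals; only the kwargs lookup and the appended line mirror the description in this PR.

```python
from typing import Optional


def build_fallback_error_message(
    model_group: str,
    fallbacks: Optional[list],
    kwargs: dict,
) -> str:
    """Hypothetical helper mirroring the described behavior: pull the
    deployment id out of kwargs["model_info"] and, when present, append
    it to the fallback error message."""
    msg = (
        f"Received Model Group={model_group}\n"
        f"Available Model Group Fallbacks={fallbacks}"
    )
    model_id = kwargs.get("model_info", {}).get("id")
    if model_id is not None:
        # Only add the extra line when a deployment id is available, so
        # messages without a model_id keep their original format.
        msg += f"\nReceived Model ID={model_id}"
    return msg


# Example kwargs shaped like what the Router passes to its exception handling
print(
    build_fallback_error_message(
        model_group="gpt-3.5-turbo",
        fallbacks=None,
        kwargs={"model_info": {"id": "model-id-12345"}},
    )
)
```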

Example Error Message Enhancement:

Before:

Received Model Group=gpt-3.5-turbo
Available Model Group Fallbacks=None

After:

Received Model Group=gpt-3.5-turbo
Available Model Group Fallbacks=None
Received Model ID=model-id-12345
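
As a usage note, downstream tooling that parses these messages (for log scraping or alerting) can pick out the new field while staying compatible with the old two-line format. The snippet below is a self-contained illustration; the function name and regex are examples, not part of LiteLLM:

```python
import re
from typing import Optional


def extract_model_id(error_message: str) -> Optional[str]:
    """Return the deployment id from an enhanced fallback error message,
    or None for old-format messages that only contain the model group."""
    match = re.search(r"Received Model ID=(\S+)", error_message)
    return match.group(1) if match else None


old_message = (
    "Received Model Group=gpt-3.5-turbo\n"
    "Available Model Group Fallbacks=None"
)
new_message = old_message + "\nReceived Model ID=model-id-12345"

assert extract_model_id(old_message) is None
assert extract_model_id(new_message) == "model-id-12345"
```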

Slack Thread



cursor bot commented Sep 11, 2025

Cursor Agent can help with this pull request. Just @cursor in comments and I'll start working on changes in this branch.
Learn more about Cursor Agents


vercel bot commented Sep 11, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | Ready | Preview | Comment | Sep 11, 2025 10:18pm |

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.
