@JimiHFord commented Dec 30, 2025

🚀 Multi-SCM Provider Support & LiteLLM Integration

Summary

This PR introduces two major enhancements:

  • Multi-SCM Provider Support: Adds Bitbucket Server alongside existing GitLab integration using a strategy pattern.
  • LiteLLM Integration: Replaces direct pydantic-ai provider usage with LiteLLM for unified LLM provider support (100+ providers).

New Features

🔌 SCM Provider Abstraction

  • Strategy pattern implementation with abstract SCMProvider base class.
  • GitLab provider: Refactored existing GitLab code into a dedicated provider.
  • Bitbucket Server provider: New implementation using atlassian-python-api.
  • Provider-agnostic terminology: Repository (not project), Pull Request (not MR), Namespace (not group).
  • Factory pattern for provider instantiation based on SCM_PROVIDER environment variable.
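The strategy-plus-factory shape described above can be sketched as follows (class names match the PR's terminology, but the method names and return values here are illustrative, not the PR's actual API):

```python
from abc import ABC, abstractmethod


class SCMProvider(ABC):
    """Abstract strategy: one concrete subclass per SCM backend."""

    @abstractmethod
    def get_pull_request(self, repo: str, pr_id: int) -> dict:
        """Fetch a pull request using provider-agnostic terminology."""


class GitLabProvider(SCMProvider):
    def get_pull_request(self, repo: str, pr_id: int) -> dict:
        # Real implementation would call the GitLab API (MR endpoints).
        return {"provider": "gitlab", "repo": repo, "id": pr_id}


class BitbucketServerProvider(SCMProvider):
    def get_pull_request(self, repo: str, pr_id: int) -> dict:
        # Real implementation would use atlassian-python-api.
        return {"provider": "bitbucket_server", "repo": repo, "id": pr_id}


_PROVIDERS = {
    "gitlab": GitLabProvider,
    "bitbucket_server": BitbucketServerProvider,
}


def create_provider(name: str) -> SCMProvider:
    """Factory keyed by the SCM_PROVIDER environment variable's value."""
    try:
        return _PROVIDERS[name]()
    except KeyError:
        raise ValueError(f"Unsupported SCM_PROVIDER: {name!r}")
```

Callers such as `handlers/cronjob.py` then depend only on the `SCMProvider` interface, so adding another backend means adding one subclass and one registry entry.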

🤖 LiteLLM Integration

  • Unified LLM interface supporting 100+ providers (OpenAI, Azure, Anthropic, Bedrock, Vertex, Ollama, etc.).
  • Simplified configuration: Provider inferred from model name prefix (e.g., azure/gpt-4o, anthropic/claude-3-opus).
  • Automatic quirk handling: LiteLLM manages provider-specific differences like max_tokens vs. max_completion_tokens.
  • Built-in retry support: Removed custom retry_client.py in favor of LiteLLM's native retry handling (num_retries parameter). LiteLLM automatically handles rate limits (429), server errors (500/502/503/504), and respects Retry-After headers.
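As an illustration of the prefix-based inference, here is a simplified stand-in (not LiteLLM's API; LiteLLM resolves this internally and also recognizes known bare model names such as `claude-*` via its model registry, which this sketch omits):

```python
def infer_provider(model: str) -> str:
    """Simplified mirror of LiteLLM's routing convention:
    an explicit 'provider/' prefix wins; otherwise route to OpenAI."""
    if "/" in model:
        return model.split("/", 1)[0]
    return "openai"


# In the agents, calls go through LiteLLM's single entry point, e.g.:
#   litellm.completion(model=model, messages=msgs, num_retries=3)
# which replaces the removed utils/retry_client.py.
```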

Files Changed

  • New SCM providers: scm_providers/ (base, factory, gitlab, bitbucket_server)
  • New LLM module: llm/ (litellm_model, factory)
  • Updated agents: analyzer.py, documenter.py, ai_rules_generator.py
  • Updated config: config.py, .env.sample
  • Refactored: handlers/cronjob.py (now uses the SCM interface)
  • Removed: utils/retry_client.py (replaced by LiteLLM retries)

Configuration

SCM Provider Selection

SCM_PROVIDER=gitlab  # or bitbucket_server

LLM Provider Examples

# Each agent can use a different provider
ANALYZER_LLM_MODEL=azure/gpt-4o-deployment    # Uses AZURE_* env vars
DOCUMENTER_LLM_MODEL=claude-sonnet-4-20250514 # Uses ANTHROPIC_API_KEY
AI_RULES_LLM_MODEL=gpt-4o                     # Uses OPENAI_API_KEY
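The per-agent variables above can be resolved with a small lookup along these lines (the helper name and default are hypothetical, not the PR's code):

```python
import os


def model_for_agent(agent: str, default: str = "gpt-4o") -> str:
    """Map an agent name to its *_LLM_MODEL variable,
    e.g. "analyzer" -> ANALYZER_LLM_MODEL, falling back to a default."""
    return os.environ.get(f"{agent.upper()}_LLM_MODEL", default)
```

Because each agent reads its own variable, the analyzer can run on Azure while the documenter uses Anthropic, with no code changes.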

Deprecations

  • --group-project-id CLI arg deprecated in favor of --namespace-id for provider-agnostic terminology. Still works but emits deprecation warning. Will be removed in v2.0.
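A sketch of how such a deprecated alias can keep working while warning (the flag wiring here is illustrative, assuming argparse):

```python
import argparse
import warnings


def parse_args(argv: list[str]) -> argparse.Namespace:
    parser = argparse.ArgumentParser()
    parser.add_argument("--namespace-id")
    parser.add_argument("--group-project-id",
                        help="deprecated alias for --namespace-id")
    args = parser.parse_args(argv)
    if args.group_project_id is not None:
        warnings.warn(
            "--group-project-id is deprecated and will be removed in v2.0; "
            "use --namespace-id",
            DeprecationWarning,
            stacklevel=2,
        )
        # Honor the old flag only when the new one is absent.
        if args.namespace_id is None:
            args.namespace_id = args.group_project_id
    return args
```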

Dependencies Added

  • litellm>=1.55.0: Unified LLM provider interface.
  • atlassian-python-api>=3.41.0: Bitbucket Server API client.

Testing

  • ✅ Analysis tested with Azure OpenAI via LiteLLM.
  • ✅ All 5 analyzer agents complete successfully.
  • ✅ Linting passes (ruff check src/).
  • ✅ Deprecation warning verified for --group-project-id.
