Conversation

@alex-ciobotea-dvloper commented Aug 13, 2025

Adds a new integration guide: Ollama + LiteLLM.

  • New page: fern/integrations/ollama-litellm.mdx
  • Sidebar entry in fern/docs.yml under Integrations
  • Covers two paths:
    • Direct Ollama via ENABLE_OLLAMA, OLLAMA_SERVER_URL, OLLAMA_MODEL (example below)
    • OpenAI-compatible via LiteLLM using ENABLE_OPENAI_COMPATIBLE and OPENAI_COMPATIBLE_* env vars
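
A minimal .env sketch for the direct Ollama path, assuming Ollama's default port (the model name is illustrative):

```bash
# Direct Ollama path (llama3 stands in for whichever model you've pulled)
ENABLE_OLLAMA=true
OLLAMA_SERVER_URL=http://localhost:11434
OLLAMA_MODEL=llama3
```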

Closes #242.


📚 This PR adds comprehensive documentation for integrating Skyvern with local LLMs via Ollama and OpenAI-compatible proxies like LiteLLM, providing users with cost-effective alternatives to expensive cloud-based models like GPT-4.

🔍 Detailed Analysis

Key Changes

  • Documentation Addition: New integration guide ollama-litellm.mdx with detailed setup instructions for both direct Ollama and LiteLLM proxy configurations
  • Navigation Update: Added sidebar entry in docs.yml under the Integrations section for easy discoverability
  • Configuration Coverage: Comprehensive documentation of environment variables for both integration paths (ENABLE_OLLAMA, OLLAMA_*, ENABLE_OPENAI_COMPATIBLE, OPENAI_COMPATIBLE_*)

Technical Implementation

```mermaid
flowchart TD
    A[User Setup] --> B{Choose Integration Path}
    B -->|Direct| C[Ollama Server]
    B -->|Proxy| D[LiteLLM Proxy]
    C --> E[Configure ENABLE_OLLAMA=true]
    C --> F[Set OLLAMA_SERVER_URL & MODEL]
    D --> G[Configure ENABLE_OPENAI_COMPATIBLE=true]
    D --> H[Set OPENAI_COMPATIBLE_* vars]
    E --> I[Start Skyvern]
    F --> I
    G --> I
    H --> I
    I --> J[Local LLM Integration Active]
```
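
For the proxy path, a minimal sketch of standing up LiteLLM in front of a local Ollama model (the model name and port are illustrative; check the LiteLLM docs for your setup):

```bash
# Install LiteLLM with proxy support, then expose an Ollama model
# behind an OpenAI-compatible endpoint (port 4000 is illustrative)
pip install 'litellm[proxy]'
litellm --model ollama/llama3 --port 4000
# Skyvern can then point OPENAI_COMPATIBLE_API_BASE at http://localhost:4000/v1
```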

Impact

  • Cost Reduction: Addresses the expensive GPT-4 usage issue mentioned in GitHub issues ($60 in 3 days), providing free local alternatives
  • Accessibility: Makes Skyvern more accessible to users who want to avoid cloud API costs or have privacy concerns
  • Flexibility: Supports both direct Ollama integration and OpenAI-compatible proxy setups, accommodating different user preferences and technical requirements
  • Community Response: Directly addresses requests from issue #242 ("How does skyvern integrate with ollama litellm") for Ollama/LiteLLM support, with clear implementation guidance

Created with Palmier


Important

Adds Ollama + LiteLLM integration guide with configuration paths for Skyvern.

  • Documentation:
    • Adds ollama-litellm.mdx for Ollama + LiteLLM integration guide.
    • Updates docs.yml to include Ollama + LiteLLM in the Integrations sidebar.
  • Integration Paths:
    • Direct Ollama integration using ENABLE_OLLAMA, OLLAMA_SERVER_URL, OLLAMA_MODEL.
    • OpenAI-compatible integration via LiteLLM using ENABLE_OPENAI_COMPATIBLE and related env vars.

This description was created by Ellipsis for a4e2974. You can customize this summary. It will automatically update as commits are pushed.

@ellipsis-dev (bot) left a comment

Caution

Changes requested ❌

Reviewed everything up to a4e2974 in 2 minutes and 40 seconds.
  • Reviewed 117 lines of code in 2 files
  • Skipped 0 files when reviewing.
  • Skipped posting 2 draft comments. View those below.
1. fern/docs.yml:153
  • Draft comment:
    Added navigation entry for the new integration. Please verify that the ordering within the Integrations section meets your intended convention.
  • Reason this comment was not posted:
    Confidence changes required: 33% <= threshold 50%.
2. fern/integrations/ollama-litellm.mdx:22
  • Draft comment:
    The comment 'The API is usually at http://localhost:11434.' appears within the bash code block. Consider closing the code block before this note or moving the note outside to avoid confusion.
  • Reason this comment was not posted:
    Decided after close inspection that this draft comment was likely wrong and/or not actionable: usefulness confidence = 10% vs. threshold = 50%. The comment is technically correct: explanatory text is mixed inside a code block. However, this is a documentation file and the issue is minor. The information is still clear and understandable to readers, and the URL is useful to have near the commands, so this feels like an overly pedantic formatting suggestion. The inconsistency could in principle confuse users copy-pasting commands, but the risk is very low since the text clearly reads as a note, not a command, and sits on its own line after the actual commands. Most users would understand this intuitively. While technically correct, this comment is too minor for a documentation file; the current format is clear enough for readers.

Workflow ID: wflow_JoJ2avpDw74Jdnq0

You can customize Ellipsis by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.

@alex-ciobotea-dvloper (Author)

Thanks for the review!

I addressed the formatting suggestions:

  • Converted section titles to proper headings (“## B) OpenAI-compatible via LiteLLM”, “## Troubleshooting”, “## Internal References”).
  • Moved explanatory notes out of code fences and kept commands inside fenced blocks.
  • Grouped env snippets by scenario.
  • Added a Verify your setup section with quick curl checks for Ollama and LiteLLM (a sketch of such checks follows).
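
A sketch of what such verification checks might look like (the proxy port and API key are illustrative):

```bash
# Verify Ollama is up and list the models it has available
curl http://localhost:11434/api/tags

# Verify the LiteLLM proxy answers OpenAI-style chat requests
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-anything" \
  -d '{"model": "local-llama", "messages": [{"role": "user", "content": "ping"}]}'
```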

Navigation:

  • Kept the new page at the end of the Integrations list in fern/docs.yml to match current ordering; happy to reorder if preferred.

Vars match code:

  • Ollama: ENABLE_OLLAMA, OLLAMA_SERVER_URL, OLLAMA_MODEL
  • OpenAI-compatible: ENABLE_OPENAI_COMPATIBLE, OPENAI_COMPATIBLE_MODEL_NAME, OPENAI_COMPATIBLE_API_KEY, OPENAI_COMPATIBLE_API_BASE, OPENAI_COMPATIBLE_API_VERSION, OPENAI_COMPATIBLE_SUPPORTS_VISION, OPENAI_COMPATIBLE_REASONING_EFFORT (example below)
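
Putting those together, a minimal .env sketch for the OpenAI-compatible path (all values are illustrative; point the API base at your proxy):

```bash
# OpenAI-compatible path via a LiteLLM proxy (values are illustrative)
ENABLE_OPENAI_COMPATIBLE=true
OPENAI_COMPATIBLE_MODEL_NAME=local-llama
OPENAI_COMPATIBLE_API_KEY=sk-anything
OPENAI_COMPATIBLE_API_BASE=http://localhost:4000/v1
OPENAI_COMPATIBLE_SUPPORTS_VISION=false
# OPENAI_COMPATIBLE_API_VERSION and OPENAI_COMPATIBLE_REASONING_EFFORT are
# optional; leave them unset for most local models
```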

@suchintan enabled auto-merge (squash) August 28, 2025 20:28