
Ragpi

Ragpi is an open-source AI assistant that answers questions using your documentation, GitHub issues, and READMEs. It combines LLMs with intelligent search to provide relevant, documentation-backed answers through a simple API. It supports multiple providers like OpenAI, Ollama, and Deepseek, and has built-in integrations with Discord and Slack.

Documentation | API Reference | Join Discord

Key Features

  • 📚 Builds knowledge bases from docs, GitHub issues, and READMEs
  • 🤖 Agentic RAG system for dynamic document retrieval
  • 🔌 Supports OpenAI, Ollama, Deepseek & OpenAI-compatible models
  • 💬 Discord and Slack integrations for community support
  • 🚀 API-first design with Docker deployment

Example Workflow

Here's a simple workflow to get started with Ragpi once it's deployed:

1. Set up a Source with a Connector

  • Use the /sources endpoint to configure a source with your chosen connector.
  • Each connector type has its own configuration parameters.

Example payload using the Sitemap connector:

{
  "name": "example-docs",
  "description": "Documentation for example project. It contains information about configuration, usage, and deployment.",
  "connector": {
    "type": "sitemap",
    "sitemap_url": "https://docs.example.com/sitemap.xml"
  }
}
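
For example, once your Ragpi instance is running, you can submit this payload with a short script. This is a minimal sketch: the base URL is a placeholder for wherever you deployed Ragpi, and any authentication headers are omitted, so check the API reference for the exact scheme.

import requests

# Hypothetical base URL; point this at your own Ragpi deployment.
BASE_URL = "http://localhost:8000"

payload = {
    "name": "example-docs",
    "description": (
        "Documentation for example project. It contains information "
        "about configuration, usage, and deployment."
    ),
    "connector": {
        "type": "sitemap",
        "sitemap_url": "https://docs.example.com/sitemap.xml",
    },
}

# Create the source via the /sources endpoint and inspect whatever the API returns.
response = requests.post(f"{BASE_URL}/sources", json=payload)
response.raise_for_status()
print(response.json())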

2. Monitor Source Synchronization

  • After adding a source, documents will be synced automatically. You can monitor the sync process through the /tasks endpoint.
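
To watch the sync from a script, you can poll the /tasks endpoint until nothing is left in progress. This is a rough sketch: the response shape, the "status" field, and its values are assumptions for illustration, so consult the API reference for the real schema.

import time
import requests

BASE_URL = "http://localhost:8000"  # hypothetical deployment URL

def wait_for_tasks(poll_seconds: int = 10) -> None:
    """Poll /tasks and print task statuses until no task appears to be running.

    Assumes /tasks returns a JSON list of tasks with "id" and "status" fields;
    adjust to the actual response documented in the API reference.
    """
    while True:
        tasks = requests.get(f"{BASE_URL}/tasks").json()
        for task in tasks:
            print(task.get("id"), task.get("status"))
        pending = [t for t in tasks if t.get("status") not in ("SUCCESS", "FAILURE")]
        if not pending:
            break
        time.sleep(poll_seconds)

wait_for_tasks()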

3. Chat with the AI Assistant

  • Use the /chat endpoint to query the AI assistant using the configured sources:

    {
      "sources": ["example-docs"],
      "messages": [
        { "role": "user", "content": "How do I deploy the example project?" }
      ]
    }
  • You can also interact with the AI assistant through the Discord or Slack integration.
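
Programmatically, the same chat request might look like the sketch below, again assuming a local deployment with no auth header; inspect the response from your instance rather than relying on any particular fields.

import requests

BASE_URL = "http://localhost:8000"  # hypothetical deployment URL

chat_request = {
    "sources": ["example-docs"],
    "messages": [
        {"role": "user", "content": "How do I deploy the example project?"}
    ],
}

# Ask the assistant; it retrieves from the listed sources before answering.
response = requests.post(f"{BASE_URL}/chat", json=chat_request)
response.raise_for_status()
print(response.json())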

Connectors

Ragpi supports the following connectors for building knowledge bases:

  • Documentation Website (Sitemap)
  • GitHub Issues
  • GitHub README Files

Explore connectors →
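
As a rough illustration, a GitHub Issues source uses the same /sources payload shape with a different connector block. The connector type identifier and field names below are hypothetical, so check the connectors documentation for the actual parameters.

# Hypothetical payload sketch for a GitHub Issues connector.
# Only the overall shape mirrors the sitemap example above; the
# connector type and field names are illustrative, not the documented schema.
github_issues_source = {
    "name": "example-issues",
    "description": "Open and closed issues for the example project.",
    "connector": {
        "type": "github_issues",      # assumed type identifier
        "repo_owner": "example-org",  # hypothetical field name
        "repo_name": "example",       # hypothetical field name
    },
}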

Providers

Ragpi supports the following LLM providers for generating responses and embeddings:

  • OpenAI (default)
  • Ollama
  • Deepseek
  • OpenAI-compatible APIs

Configure providers →

Integrations

Ragpi supports the following integrations for interacting with the AI assistant:

  • Discord
  • Slack

Contributing

Contributions to Ragpi are welcome! Please check out the contributing guidelines for more information.