# RagBase - Private Chat with Your Documents

Completely local RAG with a chat UI.

## Demo

Check out RagBase on Streamlit Cloud. The demo runs with the Groq API.

## Installation

Clone the repo:

```sh
git clone [email protected]:curiousily/ragbase.git
cd ragbase
```

Install the dependencies (requires Poetry):

```sh
poetry install
```

Fetch your LLM (gemma2:9b by default):

```sh
ollama pull gemma2:9b
```

Run the Ollama server:

```sh
ollama serve
```

Start RagBase:

```sh
poetry run streamlit run app.py
```

## Architecture

### Ingestor

Extracts text from PDF documents and creates chunks (using semantic and character splitters) that are stored in a vector database.
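The character-splitting idea can be sketched as a sliding window with overlap. This is a minimal, self-contained illustration; the function and parameter names are made up for this example and are not RagBase's actual API.

```python
# Illustrative character splitter: fixed-size windows with overlap,
# so context at chunk boundaries is not lost.

def split_text(text: str, chunk_size: int = 100, overlap: int = 20) -> list[str]:
    """Split text into overlapping character chunks."""
    chunks = []
    start = 0
    step = chunk_size - overlap  # advance less than a full chunk each time
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks

chunks = split_text("a" * 250)
print(len(chunks))  # 4 overlapping windows over the 250-char document
```

In RagBase this step additionally uses a semantic splitter, which breaks on meaning boundaries rather than a fixed character count.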

### Retriever

Given a query, searches for similar documents, reranks the results, and applies an LLM chain filter before returning the response.
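The three stages above (search, rerank, filter) can be sketched as a toy pipeline. The word-overlap scoring here is a stand-in for demonstration only; RagBase uses a real vector store, a reranker, and an LLM chain filter.

```python
# Toy retrieve -> rerank -> filter pipeline with a placeholder score.

def word_overlap(query: str, doc: str) -> int:
    # stand-in for vector similarity: count shared lowercase words
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 3) -> list[str]:
    # stage 1: similarity search over the document store
    return sorted(docs, key=lambda d: word_overlap(query, d), reverse=True)[:k]

def rerank(query: str, docs: list[str]) -> list[str]:
    # stage 2: a (here trivial) reranker re-orders the candidates
    return sorted(docs, key=lambda d: word_overlap(query, d), reverse=True)

def filter_relevant(query: str, docs: list[str]) -> list[str]:
    # stage 3: drop candidates unrelated to the query
    # (RagBase does this with an LLM chain filter instead)
    return [d for d in docs if word_overlap(query, d) > 0]

docs = ["cats purr loudly", "dogs bark", "the stock market fell"]
query = "why do cats purr"
result = filter_relevant(query, rerank(query, retrieve(query, docs)))
print(result)  # ['cats purr loudly']
```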

### QA Chain

Combines the LLM with the retriever to answer a given user question.
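Conceptually, the chain stuffs the retrieved chunks into a prompt and passes it to the model. A hedged sketch, where `fake_llm` stands in for the real model (Ollama or Groq) and all names are hypothetical:

```python
# Sketch of a stuff-style QA chain: retrieved chunks -> prompt -> LLM.

def build_prompt(question: str, chunks: list[str]) -> str:
    context = "\n\n".join(chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

def qa_chain(question: str, retriever, llm) -> str:
    chunks = retriever(question)          # get relevant chunks
    return llm(build_prompt(question, chunks))  # ask the model

def fake_llm(prompt: str) -> str:
    # placeholder model; a real chain would call Ollama or Groq here
    return f"(answer based on a {len(prompt)}-char prompt)"

answer = qa_chain("What is RAG?", lambda q: ["chunk one", "chunk two"], fake_llm)
```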

## Tech Stack

## Add Groq API Key (Optional)

You can also use the Groq API instead of the local LLM. For that, you'll need a `.env` file with your Groq API key:

```sh
GROQ_API_KEY=YOUR_API_KEY
```
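At runtime the key would typically be read from the environment (a library like python-dotenv can load the `.env` file first). A small sketch of the fallback logic; `groq_enabled` is an illustrative name, not RagBase's actual code:

```python
import os

def groq_enabled() -> bool:
    # use the Groq API when a key is present, otherwise
    # fall back to the local Ollama model
    return bool(os.getenv("GROQ_API_KEY"))
```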