---
title: Ollama
---

You can use Ollama as an LLM provider for local development with RAGChat. To use an Ollama model, first initialize RAGChat with it:

```typescript
import { RAGChat, ollama } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  model: ollama("llama3.1"),
});
```
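For this to work, the Ollama server must be running locally and the model must already be downloaded. A minimal setup sketch, assuming a default Ollama installation (serving on `http://localhost:11434`), looks like:

```shell
# Download the model referenced in the RAGChat config (one-time step)
ollama pull llama3.1

# Start the Ollama server so RAGChat can reach it locally
ollama serve
```

Once the server is up, calls made through the `ragChat` instance above will be routed to the local `llama3.1` model instead of a hosted provider.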