Yii2 component for Ollama API with optional vector database support (e.g., Qdrant) for Retrieval-Augmented Generation (RAG).
- Connect to the Ollama API (`llama2`, `mistral`, `gemma`)
- Optional vector DB integration for context injection
- Supports the Yii2 HTTP client
- Embedding generation (`embedText`)
- Easy configuration via Yii2 components
- Multilingual exception messages (`Yii::t()`)
- Event support: `beforeGenerate`, `afterGenerate`, `generateError`
- Request and user automatically included in events for easy logging and auditing
Install via Composer:

```bash
composer require strtob/yii2-ollama
```
Make sure `yiisoft/yii2-httpclient` is installed, plus a Qdrant PHP client if you want vector DB support.
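For example (the HTTP client is the official Yii2 extension; which Qdrant PHP client you use is up to you, and its package name is not prescribed here):

```bash
composer require yiisoft/yii2-httpclient
```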
Migration: add the extension's migration namespace to your console configuration, then run the migrations.

```php
'controllerMap' => [
    'migrate' => [
        'class' => 'yii\console\controllers\MigrateController',
        'migrationNamespaces' => [
            'strtob\yii2Ollama\migrations',
        ],
    ],
],
```

```bash
yii migrate
```
Example `config/web.php` using a Qdrant adapter:

```php
use strtob\yii2Ollama\QdrantAdapter;

$qdrantAdapter = new QdrantAdapter($qdrantClient, 'my_collection');

return [
    // ...
    'components' => [
        'ollama' => [
            'class' => 'strtob\yii2Ollama\OllamaComponent',
            'apiUrl' => 'http://localhost:11434/v1/generate',
            'apiKey' => 'MY_SECRET_TOKEN',
            'model' => \strtob\yii2Ollama\OllamaComponent::MODEL_MISTRAL,
            'temperature' => 0.7,
            'maxTokens' => 512,
            'topP' => 0.9,
            'vectorDb' => $qdrantAdapter, // must implement VectorDbInterface
            'vectorDbTopK' => 5,
        ],
    ],
];
```
Usage:

```php
try {
    $prompt = "Explain RAG with vector DB.";
    $text = \Yii::$app->ollama->generateText($prompt);
    echo $text;

    // or, to also receive token information:
    print_r(\Yii::$app->ollama->generateTextWithTokens($prompt, []));
} catch (\yii\base\InvalidConfigException $e) {
    echo Yii::t('yii2-ollama', 'Configuration error: {message}', ['message' => $e->getMessage()]);
} catch (\strtob\yii2Ollama\OllamaApiException $e) {
    echo Yii::t('yii2-ollama', 'API request failed: {message}', ['message' => $e->getMessage()]);
}
```
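The exception messages use the `yii2-ollama` translation category. If the package does not register its translations automatically (an assumption; check the extension's bootstrap), you can register a message source yourself. The `basePath` below is a guess and may differ:

```php
'i18n' => [
    'translations' => [
        'yii2-ollama*' => [
            'class' => 'yii\i18n\PhpMessageSource',
            'basePath' => '@vendor/strtob/yii2-ollama/src/messages', // assumed path
        ],
    ],
],
```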
Streaming example:

```php
public function actionIndex()
{
    $response = Yii::$app->response;
    $response->format = \yii\web\Response::FORMAT_RAW;
    Yii::$app->response->isSent = true;

    // Clear any existing output buffers
    while (ob_get_level()) {
        ob_end_clean();
    }
    ob_implicit_flush(true);

    Yii::$app->ollama->stream = true;

    try {
        Yii::$app->ollama->generate("What's up?", [], function ($chunk) {
            echo $chunk;
            flush();
        });
    } catch (\Throwable $e) {
        echo "\n[Error]: " . $e->getMessage();
    }
}
```
`OllamaComponent` supports three main events during generation:

| Event | When Triggered | Data Included |
|---|---|---|
| `beforeGenerate` | Before sending a request to the Ollama API | `prompt`, `options`, `request`, `user` |
| `afterGenerate` | After receiving a successful response | `prompt`, `options`, `request`, `user`, `response` |
| `generateError` | When an exception occurs during generation | `prompt`, `options`, `request`, `user`, `exception` |
Example handlers:

```php
// Log after generation
\Yii::$app->ollama->on(\strtob\yii2Ollama\OllamaComponent::EVENT_AFTER_GENERATE, function ($event) {
    Yii::info("Prompt generated: {$event->data['prompt']}", 'ollama');
    Yii::info("User: " . ($event->data['user']->username ?? 'guest'), 'ollama');
});

// Handle errors
\Yii::$app->ollama->on(\strtob\yii2Ollama\OllamaComponent::EVENT_GENERATE_ERROR, function ($event) {
    Yii::error("Generation failed for prompt: {$event->data['prompt']}", 'ollama');
    Yii::error("Exception: " . $event->data['exception']->getMessage(), 'ollama');
});
```
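Alternatively, handlers can be attached directly in the component configuration using Yii2's standard `'on eventName'` syntax. The snippet below assumes the event names match the strings in the table above:

```php
'components' => [
    'ollama' => [
        'class' => 'strtob\yii2Ollama\OllamaComponent',
        // ... model, apiUrl, etc. as shown above ...
        'on afterGenerate' => function ($event) {
            Yii::info('Prompt generated: ' . $event->data['prompt'], 'ollama');
        },
        'on generateError' => function ($event) {
            Yii::error('Generation failed: ' . $event->data['exception']->getMessage(), 'ollama');
        },
    ],
],
```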
Generating embeddings with `embedText()`:

```php
try {
    $text = "Llamas are members of the camelid family.";
    $embeddingResult = \Yii::$app->ollama->embedText($text);

    echo "Embedding vector:\n";
    print_r($embeddingResult['embedding']); // show only the embedding values
} catch (\yii\base\InvalidConfigException $e) {
    echo "Configuration error: " . $e->getMessage();
} catch (\strtob\yii2Ollama\OllamaApiException $e) {
    echo "Embedding request failed: " . $e->getMessage();
}
```
You can use an ActiveRecord model to handle documents and automatically generate embeddings for them in your vector database. PDF, TXT, and DOCX uploads are supported.
```php
use app\models\DocumentModel;
use yii\web\UploadedFile;

// 1) Create a new document
$doc = new DocumentModel();
$doc->title = 'Sample PDF';
$doc->user_id = 1;
$doc->uploadedFile = UploadedFile::getInstance($doc, 'uploadedFile');
$doc->save(); // extracts text and stores embeddings automatically

// 2) Update document content and embeddings
$doc = DocumentModel::findOne($id);
$doc->title = 'Updated Title';
$doc->uploadedFile = UploadedFile::getInstance($doc, 'uploadedFile'); // optional
$doc->save(); // embeddings updated automatically

// 3) Delete the document and its corresponding vectors
$doc = DocumentModel::findOne($id);
$doc->delete(); // deletes vectors in the vector DB automatically
```
- `beforeSave()` – Converts uploaded PDFs to text or reads uploaded TXT/DOCX files.
- `afterSave()` – Generates embeddings using `VectorizerHelper` and stores them in the vector database.
- `afterDelete()` – Removes the corresponding vectors from the vector database.
This makes your document storage fully RAG-ready, automatically connecting your database records with vector embeddings.
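For orientation, here is a minimal sketch of what such a model's hooks could look like. The attribute names, the `VectorizerHelper` call, and the `upsert()`/`delete()` vector DB methods are assumptions for illustration, not the package's confirmed API:

```php
use yii\db\ActiveRecord;
use yii\web\UploadedFile;

/**
 * Hypothetical sketch only; adapt names to the actual package API.
 */
class DocumentModel extends ActiveRecord
{
    /** @var UploadedFile|null Virtual attribute, not a DB column. */
    public $uploadedFile;

    public static function tableName(): string
    {
        return '{{%document}}'; // assumed table name
    }

    public function beforeSave($insert)
    {
        if ($this->uploadedFile instanceof UploadedFile) {
            // Extract plain text from the uploaded PDF/TXT/DOCX
            // (class location and method name are assumed).
            $this->content = \strtob\yii2Ollama\VectorizerHelper::extractText($this->uploadedFile);
        }
        return parent::beforeSave($insert);
    }

    public function afterSave($insert, $changedAttributes)
    {
        parent::afterSave($insert, $changedAttributes);

        // Embed the document text via the Ollama component
        // and store it in the configured vector DB.
        $result = \Yii::$app->ollama->embedText((string) $this->content);
        // upsert() is an assumed VectorDbInterface-style method.
        \Yii::$app->ollama->vectorDb->upsert($this->id, $result['embedding'], ['title' => $this->title]);
    }

    public function afterDelete()
    {
        parent::afterDelete();
        // delete() is an assumed VectorDbInterface-style method.
        \Yii::$app->ollama->vectorDb->delete($this->id);
    }
}
```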
Implement `VectorDbInterface` to use any vector database. Example using the Qdrant adapter:

```php
use strtob\yii2Ollama\VectorDbInterface;
use strtob\yii2Ollama\QdrantAdapter;

$qdrantAdapter = new QdrantAdapter($qdrantClient, 'my_collection');
```

`OllamaComponent` will automatically prepend the top-K context retrieved from the vector DB to the prompt.
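To plug in a different backend, you can write your own adapter. A minimal in-memory sketch follows, assuming the interface exposes roughly the methods below; check the actual `VectorDbInterface` declaration in the package for the real method names and signatures:

```php
use strtob\yii2Ollama\VectorDbInterface;

/**
 * Hypothetical adapter sketch; method names/signatures are assumptions.
 */
class InMemoryVectorDbAdapter implements VectorDbInterface
{
    /** @var array<int|string, array{vector: float[], payload: array}> */
    private array $items = [];

    /** Store or update a vector with its payload. */
    public function upsert(int|string $id, array $vector, array $payload = []): void
    {
        $this->items[$id] = ['vector' => $vector, 'payload' => $payload];
    }

    /** Remove a vector by id. */
    public function delete(int|string $id): void
    {
        unset($this->items[$id]);
    }

    /** Return the payloads of the $topK vectors most similar to the query (cosine similarity). */
    public function search(array $vector, int $topK): array
    {
        $scores = [];
        foreach ($this->items as $id => $item) {
            $scores[$id] = $this->cosine($vector, $item['vector']);
        }
        arsort($scores);

        $results = [];
        foreach (array_slice(array_keys($scores), 0, $topK) as $id) {
            $results[] = $this->items[$id]['payload'];
        }
        return $results;
    }

    private function cosine(array $a, array $b): float
    {
        $dot = 0.0;
        $na = 0.0;
        $nb = 0.0;
        foreach ($a as $i => $v) {
            $bv = $b[$i] ?? 0.0;
            $dot += $v * $bv;
            $na += $v * $v;
            $nb += $bv * $bv;
        }
        return ($na > 0 && $nb > 0) ? $dot / (sqrt($na) * sqrt($nb)) : 0.0;
    }
}
```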
Supported models: `llama2`, `mistral`, `gemma`.
MIT License – see LICENSE