A powerful, pluggable vector memory and Model Context Protocol (MCP) server for local semantic search and long-term memory.
- MCP Integration: Fully compatible with the Model Context Protocol.
- Session-Scoped & Universal Memory: Scoped tools isolate memory per sessionId; universal tools provide shared, session-independent storage.
- Pluggable Architecture: Easily swap embedding providers and vector stores.
- Multiple Storage Backends: Supports Memory, Filesystem, and ChromaDB stores out of the box.
- Semantic Search: Use state-of-the-art embeddings for intelligent memory retrieval.
- DTS Indexing: Optimized search using Distance to Samples (DTS) logic.
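Conceptually, session scoping means every memory is keyed by its sessionId, so a search in one session can never see another session's records. A toy sketch of the idea in TypeScript (word-overlap scoring stands in for real embeddings; this is an illustration, not the package's implementation):

```typescript
// Toy session-scoped memory: word-overlap scoring stands in for real
// embedding-based semantic search (illustration only).
const sessions = new Map<string, { text: string; words: Set<string> }[]>();

function tokenize(text: string): Set<string> {
  return new Set(text.toLowerCase().split(/\W+/).filter(Boolean));
}

function add(sessionId: string, text: string): void {
  const list = sessions.get(sessionId) ?? [];
  list.push({ text, words: tokenize(text) });
  sessions.set(sessionId, list);
}

function search(sessionId: string, query: string, limit = 5): string[] {
  const q = tokenize(query);
  // Only this session's records are ever scored.
  return (sessions.get(sessionId) ?? [])
    .map((m) => ({
      m,
      score: Array.from(q).filter((w) => m.words.has(w)).length,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map(({ m }) => m.text);
}
```

The real stores replace the word-overlap score with distances between embedding vectors, but the isolation guarantee works the same way.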
You can run the consciousness MCP server directly without installation using npx:

```bash
npx @one710/consciousness
```

By default, this will start an MCP server named "consciousness" using a FilesystemVectorStore (persisted to `./memory_store.json`) and HFEmbeddingProvider.
Install the package:

```bash
npm install @one710/consciousness
```

Then create a server and connect it to a transport:

```typescript
import { createServer } from "@one710/consciousness";
import { MemoryVectorStore } from "@one710/consciousness/vector/memory";
import { HFEmbeddingProvider } from "@one710/consciousness/embeddings/huggingface";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const provider = new HFEmbeddingProvider();
const store = new MemoryVectorStore(provider);
const server = createServer("my-server", "1.0.0", store);

// Connect to a transport (e.g., stdio)
const transport = new StdioServerTransport();
await server.connect(transport);
```

HFEmbeddingProvider uses @huggingface/transformers to generate embeddings locally on your CPU/GPU.
```typescript
import { HFEmbeddingProvider } from "@one710/consciousness/embeddings/huggingface";

const provider = new HFEmbeddingProvider();
```

AISDKEmbeddingProvider uses the Vercel AI SDK to connect to any supported provider (e.g., OpenAI, Anthropic, Google).
```typescript
import { AISDKEmbeddingProvider } from "@one710/consciousness/embeddings/aisdk";
import { openai } from "@ai-sdk/openai";

const provider = new AISDKEmbeddingProvider(
  openai.embedding("text-embedding-3-small"),
  1536, // Dimensions
);
```

MemoryVectorStore keeps memories in process memory (nothing is persisted):

```typescript
import { MemoryVectorStore } from "@one710/consciousness/vector/memory";

const store = new MemoryVectorStore(provider);
```

FilesystemVectorStore persists memories to a JSON file:

```typescript
import { FilesystemVectorStore } from "@one710/consciousness/vector/filesystem";

const store = new FilesystemVectorStore(provider, "./memory-data.json");
```

ChromaVectorStore stores memories in a ChromaDB collection:

```typescript
import { ChromaVectorStore } from "@one710/consciousness/vector/chroma";
import { ChromaClient } from "chromadb";

const client = new ChromaClient();
const store = new ChromaVectorStore(provider, client, "my-collection");
```

All store operations require a sessionId to isolate memories:
```typescript
const sessionId = "user-123";

// Store a memory
await store.add(sessionId, "The capital of France is Paris");

// Search within the session
const results = await store.search(sessionId, "France", {
  method: "cosine",
  limit: 5,
});

// Forget a specific memory
await store.forget(sessionId, results[0].item.id);

// Clear all memories for the session
await store.clear(sessionId);
```

The MCP server exposes two sets of tools:
Session-scoped tools:

| Tool | Description |
|---|---|
| `add_to_scoped_memory` | Store content scoped to a session |
| `search_scoped_memory` | Semantic search within a session (cosine, euclidean, dts) |
| `forget_scoped_memory` | Remove a specific memory by ID within a session |
| `clear_scoped_memory` | Clear all memories for a session |
Universal tools:

| Tool | Description |
|---|---|
| `add_to_universal_memory` | Store content in shared, session-independent memory |
| `search_universal_memory` | Semantic search across universal memory (cosine, euclidean, dts) |
| `forget_universal_memory` | Remove a specific memory by ID from universal memory |
| `clear_universal_memory` | Clear all universal memories |
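The `cosine` and `euclidean` methods accepted by the search tools are the standard vector measures; a reference sketch of what each computes (`dts` refers to the package's own Distance to Samples indexing and is not reproduced here):

```typescript
// Cosine similarity: 1 means identical direction, 0 means orthogonal.
// With cosine, higher scores rank first.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Euclidean distance: 0 means identical vectors.
// With euclidean, lower distances rank first.
function euclidean(a: number[], b: number[]): number {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += (a[i] - b[i]) ** 2;
  return Math.sqrt(sum);
}
```

Cosine is the usual default for text embeddings because it ignores vector magnitude and compares direction only.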
This project is licensed under the MIT License.