Commit c352247

feat: add memory_chroma example for ChromaMemoryService
1 parent ed2accc commit c352247

4 files changed, +264 -0 lines changed

contributing/samples/memory_chroma/README.md (65 additions & 0 deletions)
@@ -0,0 +1,65 @@

# ChromaDB Memory Service Example

This example demonstrates using `ChromaMemoryService` for semantic memory search
with embeddings generated by Ollama.

## Prerequisites

1. **Ollama Server Running**

   ```bash
   ollama serve
   ```

2. **Embedding Model Pulled**

   ```bash
   ollama pull nomic-embed-text
   ```

3. **Dependencies Installed**

   ```bash
   pip install chromadb
   # Or with uv:
   uv pip install chromadb
   ```

## Running the Example

```bash
cd contributing/samples/memory_chroma
python main.py
```

## What This Demo Does

1. **Session 1**: Creates memories by having a conversation with the agent
   - User introduces themselves as "Jack"
   - User mentions they like badminton
   - User mentions what they ate recently

2. **Memory Storage**: The session is saved to ChromaDB with semantic embeddings (see the sketch after this list)
   - Data persists to the `./chroma_db` directory
   - Embeddings are generated using Ollama's `nomic-embed-text` model

3. **Session 2**: Queries the memories using semantic search
   - User asks about their hobbies (the agent should recall "badminton")
   - User asks about what they ate (the agent should recall "burger")
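
The core wiring behind these steps lives in `main.py`; a trimmed sketch of it looks like this (the full script also drives the two conversations above):

```python
import agent  # Defines root_agent with the memory tools attached.

from google.adk.memory import ChromaMemoryService
from google.adk.memory import OllamaEmbeddingProvider
from google.adk.runners import InMemoryRunner

# Embeddings are produced by a local Ollama server.
embedding_provider = OllamaEmbeddingProvider(model="nomic-embed-text")

# ChromaDB stores the embedded memories on disk under ./chroma_db.
memory_service = ChromaMemoryService(
    embedding_provider=embedding_provider,
    collection_name="demo_memory",
    persist_directory="./chroma_db",
)

# The runner makes the memory service available to the agent's memory tools.
runner = InMemoryRunner(
    app_name="my_app",
    agent=agent.root_agent,
    memory_service=memory_service,
)

# After Session 1 finishes, the whole session is embedded and written to ChromaDB:
#   await memory_service.add_session_to_memory(session_1)
```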

## Key Differences from InMemoryMemoryService

| Feature | InMemory | ChromaDB |
|---------|----------|----------|
| Search Type | Keyword matching | **Semantic similarity** |
| Persistence | No (lost on restart) | **Yes (disk)** |
| Synonyms | No | **Yes** |
| Performance | Fast | Fast (with HNSW index) |
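
Because matching is embedding-based, a recall query does not have to reuse the stored wording. A minimal sketch, assuming `ChromaMemoryService` implements the standard ADK `BaseMemoryService.search_memory(...)` interface (the memory tools in `agent.py` call this for you; the query text below is only illustrative):

```python
# Inside an async function, after the memories above have been stored.
# "racket sports" never appears in the stored conversation, but it embeds
# close to "I like badminton", so the memory can still be retrieved.
response = await memory_service.search_memory(
    app_name="my_app",
    user_id="user1",
    query="Which racket sports does the user enjoy?",
)
for memory in response.memories:
    print(memory.author, memory.content)
```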

## Customization

You can change the embedding model by modifying the `OllamaEmbeddingProvider`:

```python
embedding_provider = OllamaEmbeddingProvider(
    model="mxbai-embed-large",  # Higher quality but slower
    host="http://remote-server:11434",  # Remote Ollama server
)
```

contributing/samples/memory_chroma/__init__.py (15 additions & 0 deletions)
@@ -0,0 +1,15 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Sample package for ChromaMemoryService demonstration."""

contributing/samples/memory_chroma/agent.py (45 additions & 0 deletions)
@@ -0,0 +1,45 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Agent definition for ChromaMemoryService demo."""

from datetime import datetime

from google.adk import Agent
from google.adk.agents.callback_context import CallbackContext
from google.adk.tools.load_memory_tool import load_memory_tool
from google.adk.tools.preload_memory_tool import preload_memory_tool


def update_current_time(callback_context: CallbackContext):
  """Refreshes the `_time` state key referenced by the instruction template."""
  callback_context.state["_time"] = datetime.now().isoformat()


root_agent = Agent(
    model="gemini-2.0-flash-001",
    name="chroma_memory_agent",
    description="Agent with ChromaDB-backed semantic memory.",
    before_agent_callback=update_current_time,
    instruction="""\
You are an agent that helps users answer questions.
You have access to a semantic memory system that stores past conversations.
Use the memory tools to recall information from previous sessions.

Current time: {_time}
""",
    tools=[
        load_memory_tool,  # Explicit memory lookup the model can invoke as a tool.
        preload_memory_tool,  # Pulls relevant memories into the prompt automatically.
    ],
)

contributing/samples/memory_chroma/main.py (139 additions & 0 deletions)
@@ -0,0 +1,139 @@
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Demo script for ChromaMemoryService with OllamaEmbeddingProvider.

This example demonstrates using ChromaDB for semantic memory search
with embeddings generated by Ollama.

Prerequisites:
  1. Ollama server running: `ollama serve`
  2. Embedding model pulled: `ollama pull nomic-embed-text`
  3. Dependencies installed: `pip install chromadb`

Usage:
  python main.py
"""

import asyncio
from datetime import datetime
from datetime import timedelta
from typing import cast

import agent
from dotenv import load_dotenv
from google.adk.cli.utils import logs
from google.adk.memory import ChromaMemoryService
from google.adk.memory import OllamaEmbeddingProvider
from google.adk.runners import InMemoryRunner
from google.adk.sessions.session import Session
from google.genai import types

load_dotenv(override=True)
logs.log_to_tmp_folder()


async def main():
  app_name = "my_app"
  user_id_1 = "user1"

  # Initialize the ChromaMemoryService with Ollama embeddings
  embedding_provider = OllamaEmbeddingProvider(
      model="nomic-embed-text",  # Or another embedding model you have
  )
  memory_service = ChromaMemoryService(
      embedding_provider=embedding_provider,
      collection_name="demo_memory",
      persist_directory="./chroma_db",  # Persist to disk
  )

  runner = InMemoryRunner(
      app_name=app_name,
      agent=agent.root_agent,
      memory_service=memory_service,
  )

  async def run_prompt(session: Session, new_message: str) -> Session:
    content = types.Content(
        role="user", parts=[types.Part.from_text(text=new_message)]
    )
    print("** User says:", content.model_dump(exclude_none=True))
    async for event in runner.run_async(
        user_id=user_id_1,
        session_id=session.id,
        new_message=content,
    ):
      if not event.content or not event.content.parts:
        continue
      if event.content.parts[0].text:
        print(f"** {event.author}: {event.content.parts[0].text}")
      elif event.content.parts[0].function_call:
        print(
            f"** {event.author}: fc /"
            f" {event.content.parts[0].function_call.name} /"
            f" {event.content.parts[0].function_call.args}\n"
        )
      elif event.content.parts[0].function_response:
        print(
            f"** {event.author}: fr /"
            f" {event.content.parts[0].function_response.name} /"
            f" {event.content.parts[0].function_response.response}\n"
        )

    return cast(
        Session,
        await runner.session_service.get_session(
            app_name=app_name, user_id=user_id_1, session_id=session.id
        ),
    )

  # Session 1: Create memories
  session_1 = await runner.session_service.create_session(
      app_name=app_name, user_id=user_id_1
  )

  print(f"----Session to create memory: {session_1.id} ----------------------")
  session_1 = await run_prompt(session_1, "Hi")
  session_1 = await run_prompt(session_1, "My name is Jack")
  session_1 = await run_prompt(session_1, "I like badminton.")
  session_1 = await run_prompt(
      session_1,
      f"I ate a burger on {(datetime.now() - timedelta(days=1)).date()}.",
  )
  session_1 = await run_prompt(
      session_1,
      f"I ate a banana on {(datetime.now() - timedelta(days=2)).date()}.",
  )

  print("Saving session to ChromaDB memory service...")
  await memory_service.add_session_to_memory(session_1)
  print("Session saved! Data persisted to ./chroma_db")
  print("-------------------------------------------------------------------")

  # Session 2: Query memories using semantic search
  session_2 = await runner.session_service.create_session(
      app_name=app_name, user_id=user_id_1
  )
  print(f"----Session to use memory: {session_2.id} ----------------------")
  session_2 = await run_prompt(session_2, "Hi")
  session_2 = await run_prompt(session_2, "What do I like to do?")
  # Expected: The agent should recall "badminton" from semantic search
  session_2 = await run_prompt(session_2, "When did I say that?")
  session_2 = await run_prompt(session_2, "What did I eat yesterday?")
  # Expected: The agent should recall "burger" from semantic search
  print("-------------------------------------------------------------------")


if __name__ == "__main__":
  asyncio.run(main())
