Add Redis semantic cache feature #758

Closed
lordofthejars opened this issue Jul 18, 2024 · 4 comments
Labels
duplicate (This issue or pull request already exists), enhancement (New feature or request)

Comments

@lordofthejars
Contributor

One interesting feature to have out of the box for AI workloads is a semantic cache of requests.

It could be applied to any method, but according to a recent study, 31% of queries sent to LLMs can be cached (in other words, 31% of the queries are contextually repeatable), which can significantly improve response time in GenAI apps.

I created a simple example that implements this with Redis: https://github.com/lordofthejars-ai/quarkus-langchain-examples/tree/main/semantic-cache

Do you think it might be interesting to integrate this into the Quarkus Cache system, for example as redis-semantic-cache or something similar?

@geoand
Collaborator

geoand commented Jul 18, 2024

I think @iocanel and @andreadimaio were thinking of something similar

@andreadimaio
Collaborator

andreadimaio commented Jul 18, 2024

Yes, what we have in #659 is a concept of a semantic cache.
The idea is to have something very similar to ChatMemory, so you can extend the default implementation (in-memory) with other products such as Redis.
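To make the "ChatMemory-like, pluggable with an in-memory default" idea concrete, here is a minimal sketch in plain Java. All names here (`SemanticCache`, `InMemorySemanticCache`, the threshold parameter) are hypothetical, not actual APIs from quarkus-langchain4j or #659; the point is only the shape: a small SPI, an in-memory default doing embedding similarity, and room for a Redis-backed implementation behind the same interface.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

// Hypothetical SPI: a Redis-backed implementation could plug in here,
// just as ChatMemory implementations do.
interface SemanticCache {
    Optional<String> lookup(float[] queryEmbedding);
    void put(float[] embedding, String response);
}

// Illustrative default: brute-force cosine similarity over in-memory entries.
class InMemorySemanticCache implements SemanticCache {
    private record Entry(float[] embedding, String response) {}
    private final List<Entry> entries = new ArrayList<>();
    private final double threshold; // similarity above this counts as a hit

    InMemorySemanticCache(double threshold) {
        this.threshold = threshold;
    }

    @Override
    public Optional<String> lookup(float[] q) {
        Entry best = null;
        double bestSim = -1.0;
        for (Entry e : entries) {
            double sim = cosine(q, e.embedding());
            if (sim > bestSim) {
                bestSim = sim;
                best = e;
            }
        }
        // Cache hit only when the closest stored query is similar enough.
        return (best != null && bestSim >= threshold)
                ? Optional.of(best.response())
                : Optional.empty();
    }

    @Override
    public void put(float[] embedding, String response) {
        entries.add(new Entry(embedding, response));
    }

    private static double cosine(float[] a, float[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }
}

public class SemanticCacheDemo {
    public static void main(String[] args) {
        SemanticCache cache = new InMemorySemanticCache(0.95);
        cache.put(new float[] {1f, 0f}, "cached answer");
        // Nearly identical embedding: semantic hit.
        System.out.println(cache.lookup(new float[] {0.99f, 0.01f}).isPresent());
        // Orthogonal embedding: miss.
        System.out.println(cache.lookup(new float[] {0f, 1f}).isPresent());
    }
}
```

In a real extension the embeddings would come from an `EmbeddingModel` and the lookup would be a Redis vector search rather than a linear scan, but the SPI boundary stays the same.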

@lordofthejars
Contributor Author

Great, feel free to take a look at my example; you'll see that the code is not complex, although it does require some configuration parameters. Calculating keys is easy, since Quarkus Cache already offers an interface to override key creation. The tricky part is the code that decides whether a request is a cache miss or not.

@geoand
Collaborator

geoand commented Jul 18, 2024

Thanks for the input.

Closing as duplicate in light of the conversation above.

@geoand closed this as not planned (duplicate) on Jul 18, 2024
@geoand added the duplicate and enhancement labels on Jul 18, 2024