
Support Prompt Caching with Vertex #653

Open
willriley opened this issue Sep 7, 2024 · 3 comments
Comments

@willriley

To my knowledge, prompt caching isn't supported when using Claude on Vertex, either via the messages API or the SDKs. Is there any ETA on when that will be added?
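For illustration, here is a rough sketch of the kind of call being asked about. It simply mirrors the prompt-caching syntax documented for the direct Anthropic API (a `cache_control` block plus the `prompt-caching-2024-07-31` beta header) applied to the SDK's `AnthropicVertex` client; the project/region values are placeholders, and whether Vertex accepts these fields is exactly what this issue is asking.

```python
from anthropic import AnthropicVertex

# Placeholder project and region, for illustration only.
client = AnthropicVertex(project_id="my-gcp-project", region="us-east5")

message = client.messages.create(
    model="claude-3-5-sonnet@20240620",
    max_tokens=1024,
    # Beta header used for prompt caching on the direct Anthropic API;
    # unclear whether the Vertex endpoint honors it.
    extra_headers={"anthropic-beta": "prompt-caching-2024-07-31"},
    system=[
        {
            "type": "text",
            "text": "<large shared context we would like cached>",
            # Caching directive as documented for the direct API.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Question about the shared context"}],
)
print(message.usage)
```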

@jennmueng

We use Claude through Vertex AI too, so we're keeping an eye on this.

@ggdupont

ggdupont commented Oct 9, 2024

That's a bummer for us, and we might need to switch models if this isn't supported soon.

@codenprogressive

+1
Any updates on enabling context caching for Claude when accessed through Vertex AI?
