The Feature

Currently, `claude` and `gemini` models support prompt caching, but other LMs don't. It would be great to have a `supports_prompt_caching` property (like `supports_vision`, etc.) that tells us which models support prompt caching.
Motivation, pitch
In OpenHands we would like to turn prompt caching on by default, but we can't easily do so without this feature: if we attempt to use prompt caching with a model that doesn't support it, an error is thrown.
We tried to work around this by checking for `claude` in the model name, but that doesn't work because Claude on GCP apparently doesn't support prompt caching yet.
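A capability lookup, analogous to the existing `supports_vision` helper, would let callers gate caching safely instead of matching on model names. A minimal sketch, assuming a hand-maintained registry; the model names, support values, and the `caching` flag below are illustrative assumptions, not litellm's real API:

```python
# Hedged sketch of the requested supports_prompt_caching check.
# The registry entries are illustrative assumptions, not an authoritative list.
PROMPT_CACHING_SUPPORT = {
    "anthropic/claude-3-5-sonnet-20240620": True,   # assumed: Anthropic API supports it
    "gemini/gemini-1.5-pro": True,                  # assumed: Gemini API supports it
    "vertex_ai/claude-3-5-sonnet@20240620": False,  # assumed: Claude on GCP does not yet
}


def supports_prompt_caching(model: str) -> bool:
    """Return True only when the model is known to support prompt caching."""
    return PROMPT_CACHING_SUPPORT.get(model, False)


def completion_kwargs(model: str, messages: list) -> dict:
    """Build request kwargs, enabling caching by capability rather than name."""
    kwargs = {"model": model, "messages": messages}
    if supports_prompt_caching(model):
        # Only opt in when the model actually supports caching, so
        # unsupported models never receive caching parameters and never raise.
        kwargs["caching"] = True  # placeholder flag, not a real litellm kwarg
    return kwargs
```

With this shape, turning caching on by default is just a lookup: unknown models fall back to no caching instead of erroring.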
Twitter / LinkedIn details
No response