
[Other] About prompt caching #1699

Open
Ericjson001 opened this issue Dec 20, 2024 · 0 comments

Comments

@Ericjson001

When Chatbox calls the Claude or OpenAI API, token consumption grows multiplicatively across multi-turn conversations. For example, if the previous turn consumed 20,000 tokens in total, the next turn (whose actual content is only about 2,000 input + output tokens) jumps to 40,000+ tokens in total. I looked into the official documentation, and this can be addressed with prompt caching. I hope this serves as a useful reference and that the behavior can be improved. Thank you for building such a great app!
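For reference, here is a minimal sketch of what the Anthropic side of this could look like with the official TypeScript SDK (`@anthropic-ai/sdk`): you mark the long, stable prefix (system prompt and/or earlier turns) with `cache_control` so it can be reused across requests instead of being re-billed at the full input rate every turn. The model name and prompt text below are placeholders, and at the time of this issue the feature was still in beta, so older SDK versions may require the `anthropic-beta: prompt-caching-2024-07-31` header.

```typescript
import Anthropic from "@anthropic-ai/sdk";

// Reads ANTHROPIC_API_KEY from the environment.
const client = new Anthropic();

async function main() {
  const response = await client.messages.create({
    model: "claude-3-5-sonnet-20241022", // placeholder model name
    max_tokens: 1024,
    system: [
      {
        type: "text",
        // The cacheable prefix must meet the model's minimum cacheable
        // length (1024 tokens on most Claude models) to actually be cached.
        text: "...long system prompt or accumulated conversation prefix...",
        cache_control: { type: "ephemeral" },
      },
    ],
    messages: [{ role: "user", content: "next user turn" }],
  });

  // Cache effectiveness is reported in the usage block:
  // cache_creation_input_tokens (first request, writes the cache) and
  // cache_read_input_tokens (subsequent requests, billed at a discount).
  console.log(response.usage);
}

main();
```

The OpenAI side should need no request changes at all: as of late 2024, OpenAI applies prompt caching automatically to prompts of 1024+ tokens and discounts the cached portion, so on that provider the savings come for free as long as the conversation prefix is sent unchanged.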
