
fix: cache keys server response #240

Merged · 23 commits merged into main · Dec 11, 2023
Conversation

@chris13524 (Member) commented Dec 8, 2023

Description

Keys Server requests take up 800ms-1.5s of message processing time according to Grafana metrics. This creates a bad UX, as users are waiting for a response.

Depends on #231
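The fix caches Keys Server responses so repeated lookups skip the 800ms-1.5s round trip. As a minimal sketch of that check-cache-then-fetch pattern (an illustration only, not the PR's actual implementation — the PR caches in Redis with a one-week `chrono::Duration` TTL, while this uses a hypothetical in-memory `TtlCache` built on the standard library):

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Hypothetical in-memory cache with per-entry expiry, illustrating the idea
// behind caching Keys Server responses. The real PR stores entries in Redis.
struct TtlCache {
    ttl: Duration,
    entries: HashMap<String, (Instant, String)>,
}

impl TtlCache {
    fn new(ttl: Duration) -> Self {
        Self { ttl, entries: HashMap::new() }
    }

    // Return the cached value only if it exists and has not expired.
    fn get(&self, key: &str) -> Option<&String> {
        self.entries.get(key).and_then(|(stored_at, value)| {
            if stored_at.elapsed() < self.ttl {
                Some(value)
            } else {
                None // expired; caller should re-fetch from the Keys Server
            }
        })
    }

    fn insert(&mut self, key: String, value: String) {
        self.entries.insert(key, (Instant::now(), value));
    }
}

fn main() {
    // ~1 week TTL, mirroring the chrono::Duration::weeks(1) used in the PR.
    let mut cache = TtlCache::new(Duration::from_secs(7 * 24 * 3600));
    cache.insert("did:key:example".to_string(), "cacao-response".to_string());
    assert_eq!(
        cache.get("did:key:example").map(String::as_str),
        Some("cacao-response")
    );
    assert_eq!(cache.get("missing"), None);
    println!("cache hit: {:?}", cache.get("did:key:example"));
}
```

On a cache miss (or an expired entry), the caller would fetch from the Keys Server and re-insert the response, so only the first lookup pays the network cost.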

How Has This Been Tested?

Existing tests

Due Diligence

  • Breaking change
  • Requires a documentation update
  • Requires an e2e/integration test update

@chris13524 chris13524 self-assigned this Dec 8, 2023
@geekbrother (Contributor) left a comment

Looks good to me.

if let Some(redis) = redis {
    let cacao = cacao.clone();
    let redis = redis.clone();
    let cache_ttl = chrono::Duration::weeks(1)
Contributor

Is it ok to have such a huge TTL for a cache?

Base automatically changed from feat/rate-limiting to main December 11, 2023 15:15
@chris13524 chris13524 merged commit 2f96031 into main Dec 11, 2023
16 checks passed
@chris13524 chris13524 deleted the fix/cache-keys-server-response branch December 11, 2023 18:15