
Cache answers #28

Open
ModischFabrications opened this issue Apr 11, 2020 · 3 comments
Labels
enhancement New feature or request

Comments

@ModischFabrications
Owner

People trying this service out are likely to request the same result multiple times.
Caching these values would save significant execution time.

This is somewhat related to #24

@ModischFabrications
Owner Author

Measure RAM usage and performance for a comparison. Caching will make some responses faster, but it adds overhead to every request and increases RAM usage.
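A minimal sketch of such a comparison, assuming the backend work can be stood in for by a short sleep (`slow_lookup` and the timings are illustrative, not the project's real code):

```python
import time
from functools import lru_cache

def slow_lookup(n: int) -> int:
    """Stand-in for the real backend work; the 10 ms delay is an assumption."""
    time.sleep(0.01)
    return n * 2

# lru_cache keeps results in RAM per process -- exactly the trade-off
# discussed above: faster repeat responses for higher memory usage.
cached_lookup = lru_cache(maxsize=128)(slow_lookup)

def timed(fn, arg):
    """Return the result and wall-clock duration of one call."""
    start = time.perf_counter()
    result = fn(arg)
    return result, time.perf_counter() - start

result, cold = timed(cached_lookup, 7)  # first call misses the cache
_, warm = timed(cached_lookup, 7)       # second call hits the cache
```

Pairing this with a memory profiler (e.g. `tracemalloc`) would give the RAM side of the comparison.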

@ModischFabrications
Owner Author

It would also be nice to have a "/cached" path to debug successful caching and to offer "offline" solutions.

Remember to add a "cached" field to the response object.

@ModischFabrications
Owner Author

Problem: Uvicorn spawns multiple worker processes in parallel, which makes it difficult to keep a shared cache. Redis and other external solutions might be nice, but they come with significant overhead. It might be possible to cache requests in a middleware before passing them to the backend.

@ModischFabrications ModischFabrications added the enhancement New feature or request label Apr 15, 2024
Development

No branches or pull requests

1 participant