
Implement caching on Queries #8

Open
CesarD opened this issue Nov 7, 2022 · 0 comments
CesarD commented Nov 7, 2022

Detailed Description

Implement output caching for selected endpoints and provide mechanisms for invalidating that cache.
The middleware should use an in-memory cache as L1 and a Redis cache as L2, so that caching can be distributed across multiple instances of a service.
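A minimal sketch of what the endpoint-level part could look like, assuming the built-in ASP.NET Core Output Caching middleware (.NET 7+) is used; the policy name, tag, endpoints and the `ProductsService`/`Product` types are hypothetical placeholders, not part of the template:

```csharp
// Program.cs — illustrative sketch; endpoint and type names are hypothetical.
using Microsoft.AspNetCore.OutputCaching;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOutputCache(options =>
{
    // Named policy applied only to selected endpoints, tagged so it can be invalidated.
    options.AddPolicy("Products", policy => policy
        .Expire(TimeSpan.FromMinutes(5))
        .Tag("products"));
});

var app = builder.Build();
app.UseOutputCache();

// Only this endpoint opts into output caching.
app.MapGet("/products", (ProductsService svc) => svc.GetAllAsync())
   .CacheOutput("Products");

// Invalidation mechanism: evict every cached response carrying the "products" tag
// whenever the underlying data changes.
app.MapPost("/products", async (Product product, ProductsService svc, IOutputCacheStore cache, CancellationToken ct) =>
{
    await svc.CreateAsync(product);
    await cache.EvictByTagAsync("products", ct);
    return Results.Created($"/products/{product.Id}", product);
});

app.Run();
```

Out of the box this middleware only caches in memory, which is why the two-level (L1/L2) store described below is needed once multiple service instances are involved.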

Context

To improve performance and reliability, it is often recommended to cache the content that APIs serve, reducing roundtrips to the database and even allowing responses to be served while the backing service is down.
Also, since a distributed cache alone can sometimes be too slow to return cached content quickly enough, it is good to have two levels of cache: an in-memory cache as L1 and, for example, a Redis implementation as L2. This gives faster access to the data while keeping both levels in sync as needed, so the same data doesn't have to be read from the DB over and over.

Possible Implementation

Leverage FusionCache to implement the two cache levels, with MemoryCache as L1 and Redis as L2, and use its backplane to keep both levels in sync across service instances.
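A minimal sketch of that registration, assuming the ZiggyCreatures.Caching.Fusion packages (core, System.Text.Json serializer and StackExchange.Redis backplane) plus Microsoft.Extensions.Caching.StackExchangeRedis are referenced; the "Redis" connection string name is a placeholder:

```csharp
// Program.cs — cache registration sketch; connection string name is illustrative.
using Microsoft.Extensions.Caching.StackExchangeRedis;
using ZiggyCreatures.Caching.Fusion;
using ZiggyCreatures.Caching.Fusion.Backplane.StackExchangeRedis;
using ZiggyCreatures.Caching.Fusion.Serialization.SystemTextJson;

var builder = WebApplication.CreateBuilder(args);
var redisConnection = builder.Configuration.GetConnectionString("Redis");

builder.Services
    .AddFusionCache()
    // L1: in-process MemoryCache, used automatically by FusionCache.
    .WithDefaultEntryOptions(new FusionCacheEntryOptions
    {
        Duration = TimeSpan.FromMinutes(5)
    })
    // A serializer is required to store entries in the distributed (L2) cache.
    .WithSerializer(new FusionCacheSystemTextJsonSerializer())
    // L2: Redis distributed cache shared by all service instances.
    .WithDistributedCache(new RedisCache(new RedisCacheOptions
    {
        Configuration = redisConnection
    }))
    // Backplane: publishes change notifications so each instance's L1 stays in sync.
    .WithBackplane(new RedisBackplane(new RedisBackplaneOptions
    {
        Configuration = redisConnection
    }));
```

Query handlers could then read through the cache with `IFusionCache.GetOrSetAsync(...)` and invalidate entries with `RemoveAsync(...)`, which the backplane propagates so the other instances drop their stale L1 copies as well.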
