Implement CachedVolume #5

Open
constantinpape opened this issue Aug 8, 2019 · 2 comments
@constantinpape (Owner)
Which cache do we use?
LRU cache?
Need general caching logic.
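
A minimal sketch of what the LRU variant could look like, assuming the wrapped volume exposes a read_chunk(chunk_id) method; the class and method names here are hypothetical, not existing APIs of this repository:

```python
from collections import OrderedDict

class CachedVolume:
    """Wrap a chunked volume with a simple LRU chunk cache (sketch only)."""

    def __init__(self, volume, max_chunks=128):
        self._volume = volume
        self._max_chunks = max_chunks
        self._cache = OrderedDict()  # chunk_id -> chunk data, in LRU order

    def read_chunk(self, chunk_id):
        # Serve from the cache and mark the chunk as most recently used.
        if chunk_id in self._cache:
            self._cache.move_to_end(chunk_id)
            return self._cache[chunk_id]
        # Cache miss: load from the underlying volume and insert.
        chunk = self._volume.read_chunk(chunk_id)
        self._cache[chunk_id] = chunk
        # Evict the least recently used chunk once over budget.
        if len(self._cache) > self._max_chunks:
            self._cache.popitem(last=False)
        return chunk
```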

constantinpape added this to the Finish Wrapper milestone on Aug 8, 2019
@constantinpape (Owner, Author)

@j6k4m8 (Contributor) commented Nov 12, 2021

Yeah, agreed! I think it makes sense to stay flexible when adding another cache layer; it's possible that LRUCache makes sense for some workflows but not others. I'm imagining a "proactive cache" that caches the z+1 chunk when you request the zth chunk, or something...

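A rough sketch of that prefetching idea, layered on top of the LRU cache sketched above; ProactiveCache and its read_chunk method are assumed names for illustration, not part of the codebase:

```python
import threading

class ProactiveCache:
    """When chunk z is requested, warm chunk z + 1 in the background
    so a sequential sweep along z rarely waits on I/O (sketch only)."""

    def __init__(self, cache):
        self._cache = cache

    def _prefetch(self, z):
        try:
            self._cache.read_chunk(z)
        except (IndexError, KeyError):
            # Prefetching past the end of the volume is simply ignored.
            pass

    def read_chunk(self, z):
        chunk = self._cache.read_chunk(z)
        # Warm the next slice without blocking the caller.
        threading.Thread(target=self._prefetch, args=(z + 1,), daemon=True).start()
        return chunk
```

Whether the prefetch happens in a thread, a process pool, or synchronously after the read would depend on how the underlying volume performs I/O; the thread here is just the simplest non-blocking option.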