
disk caching very large items fails #313

Open
jllanfranchi opened this issue Mar 29, 2017 · 1 comment
Labels
bug, enhancement, Outdated (Ancient issues that can be discarded)

Comments

@jllanfranchi
Contributor

This is probably a limitation of sqlite3; we may need to use the filesystem instead. The workaround for now is to wrap the write in try/except so it fails gracefully and simply moves on as if the item weren't in the cache. However, a failed write can apparently corrupt the cache: items that were stored successfully are no longer retrievable, the next write to the cache fails regardless of size, and even checking whether a key is present fails.
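The graceful-failure workaround could look roughly like this: catch the sqlite3 error on write and report a cache miss instead of crashing. This is a minimal sketch, not PISA's actual cache API; the function and table names are hypothetical.

```python
import pickle
import sqlite3


def cache_set(db_path, key, obj):
    """Try to store `obj` in a sqlite3-backed cache; fail gracefully.

    Returns True on success, False if the write failed (e.g. because
    the pickled blob exceeds sqlite3's maximum blob size). Hypothetical
    names, illustrating the try/except workaround only.
    """
    blob = pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL)
    try:
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, val BLOB)"
            )
            conn.execute(
                "INSERT OR REPLACE INTO cache (key, val) VALUES (?, ?)",
                (key, sqlite3.Binary(blob)),
            )
        return True
    except (sqlite3.Error, OverflowError):
        # Treat the failed write as a cache miss and move on; the caller
        # simply recomputes the item as if it were never cached.
        return False
```

Note this only papers over the symptom; as described above, a failed oversized write may still leave the underlying db corrupted.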

A possible solution would be to move to simple file storage, keeping a sqlite3 db only for file metadata, and to implement inter-process locking. Once a lock has been obtained, the file can be written directly to disk, presumably in the cache dir or in a subdir alongside the db.
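A sketch of that scheme might look like the following: payloads go to plain files, the sqlite3 db holds only metadata, and a crude exclusive lock file serializes writers across processes. All names and the lock-file approach are assumptions for illustration, not an actual PISA design.

```python
import os
import pickle
import sqlite3


def file_cache_set(cache_dir, key, obj):
    """Store the payload as a plain file; record only metadata in sqlite3.

    A lock file created with O_CREAT | O_EXCL acts as a crude
    inter-process lock: creation fails if another process holds it.
    Hypothetical names, illustrating the proposed layout only.
    """
    os.makedirs(cache_dir, exist_ok=True)
    lock_path = os.path.join(cache_dir, "cache.lock")
    fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    try:
        # Payload is written directly to disk, so sqlite3's blob-size
        # limit no longer applies.
        data_path = os.path.join(cache_dir, key + ".pkl")
        with open(data_path, "wb") as f:
            pickle.dump(obj, f, protocol=pickle.HIGHEST_PROTOCOL)
        # The db in the same dir stores only lightweight metadata.
        with sqlite3.connect(os.path.join(cache_dir, "metadata.db")) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS meta"
                " (key TEXT PRIMARY KEY, path TEXT, size INTEGER)"
            )
            conn.execute(
                "INSERT OR REPLACE INTO meta VALUES (?, ?, ?)",
                (key, data_path, os.path.getsize(data_path)),
            )
    finally:
        os.close(fd)
        os.remove(lock_path)
```

A production version would want a retry/timeout loop on the lock and stale-lock cleanup, but the split between bulk data on disk and metadata in sqlite3 is the point here.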

For now, this is not a blocking bug, since all but unusually large items work.

@jllanfranchi
Contributor Author

@LeanderFischer LeanderFischer added the Outdated Ancient issues that can be discarded label Apr 24, 2024
@LeanderFischer LeanderFischer added this to the PISA 4.2 milestone Apr 24, 2024
@LeanderFischer LeanderFischer removed this from the PISA 4.2 milestone Sep 10, 2024
Projects
None yet
Development

No branches or pull requests

2 participants