High RAM consumption for fah dataset preparation #373

Open
xinaxu opened this issue Oct 7, 2023 · 0 comments
Labels: bug, triage

Comments


xinaxu commented Oct 7, 2023

Ran a prep with input S3 bucket fah-public-data-covid19-antibodies and output to a local directory, started via `singularity run dataset-worker --concurrency 8`.
After restarting following a crash, memory grew much faster while working through the already-scanned portion, climbing steadily to around 16 GB; this took about 2 h the first time and 1–1.5 h the second time. After that it stayed around that level, drifting up and down by roughly ±2 GB.
Note: I am only monitoring global memory usage to track this, so I don't have an exact graph for singularity itself, but it is the only process on the machine doing significant work.
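
For a more precise per-process view, something like the following could sample the worker's RSS over time (a minimal sketch; it assumes the process command name is `singularity` and a standard procps `ps`):

```sh
# Sample the dataset-worker's resident set size (RSS) every 30 s.
# -C selects by command name; procps ps reports rss in KiB.
while true; do
  ps -C singularity -o etime=,rss= \
    | awk '{printf "%s  %.2f GiB\n", $1, $2 / 1048576}'
  sleep 30
done
```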

Version: singularity v0.5.0-427d34e
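
Since singularity appears to be a Go binary, a heap profile would show where the ~16 GB is being held. A minimal sketch, assuming a debug build that imports net/http/pprof and serves it on localhost:6060 (an assumption; the stock v0.5.0 binary may not expose this endpoint):

```sh
# Hypothetical setup: requires the worker to be rebuilt with
#   import _ "net/http/pprof"
# and an HTTP server listening on :6060. Then, while memory is high:
go tool pprof -top http://localhost:6060/debug/pprof/heap
```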
