Hi, when I run `python synthesize_fuse.py -S data/may -M output/may_talkingface --use_train --audio data/may/aud.npy`, I get this error: `slurmstepd: error: Detected 1 oom-kill event(s) in StepId=3618241.batch cgroup. Some of your processes may have been killed by the cgroup out-of-memory handler.`
So I am wondering how much memory is needed for it to work, and how large the batch size is. Thank you.
Hi, inference should take less memory than training, so if you can run training successfully, you can run the final inference on the same device. Fully loading the data into memory takes about 60 GB for the May sample. However, most of that data is not used by synthesize_fuse.py, so you can manually modify the dataloader to prune it.
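One possible way to avoid loading the full preprocessed dataset into RAM is to open the large `.npy` arrays memory-mapped, so pages are read from disk only when the inference loop actually touches them. This is a minimal sketch of that idea; the file name, array shape, and slice size below are illustrative stand-ins, not the repo's actual dataloader code.

```python
import os
import tempfile

import numpy as np

# Stand-in for a large preprocessed array such as data/may/aud.npy
# (shape and dtype are hypothetical).
path = os.path.join(tempfile.mkdtemp(), "aud.npy")
np.save(path, np.zeros((1000, 44), dtype=np.float32))

# mmap_mode='r' keeps the array on disk; resident memory stays small
# until individual slices are accessed.
aud = np.load(path, mmap_mode="r")

# Copy only the slice the current frame needs into RAM.
window = np.array(aud[0:8])
print(window.shape)  # (8, 44)
```

A change along these lines in the dataloader would let synthesize_fuse.py run without materializing the full 60 GB in memory, at the cost of extra disk reads during inference.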