When running experiments.evaluate with ROME on gpt2-xl, I get an out-of-memory (OOM) error after 4 cases on an 11 GB RTX 2080 Ti. Since the cases run sequentially, is this expected? If so, are there any trivial ways to extend evaluation to multiple GPUs?
Thanks and great work!
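In case it helps others hitting the same OOM: one common cause of memory creep across sequential cases is PyTorch's caching allocator holding fragmented blocks. A minimal cleanup sketch that could be called between cases (the function name is hypothetical; assumes PyTorch, which the ROME codebase already uses):

```python
import gc
import torch  # assumed available: the ROME repo depends on PyTorch

def release_cuda_cache():
    """Run Python GC, then return cached CUDA blocks to the driver.

    Call between evaluation cases, after `del`-ing per-case tensors,
    so cached allocations do not accumulate across cases. Safe to
    call on CPU-only machines as well.
    """
    gc.collect()
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```

For example, after each case one might do `del edited_model, case_metrics` (names hypothetical) followed by `release_cuda_cache()`. This does not reduce the peak memory of a single case, so if one case alone exceeds 11 GB, multi-GPU sharding (e.g. Hugging Face Accelerate's `device_map="auto"`) would still be needed.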
yashjakhotiya changed the title from "CUDA out of memory error on 11 GB GPUs: Any easy ways to use multiple GPUs?" to "CUDA out of memory error on an 11 GB GPU: Any easy ways to use multiple GPUs?" on Sep 21, 2022.