
Bus error in parallelformers 1.2.7 for OPT model #37

Open
sindhuvahinis opened this issue Aug 24, 2022 · 1 comment
Labels: bug (Something isn't working)

Comments

@sindhuvahinis

How to reproduce

from parallelformers import parallelize
from transformers import AutoModelForCausalLM, AutoTokenizer

if __name__ == '__main__':
    model_name = 'facebook/opt-30b'

    # Load the OPT-30B model and tokenizer on CPU first
    model = AutoModelForCausalLM.from_pretrained(model_name)
    tokenizer = AutoTokenizer.from_pretrained(model_name)

    # Shard the model across 8 GPUs in fp16; this is the call that crashes
    parallelize(model, num_gpus=8, fp16=True)

The following error is thrown by the parallelize method:

Bus error (core dumped)

We tried parallelformers 1.2.6 with transformers 4.21.11 and this error was not thrown; the error only occurs with parallelformers 1.2.7 and transformers 4.21.11.
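
For anyone reproducing this, it may help to confirm which versions are actually installed before calling parallelize. A minimal check (using importlib.metadata, which is not part of the original report) could look like:

from importlib.metadata import version

# The crash is reported with parallelformers 1.2.7; 1.2.6 is reported to work.
print('transformers    :', version('transformers'))
print('parallelformers :', version('parallelformers'))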

Environment

  • OS: Ubuntu
  • Python version: 3.8.13
  • Transformers version: 4.21.11
  • Parallelformers version: 1.2.6
  • Whether to use Docker: yes
  • Misc.:
sindhuvahinis added the bug label on Aug 24, 2022
@agabaldon

Same problem under:

parallelformers 1.2.7
transformers 4.24.0
python 3.8.16
pytorch 2.0.0
pytorch-cuda 11.7

The model works on 2 GPUs without parallelformers; the problem appears when trying to use more than 2 GPUs with parallelformers.
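
The 2-GPU baseline without parallelformers is not shown in the comment; a plausible equivalent (an assumption here, using transformers' device_map='auto', which requires accelerate to be installed) would be:

import torch
from transformers import AutoModelForCausalLM

# Assumed baseline: let transformers/accelerate shard the weights across
# the visible GPUs instead of using parallelformers.
model = AutoModelForCausalLM.from_pretrained(
    'facebook/opt-30b',
    device_map='auto',
    torch_dtype=torch.float16,
)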
