Transformers and deepspeed dependency problem #35
Comments
Hi, thanks for your interest in our work. We have listed the recommended versions of …
Hi, do you have any further questions?
Hi, thank you for your previous guidance.
To address this, we tried upgrading to the latest versions of transformers and deepspeed and using transformers.integrations.deepspeed. However, this requires significant changes to the official codebase, including files such as trainer.py and run_pretrain.py. Is there a way to resolve this issue while keeping the official code intact? Thank you!
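For context, recent transformers releases moved the DeepSpeed integration from transformers.deepspeed to transformers.integrations.deepspeed. A minimal sketch of a compatibility shim, assuming the helpers named below are the ones the training code actually needs (they are available under both paths), could confine the version difference to a single import site:

```python
# Sketch of a compatibility shim: resolve the DeepSpeed helpers from whichever
# module path the installed transformers version provides, so the rest of the
# code keeps a single import site.
try:
    # newer transformers releases expose the integration here
    from transformers.integrations.deepspeed import (
        is_deepspeed_zero3_enabled,
        deepspeed_init,
    )
except ImportError:
    # older releases expose the same helpers here
    from transformers.deepspeed import (
        is_deepspeed_zero3_enabled,
        deepspeed_init,
    )
```

Placing such a shim in one small module and importing from it in trainer.py and run_pretrain.py would limit the changes to the official code to the import lines.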
First of all, thank you for your work.
We would like to utilize this excellent model in our research. Specifically, we plan to use run_pretrain.py and run_pretrain.bash to pretrain the model and then extract sequence embeddings from the protein encoder.
However, we are currently unable to proceed due to version issues with transformers and deepspeed.
We followed a solution suggested in an existing issue, which advised against using the deepspeed.py provided by OntoProtein and recommended using one's own installed deepspeed instead.
However, version-related problems with transformers persist; for example, issues arise with the transformers.deepspeed module and with get_scheduler from src.optimization.
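For illustration, a hedged fallback sketch (our assumption, not part of the official code) would prefer the repository's vendored scheduler factory and otherwise use the one exported by recent transformers releases:

```python
# Illustrative fallback (an assumption, not the official solution): use the
# repo's vendored get_scheduler when it imports cleanly, otherwise fall back
# to the factory that ships with transformers.
try:
    from src.optimization import get_scheduler  # OntoProtein's vendored helper
except ImportError:
    from transformers import get_scheduler  # exported by recent transformers
```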
Could you provide the environment file used during model development or share the specific versions of transformers and deepspeed that were used?
Thank you.
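For reference, a quick way to record the versions installed locally (so they can be compared against whatever environment the authors report) is:

```python
# Print the locally installed versions relevant to this issue, for comparison
# with the environment used during model development.
import torch
import transformers
import deepspeed

print("torch:", torch.__version__)
print("transformers:", transformers.__version__)
print("deepspeed:", deepspeed.__version__)
```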