Synchronize gradients in manual optimization with DDPStrategy(static_graph=True) #21251
base: master
Conversation
Synchronize gradients in manual optimization with DDPStrategy(static_graph=True). Ensure gradients are reduced correctly when using manual optimization and DDP with static_graph enabled.
@SkafteNicki @Borda @lantiga kindly review this pull request, thanks!
The failures seem unrelated to this PR. Could you please review it, @Borda @SkafteNicki? Thanks!
@Sohaib-Ahmed21 yes, CI is down at the moment, so there is nothing to do right now
@Borda @SkafteNicki the tests are passing now. Could you please review the PR? Thanks!
What does this PR do?
Fixes gradient synchronization when using manual optimization with `DDPStrategy(static_graph=True)`. Calls `reducer._delay_all_reduce()` in `post_backward` to ensure gradients are properly reduced in the first iteration.
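For context, a minimal sketch of the configuration this fix targets: a LightningModule using manual optimization, trained with `DDPStrategy(static_graph=True)`. The model, data, and trainer settings below are illustrative placeholders, not code from this PR.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

from lightning.pytorch import LightningModule, Trainer
from lightning.pytorch.strategies import DDPStrategy


class ManualOptModel(LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # manual optimization
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        loss = self.layer(batch[0]).sum()
        opt.zero_grad()
        # Gradients produced here must be all-reduced across ranks, including
        # the first iteration, where static_graph=True delays the reduction.
        self.manual_backward(loss)
        opt.step()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


if __name__ == "__main__":
    data = DataLoader(TensorDataset(torch.randn(64, 32)), batch_size=8)
    trainer = Trainer(
        accelerator="cpu",
        devices=2,
        strategy=DDPStrategy(static_graph=True),
        max_epochs=1,
        logger=False,
        enable_checkpointing=False,
    )
    trainer.fit(ManualOptModel(), data)
```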
Adds a test (`test_ddp_gradients_synced`) that checks gradient synchronization across:
- `automatic_optimization=True` / `False`
- `static_graph=True` / `False`
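A sketch of the kind of check such a test can make, assuming a multi-rank run: after the backward pass, all-gather each parameter gradient and assert that every rank holds identical values. The helper name and structure are illustrative, not the PR's exact test code.

```python
import torch
import torch.distributed as dist


def assert_grads_synced(module: torch.nn.Module) -> None:
    """All-gather every parameter gradient and check that all ranks hold identical values."""
    world_size = dist.get_world_size()
    for name, param in module.named_parameters():
        if param.grad is None:
            continue
        # Each rank contributes its local .grad; after DDP's all-reduce these
        # should already be identical everywhere.
        gathered = [torch.empty_like(param.grad) for _ in range(world_size)]
        dist.all_gather(gathered, param.grad)
        for other in gathered[1:]:
            assert torch.equal(gathered[0], other), f"gradient for {name!r} differs across ranks"
```

In a Lightning test, a helper like this could be invoked from a hook such as `on_train_batch_end` while `Trainer.fit` runs on two devices, parametrized over the `automatic_optimization` and `static_graph` combinations listed above.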
Fixes #18086
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--21251.org.readthedocs.build/en/21251/