Description
Currently, there is an issue when we perform an operation on two MPC tensors, one with requires_grad=True and the other with requires_grad=False, and then run backward to compute the gradients: the result of the operation ends up with a grad, which is not the case when we perform a similar operation on plain torch tensors.
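For reference, this is the behavior plain torch exhibits in this situation (a minimal sketch using only standard PyTorch; the MPC tensor equivalents should behave the same way):

```python
import torch

a = torch.tensor([1.0, 2.0], requires_grad=True)
b = torch.tensor([3.0, 4.0])   # requires_grad defaults to False
c = a * b                      # c is a non-leaf result tensor
c.sum().backward()

print(a.grad)  # tensor([3., 4.]) -- the requires_grad=True leaf gets a grad
print(b.grad)  # None -- no gradient for the requires_grad=False operand
print(c.grad)  # None -- a non-leaf result keeps no grad unless c.retain_grad() was called
```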
Are you interested in working on this improvement yourself?
Yes, I am.
Additional Context
Complete the TODO to add a test for this case, test_backward_with_one_requires_grad, in mpc_tensor_tests.py.
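A minimal sketch of what that test could assert, written against plain torch tensors as the reference semantics (an MPC version would construct the operands as MPCTensors instead; the exact SyMPC session/constructor setup is left out here since it is not specified in the issue):

```python
import torch

def test_backward_with_one_requires_grad():
    # Reference semantics the MPC tensors should reproduce.
    a = torch.tensor([1.0, 2.0], requires_grad=True)
    b = torch.tensor([3.0, 4.0])  # requires_grad=False
    c = a * b
    c.sum().backward()

    assert a.grad is not None  # gradient reaches the requires_grad=True leaf
    assert b.grad is None      # no gradient for the requires_grad=False operand
    assert c.grad is None      # the non-leaf result must not report a grad
```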
On Jun 16, 2021, aanurraj changed the title from "improve retain_grad to have same functionality as tin torch" to "improve retain_grad to have same functionality as in torch".