
improve retain_grad to have same functionality as in torch #240

Open · 2 tasks
aanurraj opened this issue Jun 12, 2021 · 0 comments
Labels: enhancement (New feature or request)
Milestone: Training

aanurraj (Contributor) commented Jun 12, 2021

Description

Currently, there is an issue when we perform an operation on two MPC tensors, one with requires_grad=True and the other with requires_grad=False, and then run backward to compute the gradient.

The result of the operation on the two above-mentioned tensors ends up with a populated grad, which is not the case when we perform the same operation in torch: there, the non-leaf result has grad == None unless retain_grad() is called on it first.
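For reference, this is roughly how the same scenario behaves in plain torch. This is a minimal sketch: the tensor values and the element-wise multiply are illustrative, and the MPC-side reproduction would use the library's MPCTensor instead of torch.tensor.

```python
import torch

# One leaf tensor requires grad, the other does not.
x = torch.tensor([1.0, 2.0], requires_grad=True)
y = torch.tensor([3.0, 4.0], requires_grad=False)

# The result z is a non-leaf tensor; no retain_grad() is called on it.
z = x * y
z.sum().backward()

print(x.grad)  # tensor([3., 4.]) -- gradient of sum(x * y) w.r.t. x is y
print(y.grad)  # None -- y does not require grad
print(z.grad)  # None (may emit a UserWarning) -- z is non-leaf and grad was not retained
```

The expected fix is that the MPC result tensor matches this behavior: its grad stays unpopulated after backward unless retain_grad() was explicitly requested.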

Are you interested in working on this improvement yourself?

  • Yes, I am.

Additional Context

  • Make the result tensor's grad behavior match torch, as described above (see the reference snippet).
  • Complete the TODO to add a test for this at test_backward_with_one_requires_grad in mpc_tensor_tests.py.
@aanurraj aanurraj added the enhancement New feature or request label Jun 12, 2021
@aanurraj aanurraj added this to the Training milestone Jun 12, 2021
@aanurraj aanurraj changed the title improve retain_grad to have same functionality as tin torch improve retain_grad to have same functionality as in torch Jun 16, 2021