-
I need to call the backward function on two different models separately, once for Loss_1 and once for Loss_2. On the second backward call I get this error:

Check failed: type_ != nullptr: The any container is empty

Does anyone know how to fix this? My MXNet version is 1.6.0. My issue is quite similar to discussion #9712, except that I need to treat the two losses separately.
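A minimal sketch of the failing pattern, with hypothetical models, shapes, and data (none of these names are from the original post):

```python
import mxnet as mx
from mxnet import autograd, gluon

# Two small placeholder models.
net1 = gluon.nn.Dense(1)
net2 = gluon.nn.Dense(1)
net1.initialize()
net2.initialize()

x = mx.nd.random.uniform(shape=(4, 8))
y = mx.nd.random.uniform(shape=(4, 1))
l2 = gluon.loss.L2Loss()

with autograd.record():
    loss_1 = l2(net1(x), y)
    loss_2 = l2(net2(x), y)

loss_1.backward()
loss_2.backward()  # the reported failure happens on this second call
```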
-
Hi @Arhamna. What's likely happening is that some of the variables/arrays are shared between the two models, while each NDArray supports only a single entry for autograd tracing. As a result, the gradient tape recorded for the first backward is overwritten by the second. Sorry that the error message is unclear; it would be really helpful if you could file a bug report for it. The workaround I'd recommend is adding the two losses together and doing a single backward pass. Since differentiation is linear, this gives the same gradients as two separate backward steps.
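As a concrete illustration of that workaround (reusing the placeholder setup from the sketch above; this is an assumed setup, not code from the thread):

```python
import mxnet as mx
from mxnet import autograd, gluon

net1, net2 = gluon.nn.Dense(1), gluon.nn.Dense(1)
net1.initialize()
net2.initialize()
x = mx.nd.random.uniform(shape=(4, 8))
y = mx.nd.random.uniform(shape=(4, 1))
l2 = gluon.loss.L2Loss()

with autograd.record():
    loss_1 = l2(net1(x), y)
    loss_2 = l2(net2(x), y)
    # Differentiation is linear: d(loss_1 + loss_2)/dw equals
    # d(loss_1)/dw + d(loss_2)/dw, so summing loses nothing.
    total_loss = loss_1 + loss_2

# A single backward pass fills in gradients for both models' parameters.
total_loss.backward()
```

If the two losses need different weights, a weighted sum (e.g. total_loss = a * loss_1 + b * loss_2) preserves the same linearity argument.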