
Question about the loss #16

Open
taylover-pei opened this issue Dec 23, 2018 · 1 comment

Comments

@taylover-pei

You have done great work! But I have some questions about the loss function.
I cannot find where the adversarial loss is computed in the code.
You apply nn.LogSoftmax() in adapt.py and then use nn.CrossEntropyLoss() in main.py to train the discriminator. Since nn.CrossEntropyLoss() already combines nn.LogSoftmax() and nn.NLLLoss() in a single class, isn't the log-softmax applied twice here?
My other question: is this combination of nn.LogSoftmax() and nn.CrossEntropyLoss() equivalent to the adversarial loss?
Thank you very much!
Looking forward to your reply!
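
To make the question concrete, here is a minimal PyTorch sketch of the two formulations I mean (the tensors and shapes are made up for illustration, not taken from the repo):

```python
import torch
import torch.nn as nn

# Hypothetical discriminator outputs: raw scores (logits) for 4 samples, 2 domains.
logits = torch.randn(4, 2)
labels = torch.tensor([0, 1, 1, 0])  # domain labels: 0 = source, 1 = target

# Option A: pass raw logits to nn.CrossEntropyLoss, which applies
# log-softmax internally, so no LogSoftmax layer is needed in the model.
loss_a = nn.CrossEntropyLoss()(logits, labels)

# Option B: apply nn.LogSoftmax explicitly, then use nn.NLLLoss.
log_probs = nn.LogSoftmax(dim=1)(logits)
loss_b = nn.NLLLoss()(log_probs, labels)

print(torch.allclose(loss_a, loss_b))  # True: the two setups give the same loss
```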

@taylover-pei
Author

Also, what about nn.BCELoss? For example, something like the sketch below of a single-logit discriminator (again, just an illustration, not the repo's code):
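
```python
import torch
import torch.nn as nn

# Hypothetical single-logit discriminator output: one raw score per sample.
logit = torch.randn(4, 1)
domain = torch.tensor([[1.], [0.], [0.], [1.]])  # 1 = source, 0 = target

# nn.BCEWithLogitsLoss fuses the sigmoid and the binary cross-entropy,
# so it is the usual (more numerically stable) replacement for
# nn.Sigmoid() followed by nn.BCELoss().
loss = nn.BCEWithLogitsLoss()(logit, domain)
```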
