
Something about "LossCosine" function #2

Open
Sunliangtai opened this issue Jan 18, 2021 · 1 comment
@Sunliangtai

It seems that there is something wrong with your function "LossCosine", located at /Layers/TAL_pytorch.py. When I train with cosine regularization, the loss is less than zero, which seems impossible. I noticed that in "LossCosine", before returning the mean of the loss, you multiply it by minus one, unlike in the "LossDis" function.
Also, in your paper, the mathematical formulation of the cosine regularization is multiplied by minus one as well, which makes the loss value less than zero. Am I missing something?

@yingrliu
Owner

Cosine similarity ranges from -1 to 1, and a larger value means the two vectors point in closer directions. Multiplying it by -1 turns maximizing the cosine similarity into minimizing the loss, so a negative loss value is expected.
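To illustrate why a negative value is expected here, a minimal pure-Python sketch (note: `loss_cosine` below is a hypothetical stand-in for the repository's `LossCosine`, not its actual code):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two vectors: -1 (opposite) to 1 (aligned)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def loss_cosine(a, b):
    """Negate the similarity so that minimizing the loss
    maximizes the cosine similarity between the vectors."""
    return -cosine_similarity(a, b)

# Nearly aligned vectors: similarity is close to 1,
# so the loss is close to -1 (negative, as observed in training).
print(loss_cosine([1.0, 2.0, 3.0], [1.1, 2.1, 2.9]))
```

A perfectly aligned pair gives a loss of exactly -1, and orthogonal vectors give 0; the loss being bounded below by -1 is harmless for gradient descent, since only its gradient matters.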
