About Channel-Wise Attention (CWA) in the code #5
Comments
dot([0,1,0].t(), [1,2,3].t()) = [0,2,0].t()

Got it. Thanks for the response.

I had a small question about the code, here

Yes. Please see Section III.A, The Discriminality Component, for details.
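The equivalence being pointed out can be checked with a small NumPy sketch: multiplying by a diagonal matrix built from a mask vector gives the same result as an elementwise product with that vector. The values below are illustrative, not taken from the repository's code.

```python
import numpy as np

mask = np.array([0.0, 1.0, 0.0])   # per-channel attention weights
x = np.array([1.0, 2.0, 3.0])      # channel features

# diag(mask) @ x places the mask on the principal diagonal,
# so each channel of x is scaled by its mask entry:
via_diag = np.diag(mask) @ x       # -> [0., 2., 0.]

# The same scaling via an elementwise product, with no
# diagonal matrix ever materialized:
via_elementwise = mask * x         # -> [0., 2., 0.]

assert np.allclose(via_diag, via_elementwise)
```

This is why an implementation can apply the mask elementwise instead of constructing `diag(Mask_i)` explicitly; the two formulations compute the same quantity.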
Great work! But I have a small question about CWA. In the original paper, I see

M_i = diag(Mask_i)

, where diag puts a vector on the principal diagonal of a diagonal matrix. But in the code below, I think bar is not a diagonal matrix. Please point out my problem if I misunderstood the operation here. Thanks a lot.