
question about mask on feature map #24

Open
tnkong opened this issue Sep 18, 2019 · 1 comment

tnkong commented Sep 18, 2019

Hello Prof. Zhang,
Is masking the feature map really necessary? I have seen models that do not do this, so I would like to ask for your understanding and view on it.
Looking at the code, when the attention weights are computed, a mask is applied at the spatial positions corresponding to the image padding, so those positions are artificially suppressed in the final attention weights. My guess is that you intended a kind of local attention mechanism, computing the attention weights locally rather than globally. I still have some doubts about what this mask contributes. Have you run a comparison, one model with the mask and one without, and how does the performance differ?
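
For readers, a minimal sketch of the kind of masking being described, in PyTorch. This is not the repository's actual code; the tensor shapes and names are assumptions. Attention scores at padded positions are driven to -inf before the softmax, so they receive essentially zero weight:

```python
import torch

def masked_attention(energy, mask):
    """Compute attention weights while suppressing padded positions.

    energy: (B, L) unnormalized attention scores over the L = H*W
            spatial positions of the feature map
    mask:   (B, L) tensor with 1 at valid positions, 0 at padded ones
    """
    # Force padded positions to -inf so softmax assigns them ~zero weight
    energy = energy.masked_fill(mask == 0, float("-inf"))
    alpha = torch.softmax(energy, dim=-1)  # each row sums to 1 over valid positions
    return alpha

# Example: the last three positions of the first row are padding
energy = torch.randn(2, 6)
mask = torch.tensor([[1, 1, 1, 0, 0, 0],
                     [1, 1, 1, 1, 1, 1]])
alpha = masked_attention(energy, mask)  # padded columns get ~zero weight
```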

@lai-agent-m

The mask is there to remove the effect of the padding: ideally the model should behave the same with and without padding. Not every work does this; I have seen papers that simply center the image and then ignore the padding entirely. Zhang does not normalize the height, though some works do: "Handwritten Mathematical Expression Recognition via Paired Adversarial Learning" largely ignores the padding and does normalize the height.

Honestly, I think Zhang's batching code is a bit rough. Optimizing it might gain a few points, but that kind of improvement is not much use for getting a paper out.
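
To make the batching point concrete, here is a minimal sketch of how a padded batch and its pixel mask could be built. This is a hypothetical helper, not the repository's code; grayscale input and top-left alignment are assumptions:

```python
import torch

def pad_batch(images):
    """Pad variable-size images into one batch and build the pixel mask.

    images: list of (1, H_i, W_i) float tensors (grayscale is assumed)
    returns: batch (B, 1, H_max, W_max), mask (B, H_max, W_max)
    """
    h_max = max(img.shape[1] for img in images)
    w_max = max(img.shape[2] for img in images)
    batch = torch.zeros(len(images), 1, h_max, w_max)
    mask = torch.zeros(len(images), h_max, w_max)
    for i, img in enumerate(images):
        _, h, w = img.shape
        batch[i, :, :h, :w] = img  # top-left aligned; the rest stays zero (padding)
        mask[i, :h, :w] = 1.0      # mark the real pixels
    return batch, mask
```

In practice this pixel-level mask would still have to be downsampled to the feature-map resolution (e.g., pooled with the same strides as the CNN) and flattened before it can be applied to the (B, H*W) attention scores sketched above.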
