
Is there a bug in reduce_sum and softmax? #19

Open
zhangyafeikimi opened this issue Apr 7, 2019 · 2 comments


@zhangyafeikimi

            self.attention_relu = tf.reduce_sum(tf.multiply(self.weights['attention_p'], tf.nn.relu(self.attention_mul + \
                self.weights['attention_b'])), 2, keep_dims=True) # None * (M'*(M'-1)) * 1
            self.attention_out = tf.nn.softmax(self.attention_relu)

I think keep_dims should be False.

If keep_dims is True, the shape of attention_relu is (batch, m*(m-1)/2, 1).
The softmax that follows then normalizes over the last axis of attention_relu, which is wrong.
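Here is a minimal sketch (not from this repo) that illustrates the shape problem, assuming a TF 1.x version where tf.nn.softmax accepts an axis argument (older releases used dim); the axis=1 call is only one possible way to normalize over the interaction dimension:

```python
import numpy as np
import tensorflow as tf  # TF 1.x API, matching the repo

# Toy logits with the shape produced by keep_dims=True:
# (batch, num_interactions, 1)
logits = tf.constant(np.random.randn(2, 6, 1), dtype=tf.float32)

# tf.nn.softmax defaults to the last axis; its size is 1,
# so every attention weight collapses to 1.0.
broken = tf.nn.softmax(logits)

# Normalizing over the interaction axis instead gives a real distribution.
fixed = tf.nn.softmax(logits, axis=1)

with tf.Session() as sess:
    b, f = sess.run([broken, fixed])
    print(b[0, :, 0])  # all ones -- no attention at all
    print(f[0, :, 0])  # sums to 1 across the 6 interactions
```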

@zhangyafeikimi zhangyafeikimi changed the title from "Is there a bug in reduce_sum and the softmax after it?" to "Is there a bug in reduce_sum and softmax?" on Apr 7, 2019
@zhangyafeikimi
Author

Simply changing it to False is not enough either; the computation that follows also has to be adjusted accordingly. One possible adjustment is sketched below.
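For reference, here is a hypothetical, self-contained sketch of what that downstream adjustment might look like. The tensor names and shapes (attention_mul, pairwise, k, d) are assumptions made for the illustration, not the repo's actual variables:

```python
import tensorflow as tf  # TF 1.x API

batch, k, d = None, 6, 8  # k = number of pairwise interactions, d = embedding size

# Stand-ins for the repo's tensors (shapes assumed, not taken from the code).
attention_mul = tf.placeholder(tf.float32, [batch, k, d])
pairwise = tf.placeholder(tf.float32, [batch, k, d])
attention_p = tf.Variable(tf.ones([d]))
attention_b = tf.Variable(tf.zeros([d]))

# keep_dims=False -> shape (batch, k)
attention_relu = tf.reduce_sum(
    attention_p * tf.nn.relu(attention_mul + attention_b), 2, keep_dims=False)

# softmax over the last axis now normalizes across the k interactions.
attention_out = tf.nn.softmax(attention_relu)        # (batch, k)

# The trailing dim has to be restored before weighting the interactions.
attention_out = tf.expand_dims(attention_out, -1)    # (batch, k, 1)
weighted_sum = tf.reduce_sum(attention_out * pairwise, 1)  # (batch, d)
```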

@zhangyafeikimi
Author

Is this repo dead?
