
bar_mu computation is different from the paper in Eq (5) #55

Open
superaha opened this issue Nov 18, 2022 · 2 comments

Comments

@superaha

Hi there,

It seems the bar_mu computation is different from the paper: it should be multiplied by -1. Below Eq. (5), the paper defines bar_mu = (gt - mu_pred) / sigma.

As shown here:

https://github.com/Jeff-sjtu/res-loglikelihood-regression/blob/203dc3195ee5a11ed6f47c066ffdb83247511359/rlepose/models/regression_nf.py#L134

This does not affect the computation of log_Q, which basically uses the absolute value of this term. But what about the flow model? I'm not sure whether this leads to any difference in the learning of the RealNVP flow model, or did I miss something here?
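For reference, here is a minimal sketch of the two sign conventions, assuming (as described above) that the repo line computes (pred_jts - gt_uvd) / sigma. The tensor names pred_jts, gt_uvd, and sigma are illustrative stand-ins, and the residual log_Q term is simplified from the repo's version:

```python
import math
import torch

# Illustrative shapes (batch, num_joints, 2); names are hypothetical stand-ins
# for the tensors in regression_nf.py.
pred_jts = torch.randn(4, 17, 2)       # predicted means mu
gt_uvd = torch.randn(4, 17, 2)         # ground-truth coordinates
sigma = torch.rand(4, 17, 2) + 1e-9    # predicted scales

# Sign convention used in the repo (regression_nf.py#L134, as the issue describes):
bar_mu_code = (pred_jts - gt_uvd) / sigma
# Sign convention written below Eq. (5) in the paper:
bar_mu_paper = (gt_uvd - pred_jts) / sigma   # = -bar_mu_code

# A simplified Laplace-style residual term log_Q depends only on |bar_mu|,
# so it is identical under either convention:
log_q_code = torch.log(sigma) + torch.abs(bar_mu_code) / math.sqrt(2)
log_q_paper = torch.log(sigma) + torch.abs(bar_mu_paper) / math.sqrt(2)
assert torch.allclose(log_q_code, log_q_paper)

# The flow term log_phi = flow.log_prob(bar_mu) does see the sign, so under the
# repo's convention the RealNVP model would be fitting a mirrored density.
```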

Thanks.

@Jackqu

Jackqu commented Dec 2, 2022

I have the same question. Have you found the answer?

@superaha
Author

superaha commented Dec 2, 2022

Not really. I also posted the same question in the mmpose repo:
open-mmlab/mmpose#1825 (comment)

The authors pointed to the pull request from when the RLE loss was integrated. However, that pull request discusses a different bug concerning sigma, not bar_mu. The sigma bug is fixed in mmpose; this repo keeps the old version.

For bar_mu, I ran two experiments using mmpose and got similar results. It seems the sign of bar_mu has no impact on the final result, presumably because the flow can simply learn the mirrored density.
