
The paper and the code do not match #2

Open
LyazS opened this issue Mar 12, 2020 · 3 comments

Comments

LyazS commented Mar 12, 2020

The OIM loss in the paper uses a variable alpha, and the IR loss in the paper uses beta, but in the code:

alpha = np.array([float(os.getenv('alpha', None))])
beta = np.array([float(os.getenv('beta', None))])

and I can't find anywhere that sets these environment variables. Maybe they are not the same thing as in the paper?

Also, the IR loss in the paper is L = 1/P * w * y * (1 + z) * log(x), but the code in ./OIM/caffe-oim/src/caffe/layers/weighted_softmax_loss_layer.cu computes it in a different way.
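For reference, this is my rough understanding of what such a loss could look like in PyTorch (just a sketch with my own variable names, assuming x is the softmax probability of the labelled class, y selects the positive proposals, w is a per-proposal weight, z flags surrounding proposals, and P is the number of positives):

import torch
import torch.nn.functional as F

def ir_loss_sketch(logits, labels, w, z):
    # log x: log-softmax over classes, evaluated at each proposal's label
    log_x = F.log_softmax(logits, dim=1)
    log_x_y = log_x.gather(1, labels.unsqueeze(1)).squeeze(1)
    pos = labels > 0                        # y: selects the labelled (positive) proposals
    P = pos.sum().clamp(min=1).float()      # P: number of positive proposals
    # w * y * (1 + z) * log x, summed over positives and divided by P;
    # sign flipped so that it is a loss to minimise
    return -(w[pos] * (1.0 + z[pos]) * log_x_y[pos]).sum() / P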

Could you please explain this, or release the loss in PyTorch?

Yinyf0804 commented Apr 10, 2020

Maybe alpha and beta are read from train_oim.sh, and both have the same meaning as beta in the paper, while ratio in the code corresponds to alpha in the paper.
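If so, something like the following is presumably what happens on the Python side (the values here are placeholders, not taken from the paper or the repo):

import os
import numpy as np

# train_oim.sh would presumably `export alpha=...` and `export beta=...`
# before launching training; the defaults below are placeholders only.
os.environ.setdefault('alpha', '1.0')
os.environ.setdefault('beta', '0.2')

alpha = np.array([float(os.getenv('alpha'))])
beta = np.array([float(os.getenv('beta'))])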

I also have a question about ./OIM/caffe-oim/src/caffe/layers/weighted_softmax_loss_layer.cu. Why does the code use (-beta) at line 97 but (1.0 + alpha) at line 104? Besides, I don't understand why the condition at line 94 suppresses background proposals.

I would be grateful if you could give me some advice.

chenhaolin1989 (Collaborator) commented Jun 24, 2020

Yes, the parameter names in the code differ slightly from those in the paper:
ratio (in the code) = alpha (in the paper); alpha and beta (in the code) = beta (in the paper), one for the surrounding proposals and one for the center proposal.
Thanks for pointing out the -beta problem. It should be beta.
We suppress the center proposal by scaling the gradients of the surrounding proposals up to (1 + beta) times their original values, while the gradient of the center proposal is scaled to beta.
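Roughly, in gradient terms that corresponds to something like the following (a sketch of the scaling only, not the actual CUDA kernel in weighted_softmax_loss_layer.cu):

import numpy as np

def scaled_grad(prob, label, is_center, beta):
    # standard softmax cross-entropy gradient: p - y
    y = np.zeros_like(prob)
    y[label] = 1.0
    grad = prob - y
    # surrounding proposals: gradient scaled up to (1 + beta) of its original value
    # center proposal: gradient scaled down to beta of its original value
    scale = beta if is_center else 1.0 + beta
    return scale * grad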

LyazS (Author) commented Jun 24, 2020

> Yes, the parameter names in the code differ slightly from those in the paper.

Could you please point out the differences clearly?
