
The loss fluctuates after the third epoch and no longer decreases #40

Open
quxu91 opened this issue Jun 8, 2022 · 1 comment

Comments

@quxu91

quxu91 commented Jun 8, 2022

Thanks for your wonderful work!
I tried to train on DanceTrack to reproduce the results from the paper, but when training reaches the third epoch, the loss starts fluctuating and no longer decreases. Is it necessary to continue training, or should I decrease the learning rate now?
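One common remedy when the loss plateaus is to decay the learning rate once it has stopped improving for a few epochs; PyTorch provides this as `torch.optim.lr_scheduler.ReduceLROnPlateau`. Below is a minimal pure-Python sketch of the same idea, not the repository's code — all class and parameter names (`PlateauDecay`, `factor`, `patience`, `min_delta`, the starting `lr`) are illustrative assumptions:

```python
class PlateauDecay:
    """Sketch of plateau-based LR decay (same idea as ReduceLROnPlateau)."""

    def __init__(self, lr=2e-4, factor=0.1, patience=3, min_delta=1e-3):
        self.lr = lr
        self.factor = factor        # multiply lr by this when a plateau is hit
        self.patience = patience    # epochs without improvement to tolerate
        self.min_delta = min_delta  # loss must drop by at least this to count
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, epoch_loss):
        """Call once per epoch with the mean training loss; returns current lr."""
        if epoch_loss < self.best - self.min_delta:
            self.best = epoch_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor
                self.bad_epochs = 0
        return self.lr


# Example: a loss that fluctuates without improving (as described above)
# eventually triggers a 10x decay once the plateau outlasts the patience window.
sched = PlateauDecay(lr=2e-4, patience=3)
for loss in [9.0, 7.5, 6.0, 6.8, 5.4, 6.9, 5.6, 6.2, 6.0]:
    lr = sched.step(loss)
```

Whether decaying earlier than the schedule in the repo's config actually helps here depends on the training recipe; this only illustrates the mechanism being asked about.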

@quxu91 quxu91 changed the title The value of 'loss' is fluctuating when train to the 3 epochs, And it no longer decrease The value of 'loss' is fluctuating when train to the third epoch, And it no longer decrease Jun 8, 2022
@Soulmate7

Soulmate7 commented Nov 11, 2022

Hi, I ran into the same problem you mentioned: after several epochs my loss fluctuates between roughly 5 and 7 and no longer decreases. By the way, I trained on a single 3090 with 24 GB; could that make a difference?

Could you tell me whether decreasing the lr helped, or share your solution if you solved the problem? Thanks!

2 participants