Questions about WAP #19
Comments
Sorry for the late reply.
Hi, thanks for the response. I tried training another WAP implementation (https://github.com/menglin0320/wap-on-tensor) on the new CROHME 2019 dataset (https://www.cs.rit.edu/~crohme2019/task.html), but couldn't get the model to generalize to the new out-of-distribution validation set. Did you also experience problems with your model generalizing to the 2013 test set? If so, how were you able to solve that problem? Thanks
Recently, we also took part in the CROHME 2019 competition. Our model works well on the CROHME 2013, 2014, and 2016 test sets, so I didn't run into the problem you describe.
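(For context: CROHME systems are usually compared by expression recognition rate, i.e. the fraction of test expressions whose predicted LaTeX string exactly matches the ground truth. A minimal sketch of that check is below; the label-file names and the tab-separated format are assumptions for illustration, not part of either repository.)

```python
# Minimal ExpRate sketch: exact-match accuracy over predicted LaTeX token
# sequences. Assumes each line of a label file is "<image_id>\t<tokens...>";
# file names and format are hypothetical, adapt them to your own data layout.

def load_labels(path):
    labels = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.strip().split("\t")
            if len(parts) == 2:
                image_id, latex = parts
                labels[image_id] = latex.split()  # token list
    return labels


def exp_rate(pred_path, gt_path):
    preds = load_labels(pred_path)
    gts = load_labels(gt_path)
    correct = sum(1 for k, tokens in gts.items() if preds.get(k) == tokens)
    return correct / len(gts)


if __name__ == "__main__":
    for year in ("2013", "2014", "2016"):
        rate = exp_rate(f"predictions_{year}.txt", f"crohme_{year}_gt.txt")
        print(f"CROHME {year} ExpRate: {rate:.4f}")
```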
@JianshuZhang |
Yes, the encoder depends on your task; it would be easy to change the encoder part.
For DenseNet on the CROHME dataset, fewer than 40k images will be OK.
… On August 28, 2019, at 10:57 AM, Zhang ***@***.***> wrote:
@JianshuZhang <https://github.com/JianshuZhang>
1. DenseNet is best for the CROHME dataset; if your training set is big enough, maybe ResNet is better.
How many pictures would satisfy "big enough"? Would 20k be OK?
Thank you for your reply. If I want to change the backbone to ResNet, the number of images should be more than 40k. Is my understanding right?
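(If it helps anyone reimplementing this: below is a minimal, hypothetical sketch of the backbone swap being discussed, written for a PyTorch-style reimplementation with a recent torchvision rather than the authors' original code. The only interface the attention decoder relies on is an encoder that returns a 2D feature map of shape (batch, channels, H', W'), so DenseNet and ResNet feature extractors are interchangeable at that point.)

```python
# Rough sketch of swapping the WAP encoder backbone, assuming a PyTorch-style
# reimplementation with a recent torchvision (not the reference Theano code).
# The decoder only needs a 2D feature map of shape (batch, channels, H', W').
import torch
import torch.nn as nn
import torchvision.models as models


class DenseNetEncoder(nn.Module):
    """DenseNet feature extractor; densenet121 outputs 1024 channels."""
    def __init__(self):
        super().__init__()
        backbone = models.densenet121(weights=None)
        # Replace the first conv to accept single-channel formula images.
        backbone.features.conv0 = nn.Conv2d(1, 64, kernel_size=7,
                                            stride=2, padding=3, bias=False)
        self.features = backbone.features

    def forward(self, x):                  # x: (B, 1, H, W)
        return self.features(x)            # (B, 1024, H', W')


class ResNetEncoder(nn.Module):
    """ResNet feature extractor; drop the pooling/fc head, keep the feature map."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7,
                                   stride=2, padding=3, bias=False)
        # Keep everything up to and including layer4 (no avgpool / fc).
        self.features = nn.Sequential(*list(backbone.children())[:-2])

    def forward(self, x):                  # x: (B, 1, H, W)
        return self.features(x)            # (B, 512, H', W')


if __name__ == "__main__":
    x = torch.randn(2, 1, 128, 384)        # dummy batch of formula images
    print(DenseNetEncoder()(x).shape)      # e.g. torch.Size([2, 1024, 4, 12])
    print(ResNetEncoder()(x).shape)        # e.g. torch.Size([2, 512, 4, 12])
```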
Hi,
I was reimplementing WAP and had some questions.
Thanks