
Tune parameters to get best results. #31

Open
Tekno-H opened this issue Nov 16, 2022 · 2 comments

Tekno-H (Contributor) commented Nov 16, 2022

Hello Denis,
I have been studying and adjusting this great repository to fit my needs in defect detection.
I have trained several models successfully (using the mobilenetv3_large backbone) and achieved good results. However, I have eliminated some parts of your code, including the snippets that calculate the "seg_threshold" parameter from the ground truth.
Therefore, I am choosing it by hand (through trial and error), and although the results are OK, I think they can be further improved.

My questions are:
1- How can I choose a value for "seg_threshold" reliably in my given case?
2- What parameters do you recommend fine-tuning when training a new model (knowing that I have a good and balanced dataset of the same product but with different colours)?
3- My last question is related to exporting the model to "onnx" format: do you have any comments on how to achieve that? Do you plan on adding that capability?
4- When should I stop training?

Thank you in advance, your work is truly inspiring.

gudovskiy (Owner) commented:

1. Could you take some part of your training data to calculate an optimal threshold, or do cross-validation?
2. I didn't try ONNX conversion. Did you look at https://github.com/openvinotoolkit/anomalib?
3. Overfitting can happen when training for a long time. Maybe it is related to [where is checkpoint? pretrained weights to reproduce your results? #1](https://github.com/gudovskiy/cflow-ad/issues/1), where you can select a subset of your data for validation to avoid overfitting.

Tekno-H (Contributor, Author) commented Nov 24, 2022

Thank you for your reply,

> 1. Could you take some part of your training data to calculate an optimal threshold, or do cross-validation?

This is exactly what I ended up doing; a rough sketch of the threshold selection is below.
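A minimal sketch of that selection, assuming you have anomaly-score maps and binary ground-truth masks for a held-out split (the function and variable names here are illustrative, not from the repo). It picks the threshold that maximizes pixel-level F1 over the precision-recall curve, which is similar in spirit to how the removed snippet derives "seg_threshold" from ground truth:

```python
# Illustrative sketch (not the repo's code): choose seg_threshold on a
# held-out split by maximizing pixel-level F1 over the PR curve.
import numpy as np
from sklearn.metrics import precision_recall_curve

def choose_seg_threshold(score_maps, gt_masks):
    # score_maps: list of HxW float anomaly maps for held-out images
    # gt_masks:   list of HxW binary ground-truth defect masks
    scores = np.concatenate([s.ravel() for s in score_maps])
    labels = np.concatenate([m.ravel() for m in gt_masks]).astype(np.uint8)
    precision, recall, thresholds = precision_recall_curve(labels, scores)
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    # thresholds is one element shorter than precision/recall/f1
    return thresholds[np.argmax(f1[:-1])]
```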

> 2. I didn't try ONNX conversion. Did you look at https://github.com/openvinotoolkit/anomalib?

I checked it and replied on the other thread; a minimal export sketch is below for reference.
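A hedged sketch using the standard `torch.onnx.export` call for a single PyTorch module. This is not the repo's code: CFLOW-AD runs an encoder plus per-scale decoders with post-processing, so each module would likely need to be exported separately, and `model` and the input resolution here are placeholders:

```python
# Illustrative sketch: export one PyTorch module to ONNX.
# `model` and the input shape are placeholders, not from the repo.
import torch

model.eval()
dummy = torch.randn(1, 3, 256, 256)  # match your training resolution
torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["input"], output_names=["output"],
    opset_version=13,
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```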

> 3. Overfitting can happen when training for a long time. Maybe it is related to [where is checkpoint? pretrained weights to reproduce your results? #1](https://github.com/gudovskiy/cflow-ad/issues/1), where you can select a subset of your data for validation to avoid overfitting.

Thanks, I will check it.
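On the "when should I stop training" part, a hedged sketch of early stopping against a held-out validation subset, as suggested above. `train_one_epoch`, `evaluate`, `model`, and the loaders are hypothetical stand-ins for your own training step and validation metric (e.g. pixel-level AUROC):

```python
# Illustrative early-stopping loop; train_one_epoch/evaluate are hypothetical
# stand-ins for your own training step and validation metric.
import torch

best_metric, best_epoch, patience = 0.0, 0, 10
for epoch in range(500):
    train_one_epoch(model, train_loader)
    metric = evaluate(model, val_loader)
    if metric > best_metric:
        best_metric, best_epoch = metric, epoch
        torch.save(model.state_dict(), "best.pt")  # keep the best checkpoint
    elif epoch - best_epoch >= patience:
        print(f"Stopping at epoch {epoch}; best {best_metric:.4f} at epoch {best_epoch}")
        break
```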
