Add finetune function #68

Closed · wants to merge 14 commits
Conversation

Hanyi11 (Contributor) commented May 25, 2024

No description provided.

codecov-commenter commented May 25, 2024

⚠️ Please install the Codecov GitHub app to ensure uploads and comments are reliably processed by Codecov.

Codecov Report

Attention: Patch coverage is 0% with 60 lines in your changes missing coverage. Please review.

Project coverage is 0.00%. Comparing base (bea5ae0) to head (1a6e349).
Report is 12 commits behind head on main.

Files with missing lines                                 Patch %   Lines
.../membrain_seg/segmentation/training/optim_utils.py     0.00%   27 Missing ⚠️
src/membrain_seg/segmentation/finetune.py                 0.00%   23 Missing ⚠️
src/membrain_seg/segmentation/cli/fine_tune_cli.py        0.00%    6 Missing ⚠️
src/membrain_seg/segmentation/cli/__init__.py             0.00%    1 Missing ⚠️
src/membrain_seg/segmentation/cli/ske_cli.py              0.00%    1 Missing ⚠️
src/membrain_seg/segmentation/skeletonize.py              0.00%    1 Missing ⚠️
src/membrain_seg/segmentation/train.py                    0.00%    1 Missing ⚠️

❗ Your organization needs to install the Codecov GitHub app to enable full functionality.

Additional details and impacted files
@@          Coverage Diff           @@
##            main     #68    +/-   ##
======================================
  Coverage   0.00%   0.00%            
======================================
  Files         40      48     +8     
  Lines       1411    1727   +316     
======================================
- Misses      1411    1727   +316     

☔ View full report in Codecov by Sentry.



@cli.command(name="finetune", no_args_is_help=True)
def finetune(
Collaborator:

This looks good!
Can we have a finetune and a finetune_advanced command, similar to the training options, though?

I think it would be nice not to overwhelm standard users with the full choice of parameters, and instead set most standard parameters by default.
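
A minimal sketch of such a split, assuming a Typer app named cli (the PR's @cli.command decorator suggests Typer) and a hypothetical run_finetuning helper standing in for the PR's actual entry point:

import typer

cli = typer.Typer()


def run_finetuning(pretrained_checkpoint_path, data_dir, **overrides):
    """Hypothetical stand-in for the PR's actual finetuning logic."""
    ...


@cli.command(name="finetune", no_args_is_help=True)
def finetune(
    pretrained_checkpoint_path: str = typer.Option(
        ..., help="Path to the pretrained model checkpoint."
    ),
    data_dir: str = typer.Option(..., help="Directory with the finetuning data."),
):
    """Fine-tune with sensible defaults; advanced parameters stay fixed."""
    run_finetuning(pretrained_checkpoint_path, data_dir)


@cli.command(name="finetune_advanced", no_args_is_help=True)
def finetune_advanced(
    pretrained_checkpoint_path: str = typer.Option(
        ..., help="Path to the pretrained model checkpoint."
    ),
    data_dir: str = typer.Option(..., help="Directory with the finetuning data."),
    learning_rate: float = typer.Option(1e-5, help="Finetuning learning rate."),
    max_epochs: int = typer.Option(100, help="Maximum number of epochs."),
    batch_size: int = typer.Option(2, help="Training batch size."),
):
    """Expose the full parameter set for advanced users."""
    run_finetuning(
        pretrained_checkpoint_path,
        data_dir,
        learning_rate=learning_rate,
        max_epochs=max_epochs,
        batch_size=batch_size,
    )

The option names and default values here are placeholders; the point is only the two-tier command layout, mirroring the existing train / train_advanced pattern the comment refers to.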

verbose=True, # Print a message when a checkpoint is saved
)

class ToleranceCallback(Callback):
Collaborator:

Can we put this class into a separate file? Does it fit into optim_utils?
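
The diff excerpt shows only the class line, so the body below is purely an assumption, not the PR's actual implementation. One plausible shape for a tolerance-style Lightning callback is to stop training once a monitored metric drifts past a threshold above its best value:

from pytorch_lightning.callbacks import Callback


class ToleranceCallback(Callback):
    """Illustrative sketch: stop training when the monitored metric
    exceeds its best value so far by more than `tolerance`."""

    def __init__(self, metric_name: str = "val_loss", tolerance: float = 0.1):
        self.metric_name = metric_name
        self.tolerance = tolerance
        self.best = None

    def on_validation_epoch_end(self, trainer, pl_module):
        value = trainer.callback_metrics.get(self.metric_name)
        if value is None:
            return
        value = float(value)
        if self.best is None or value < self.best:
            self.best = value  # track the best (lowest) metric seen so far
        elif value - self.best > self.tolerance:
            trainer.should_stop = True  # metric drifted beyond tolerance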

# Monitor learning rate changes
lr_monitor = LearningRateMonitor(logging_interval="epoch", log_momentum=True)

class PrintLearningRate(Callback):
Collaborator:

Can we also put this into another file?
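
Again illustrative only, since the diff shows just the class line: a callback like this could print the current learning rate at each epoch start, and would be registered on the Trainer alongside the LearningRateMonitor shown above.

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import Callback, LearningRateMonitor


class PrintLearningRate(Callback):
    """Assumed body: print each optimizer's learning rate at epoch start."""

    def on_train_epoch_start(self, trainer, pl_module):
        for optimizer in trainer.optimizers:
            for param_group in optimizer.param_groups:
                print(
                    f"Epoch {trainer.current_epoch}: lr = {param_group['lr']:.2e}"
                )


# Monitor learning rate changes (as in the PR) and print them each epoch.
lr_monitor = LearningRateMonitor(logging_interval="epoch", log_momentum=True)
trainer = Trainer(callbacks=[lr_monitor, PrintLearningRate()])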

LorenzLamm closed this Sep 17, 2024