
Add distillation script for faster inference #54


Merged — Flova merged 45 commits into main from feature/destillation on Apr 16, 2025

Conversation

@Flova Flova (Member) commented Jan 28, 2025

Proposed changes

This adds an experimental script that distills a diffusion model into one that performs only a single denoising step.
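
For context, here is a minimal, hypothetical sketch of what single-step distillation can look like in PyTorch. All names (`TinyDenoiser`, `sample_multistep`, `distill_batch`) are illustrative only and are not taken from this PR's script, which has its own model interfaces, sampler, and loss:

```python
# Illustrative toy sketch of single-step diffusion distillation;
# not the PR's actual code.
import torch
import torch.nn as nn


class TinyDenoiser(nn.Module):
    """Toy stand-in for a timestep-conditioned diffusion denoiser."""

    def __init__(self, dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, 128), nn.ReLU(), nn.Linear(128, dim)
        )

    def forward(self, x, t):
        # Append the timestep as an extra input feature.
        return self.net(torch.cat([x, t[:, None]], dim=-1))


@torch.no_grad()
def sample_multistep(model, x, steps=50):
    """Toy deterministic multi-step sampler; a real script would use e.g. DDIM."""
    for t in torch.linspace(1.0, 0.0, steps):
        x = x + model(x, t.expand(x.shape[0])) / steps
    return x


def distill_batch(teacher, student, optimizer, batch_size=16, dim=32):
    """One update: the student matches, in a single forward pass, what the
    teacher produces from the same noise over many sampling steps."""
    noise = torch.randn(batch_size, dim)
    target = sample_multistep(teacher, noise)      # slow multi-step teacher rollout
    pred = student(noise, torch.ones(batch_size))  # one student step
    loss = torch.nn.functional.mse_loss(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


teacher, student = TinyDenoiser(), TinyDenoiser()
student.load_state_dict(teacher.state_dict())  # common: init student from teacher
opt = torch.optim.AdamW(student.parameters(), lr=1e-4)
print(distill_batch(teacher, student, opt))
```

The key idea is that the teacher's expensive multi-step sampling provides the regression target, so the distilled student only ever pays for a single forward pass at inference time.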

Checklist

  • Write documentation
  • Create issues for future work
  • This PR is on our DDLitLab project board

@Flova Flova changed the title from "Add destillation script for faster inference" to "Add distillation script for faster inference" on Jan 30, 2025
@@ -147,7 +177,7 @@
 )

 # Load the learning rate scheduler state if a checkpoint is provided
-if args.checkpoint is not None:
+if args.checkpoint is not None and False:
Contributor

This condition will always evaluate to False.
I assume you meant `if args.checkpoint is not None and args.checkpoint is not False`, in which case I would just write `if args.checkpoint:`.

But I don't really see when it could be `False` anyway, since it is defined as a `str` argument.

Member Author

It was just a hack to deactivate this code path; I will clean it up with a separate flag. Loading the learning rate scheduler state when starting from a pretrained model is not desirable, because the end of the schedule was already reached during pretraining. In contrast, you do want to resume the schedule if e.g. the training was interrupted.
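
For illustration, the cleanup could look roughly like the following. This is only a sketch: the `--resume` flag name and the checkpoint keys are hypothetical, not necessarily what the PR ended up using:

```python
# Hypothetical sketch of the extra flag described above; all names are
# illustrative, not taken from this PR.
import argparse
import torch

parser = argparse.ArgumentParser()
parser.add_argument("--checkpoint", type=str, default=None)
parser.add_argument(
    "--resume",
    action="store_true",
    help="Resume an interrupted run: also restore the LR scheduler state. "
    "Leave unset when fine-tuning from a pretrained checkpoint, whose "
    "schedule already ran to its end.",
)
args = parser.parse_args()

model = torch.nn.Linear(8, 8)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
lr_scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

if args.checkpoint is not None:
    state = torch.load(args.checkpoint)
    model.load_state_dict(state["model"])
    # Only resume the schedule for interrupted runs; when starting from a
    # pretrained model, we begin a fresh schedule instead.
    if args.resume:
        lr_scheduler.load_state_dict(state["lr_scheduler"])
```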

@Flova Flova merged commit 71e8c70 into main Apr 16, 2025
4 of 5 checks passed
@Flova Flova deleted the feature/destillation branch April 16, 2025 18:26
@github-project-automation github-project-automation bot moved this from In progress to Done in DDLitLab Apr 16, 2025