# Awesome-Diffusion-Distillation

This repo collects distillation methods for diffusion models. PRs adding works (papers, repositories) missing from the list are welcome.

## Contents

- [Papers](#papers)
  - [2023](#2023)
  - [2022](#2022)
  - [2021](#2021)

## Papers

### 2023

- [ICML] On Architectural Compression of Text-to-Image Diffusion Models [code]
- [ICML] Consistency Models [code] (core recipe sketched after this list)
- [ICML] Accelerating Diffusion-based Combinatorial Optimization Solvers by Progressive Distillation [code]
- [ICML] Towards Safe Self-Distillation of Internet-Scale Text-to-Image Diffusion Models [code]
- [ICME] Accelerating Diffusion Sampling with Classifier-based Feature Distillation [code]
- [CVPR] On Distillation of Guided Diffusion Models [code]
- [NeurIPS] SnapFusion: Text-to-Image Diffusion Model on Mobile Devices within Two Seconds
- [NeurIPS] Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models
- [PMLR] Fast Sampling of Diffusion Models via Operator Learning [code]
- [arXiv] BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping
- [arXiv] TRACT: Denoising Diffusion Models with Transitive Closure Time-Distillation
- [arXiv] Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling [code]
- [arXiv] Improved Techniques for Training Consistency Models [code]
- [arXiv] Adversarial Diffusion Distillation [code]
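
Most entries above share one recipe: a student network learns to reproduce, in one or a few steps, what a pretrained teacher diffusion model computes over many steps. Below is a minimal, self-contained sketch of a consistency-distillation-style training loop in this spirit; the `ToyDenoiser` MLP, the linear `x_t = x0 + t * eps` noise schedule, and all constants are illustrative assumptions, not code from any listed paper.

```python
# Hypothetical sketch of consistency-style distillation on a toy problem.
import copy
import torch
import torch.nn as nn

class ToyDenoiser(nn.Module):
    """Stand-in for a diffusion network: maps (x_t, t) to a clean estimate."""
    def __init__(self, dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 64), nn.SiLU(),
                                 nn.Linear(64, dim))

    def forward(self, x_t, t):
        return self.net(torch.cat([x_t, t[:, None]], dim=-1))

teacher = ToyDenoiser()                        # pretend this is pretrained
for p in teacher.parameters():
    p.requires_grad_(False)

student = ToyDenoiser()
student.load_state_dict(teacher.state_dict())  # common initialization
target_net = copy.deepcopy(student)             # EMA "theta-minus" copy

opt = torch.optim.Adam(student.parameters(), lr=1e-4)
for _ in range(100):
    x0 = torch.randn(32, 16)             # stand-in for real training data
    noise = torch.randn_like(x0)
    t = torch.rand(32) * 0.8 + 0.2       # noise level in (0.2, 1.0)
    x_t = x0 + t[:, None] * noise        # toy forward process

    with torch.no_grad():
        # One solver step with the frozen teacher: estimate x0, then
        # re-noise along the implied noise direction to level t_prev.
        t_prev = (t - 0.05).clamp(min=0.0)
        x0_pred = teacher(x_t, t)
        eps_pred = (x_t - x0_pred) / t[:, None]
        x_prev = x0_pred + t_prev[:, None] * eps_pred
        target = target_net(x_prev, t_prev)

    # Consistency loss: the student's output at (x_t, t) should match the
    # target network's output one solver step earlier on the same trajectory.
    loss = torch.mean((student(x_t, t) - target) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

    # EMA update of the target network.
    with torch.no_grad():
        for p_t, p_s in zip(target_net.parameters(), student.parameters()):
            p_t.mul_(0.99).add_(p_s, alpha=0.01)
```

A real consistency model additionally parameterizes the network so that f(x, 0) = x holds exactly; the toy version above omits that boundary condition for brevity.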

### 2022

- [ICLR] Progressive Distillation for Fast Sampling of Diffusion Models [code] (one distillation round is sketched below)
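
Progressive distillation halves the sampler's step count each round: the student is trained so that one of its steps reproduces two consecutive teacher steps, and the converged student then becomes the next round's teacher. A hypothetical, self-contained sketch of one round under the same toy `x_t = x0 + t * eps` setup as above (all names and constants are assumptions):

```python
# Hypothetical sketch of one progressive-distillation round on a toy problem.
import torch
import torch.nn as nn

def make_denoiser(dim: int = 16):
    return nn.Sequential(nn.Linear(dim + 1, 64), nn.SiLU(), nn.Linear(64, dim))

def denoise(net, x_t, t):
    # Predict a clean sample x0 from a noisy sample and its noise level.
    return net(torch.cat([x_t, t[:, None]], dim=-1))

def ddim_step(net, x_t, t, t_next):
    # Toy deterministic update: estimate x0, then re-noise to level t_next
    # along the noise direction implied by the current sample.
    x0_pred = denoise(net, x_t, t)
    eps_pred = (x_t - x0_pred) / t[:, None]
    return x0_pred + t_next[:, None] * eps_pred

teacher, student = make_denoiser(), make_denoiser()
student.load_state_dict(teacher.state_dict())
for p in teacher.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(student.parameters(), lr=1e-4)
for _ in range(100):
    x0 = torch.randn(32, 16)             # stand-in for real training data
    noise = torch.randn_like(x0)
    t = torch.rand(32) * 0.8 + 0.2       # noise level in (0.2, 1.0)
    t_mid, t_end = t / 2, torch.zeros_like(t)
    x_t = x0 + t[:, None] * noise

    with torch.no_grad():
        # Two teacher steps: t -> t/2 -> 0.
        x_mid = ddim_step(teacher, x_t, t, t_mid)
        target = ddim_step(teacher, x_mid, t_mid, t_end)

    # One student step must cover the same distance: t -> 0.
    loss = torch.mean((ddim_step(student, x_t, t, t_end) - target) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()
# After convergence, the student becomes the teacher for the next halving round.
```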

### 2021

- [arXiv] Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed [code]