Advances in Generative Models and Knowledge Distillation

Research on generative models and knowledge distillation is evolving rapidly, with a focus on improving the efficiency, scalability, and performance of these models. Recent work has centered on enhancing the training of flow-based generative models, examining the role of dataset size in knowledge distillation, and proposing new approaches to the distillation process itself. Notably, researchers have introduced methods that optimize generative-model training through semi-discrete optimal transport and adaptive discretization. There has also been a surge of interest in applying knowledge distillation to large language models and diffusion models, where techniques such as the alpha-mixture assistant distribution and bidirectional concept distillation show promising results. These advances stand to benefit applications ranging from image and text generation to dataset distillation.
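
To make the assistant-distribution idea concrete, here is a minimal, hedged sketch of a distillation loss that mixes teacher and student predictions into an intermediate target. The function name, the mixture weight `alpha`, and the temperature `T` are illustrative assumptions, not the exact AMiD formulation.

```python
import torch
import torch.nn.functional as F

def mixture_assistant_kd_loss(student_logits, teacher_logits, alpha=0.5, T=2.0):
    """Distill the student against a mixture 'assistant' distribution.

    The assistant is a convex combination of teacher and student
    probabilities, which softens the gap the student must close.
    Generic sketch of the mixture-assistant idea, not the exact
    AMiD objective.
    """
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    p_student = F.softmax(student_logits / T, dim=-1)
    # Assistant sits between teacher and student; detach so it acts
    # as a fixed target for this step.
    p_assist = (alpha * p_teacher + (1.0 - alpha) * p_student).detach()
    log_q = F.log_softmax(student_logits / T, dim=-1)
    # KL(assistant || student), scaled by T^2 as in standard KD.
    kl = torch.sum(p_assist * (torch.log(p_assist + 1e-12) - log_q), dim=-1)
    return (T * T) * kl.mean()
```

In this sketch, alpha = 1 recovers vanilla teacher-only distillation, while smaller values keep the target closer to the student's current distribution, which tends to stabilize early training.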

Some noteworthy papers in this area include the following. AlignFlow introduces a semi-discrete optimal transport scheme that improves the training of flow-based generative models. AMiD proposes a unified knowledge-distillation framework built on an alpha-mixture assistant distribution, demonstrating superior performance and training stability. GuideFlow3D presents a principled approach to appearance transfer using optimization-guided rectified flow, achieving high-quality results and outperforming baselines.
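
For intuition about why optimal-transport pairing helps flow-based training, the sketch below pairs noise and data within a minibatch via an assignment problem before taking a standard flow-matching step. This is a simplified minibatch stand-in for semi-discrete OT; the helper names, model signature, and linear interpolation path are assumptions rather than the AlignFlow algorithm.

```python
import torch
from scipy.optimize import linear_sum_assignment

def ot_paired_flow_matching_step(model, x_data, optimizer):
    """One flow-matching step with OT-matched noise/data pairs.

    Solving a minibatch assignment problem (a cheap proxy for
    semi-discrete OT) yields straighter noise-to-data paths than
    random pairing. Illustrative sketch only.
    """
    noise = torch.randn_like(x_data)
    # Squared-distance cost between every noise sample and data point.
    cost = torch.cdist(noise.flatten(1), x_data.flatten(1)).pow(2)
    _, col = linear_sum_assignment(cost.detach().cpu().numpy())
    x_paired = x_data[torch.as_tensor(col)]  # data reordered to match noise
    t = torch.rand(x_data.size(0), device=x_data.device)
    t_b = t.view(-1, *([1] * (x_data.dim() - 1)))
    x_t = (1.0 - t_b) * noise + t_b * x_paired   # linear interpolation path
    v_target = x_paired - noise                  # constant-velocity target
    loss = torch.mean((model(x_t, t) - v_target) ** 2)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The assignment step is the only change from ordinary flow matching; with straighter paths, fewer sampling steps are typically needed at inference time.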

Sources

AlignFlow: Improving Flow-based Generative Models with Semi-Discrete Optimal Transport

Revisiting Knowledge Distillation: The Hidden Role of Dataset Size

AMiD: Knowledge Distillation for LLMs with $\alpha$-mixture Assistant Distribution

GuideFlow3D: Optimization-Guided Rectified Flow For Appearance Transfer

One-step Diffusion Models with Bregman Density Ratio Matching

Diffusion Models as Dataset Distillation Priors

Adaptive Discretization for Consistency Models

GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver

Knowledge Distillation of Uncertainty using Deep Latent Factor Model

Optimization Benchmark for Diffusion Models on Dynamical Systems

EchoDistill: Bidirectional Concept Distillation for One-Step Diffusion Personalization

AlphaFlow: Understanding and Improving MeanFlow Models
