The field of generative modeling is advancing rapidly, with a focus on improving the efficiency, stability, and quality of generated samples. Recent work integrates optimal transport theory, electrostatic models, and flow matching techniques to improve model performance. Notably, idempotent generative networks, score-based distillation, and mean flow models have delivered significant gains in sample quality and inference speed. Research has also extended generative models to discrete data generation, long-tailed distributions, and Riemannian manifolds, demonstrating the versatility and potential of these models.
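To make the flow-matching idea concrete, the sketch below shows one conditional flow-matching training step with linear interpolation paths; the `VelocityNet` backbone and `flow_matching_loss` function are illustrative stand-ins under generic PyTorch assumptions, not the architecture or API of any paper surveyed here.

```python
# Minimal sketch of a conditional flow-matching training step (illustrative only).
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    """Toy velocity field v_theta(x_t, t) for low-dimensional data; stands in for any backbone."""
    def __init__(self, dim: int = 2, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Condition the velocity prediction on time by concatenating t as a feature.
        return self.net(torch.cat([x_t, t[:, None]], dim=-1))

def flow_matching_loss(model: VelocityNet, x1: torch.Tensor) -> torch.Tensor:
    """Regress the model onto the constant target velocity of a straight-line path."""
    x0 = torch.randn_like(x1)                        # noise endpoint
    t = torch.rand(x1.shape[0])                      # uniform time in [0, 1]
    x_t = (1 - t[:, None]) * x0 + t[:, None] * x1    # linear interpolant between noise and data
    target_v = x1 - x0                               # velocity of the straight-line path
    return ((model(x_t, t) - target_v) ** 2).mean()
```

The straight-line interpolant used here is the simplest choice and is where the connection to optimal transport typically enters: linear paths correspond to constant-speed displacement between the noise and data samples.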
Some noteworthy papers in this area include the following. Score-based Idempotent Distillation of Diffusion Models proposes a method for distilling idempotent models from diffusion models, enabling faster inference and achieving state-of-the-art results on image datasets. Overclocking Electrostatic Generative Models introduces a distillation framework for accelerating electrostatic generative models, reaching near-teacher or superior sample quality at reduced computational cost. Riemannian Consistency Model extends consistency models to Riemannian manifolds, enabling few-step generation with superior generative quality on non-Euclidean manifolds.
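As a point of reference for the idempotent distillation result, idempotence means that applying the generator twice gives the same output as applying it once, i.e., f(f(z)) = f(z). The snippet below is a minimal, hypothetical illustration of measuring that gap; actual training objectives combine terms like this with reconstruction and regularization losses and gradient-stopping tricks not shown here.

```python
# Minimal, hypothetical sketch: measuring how far a generator f is from
# idempotence, i.e. ||f(f(z)) - f(z)||^2. Real training objectives add further terms.
import torch
import torch.nn as nn

def idempotence_gap(f: nn.Module, z: torch.Tensor) -> torch.Tensor:
    """Return the mean squared gap between f(f(z)) and f(z).

    A perfectly idempotent generator maps any input onto its output manifold
    and leaves points already on that manifold unchanged, so this gap is 0.
    """
    fz = f(z)
    return ((f(fz) - fz) ** 2).mean()

# Usage sketch with a toy linear "generator" (illustrative only):
if __name__ == "__main__":
    f = nn.Linear(8, 8)
    z = torch.randn(16, 8)
    print(idempotence_gap(f, z).item())
```

This property is what allows single-step generation: once f is (approximately) idempotent, one application of f already lands on the model's output distribution, so no iterative refinement is required at inference time.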