Efficient Diffusion Models and Speech Enhancement

Research on diffusion models is increasingly focused on improving efficiency and cutting computational cost. To accelerate sampling, recent work explores shared sampling schemes, cascaded generation, and discrete-time formulations; autoguidance, low-resolution conditioning, and semantic-aware sampling in particular have shown promising reductions in sampling cost alongside improved generation quality, making diffusion models more practical for real-world use. In speech enhancement, one-step generative modeling and learnable sampler distillation have emerged as effective techniques for reducing inference latency while improving speech quality. Overall, the field is shifting toward diffusion models and speech enhancement methods that are more efficient, effective, and scalable. Minimal sketches of three of these ideas follow the paper highlights below.

Noteworthy papers:

LowDiff proposes a diffusion framework for efficient sampling with low-resolution conditioning.

SAGE introduces a semantic-aware shared sampling framework for efficient diffusion.

ArtiFree systematically studies artifact prediction and reduction in diffusion-based speech enhancement.

Learnable Sampler Distillation proposes an approach to training fast, high-fidelity samplers for discrete diffusion models.
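To make the low-resolution-conditioning idea concrete, here is a minimal sketch of the general cascaded pattern suggested by LowDiff's title; it is not the paper's algorithm. `sample_low_res` and `sample_high_res` are assumed pretrained samplers (producing and consuming NCHW tensors), and only the wiring between them is shown.

```python
# Hedged sketch of cascaded sampling with a low-resolution condition
# (the general pattern, not LowDiff's exact algorithm). The two sampler
# callables are hypothetical, assumed pretrained.
import torch
import torch.nn.functional as F

@torch.no_grad()
def low_res_conditioned_sample(sample_low_res, sample_high_res, scale=4):
    # Stage 1: a cheap draft at low resolution, where each denoising
    # step touches far fewer pixels.
    x_lo = sample_low_res()  # expected shape: (N, C, H, W)
    # Upsample the draft so it can condition the full-resolution model.
    cond = F.interpolate(x_lo, scale_factor=scale, mode="bilinear")
    # Stage 2: conditioned on the draft, the high-resolution sampler can
    # run a much shorter denoising schedule than unconditional generation.
    return sample_high_res(cond)
```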
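For the one-step speech-enhancement direction, average-velocity flow matching (the idea named in "Compose Yourself") trains a network to predict the average velocity of the flow over an interval, so the whole trajectory can be traversed in a single network call. The sketch below is a hedged illustration under assumed conventions (clean speech at t = 0, noisy mixture at t = 1); `avg_vel_net` and its signature are hypothetical.

```python
# Minimal one-step enhancement sketch, assuming a trained average-velocity
# network u(x, r, t, cond) for a flow with clean speech at t=0 and the
# noisy mixture at t=1 (a common convention, not necessarily the paper's).
import torch

@torch.no_grad()
def enhance_one_step(avg_vel_net, noisy_spec):
    batch = noisy_spec.shape[0]
    r = noisy_spec.new_zeros(batch)  # start of the integration interval
    t = noisy_spec.new_ones(batch)   # end of the interval (noisy endpoint)
    # Displacement over [r, t] equals (t - r) times the average velocity;
    # over the full interval [0, 1] one network call recovers the estimate:
    #   x_0 = x_1 - 1 * u(x_1, 0, 1)
    return noisy_spec - avg_vel_net(noisy_spec, r, t, cond=noisy_spec)
```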
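Finally, one way to read learnable sampler distillation is that the few-step sampler's timesteps are themselves trainable parameters, optimized so the fast student matches a many-step teacher. The sketch below illustrates that general pattern under stated assumptions (`student_step` and `teacher_logits` are hypothetical, and the step function is assumed differentiable in t); it is not the paper's actual objective.

```python
# Hedged sketch: a few-step sampler whose timestep schedule is learnable,
# distilled against a many-step teacher's output distribution.
import torch

class LearnableTimesteps(torch.nn.Module):
    """K learnable timesteps constrained to a monotone grid in (0, 1]."""
    def __init__(self, k=4):
        super().__init__()
        self.logits = torch.nn.Parameter(torch.zeros(k))

    def forward(self):
        # Softmax gives positive increments summing to 1; their cumulative
        # sum is a strictly increasing schedule ending at t = 1.
        return torch.cumsum(torch.softmax(self.logits, dim=0), dim=0)

def distillation_loss(student_step, teacher_logits, x_T, schedule):
    """Run the student along the learnable schedule (from t=1 toward t=0)
    and penalize divergence from the teacher's final token distribution."""
    x = x_T
    for t in schedule().flip(0):
        x = student_step(x, t)  # one denoising update at learnable time t
    # Discrete diffusion models output categorical logits over a vocabulary,
    # so a KL between teacher and student distributions is a natural target.
    return torch.nn.functional.kl_div(
        torch.log_softmax(x, dim=-1),
        torch.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
```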

Sources

Autoguided Online Data Curation for Diffusion Model Training

LowDiff: Efficient Diffusion Sampling with Low-Resolution Condition

From Independence to Interaction: Speaker-Aware Simulation of Multi-Speaker Conversational Timing

SAGE: Semantic-Aware Shared Sampling for Efficient Diffusion

Compose Yourself: Average-Velocity Flow Matching for One-Step Speech Enhancement

Discrete-time diffusion-like models for speech synthesis

ArtiFree: Detecting and Reducing Generative Artifacts in Diffusion-based Speech Enhancement

Learnable Sampler Distillation for Discrete Diffusion Models
