Advances in Image Restoration and Generation

The field of image restoration and generation is advancing rapidly. On the restoration side, recent work focuses on improving the fidelity and efficiency of all-in-one restoration models, with approaches such as Self-Improved Privilege Learning (SIPL) and IRBridge showing promising results. On the generation side, advances in flow matching and diffusion models have delivered notable gains in sample quality and speed: STORK improves mid-NFE sampling fidelity with a novel ODE solver, Graph Flow Matching enhances generation with neighbor-aware flow fields, and Diff2Flow enables efficient finetuning of pre-trained diffusion models into flow matching models, with state-of-the-art results reported on several benchmarks. Complementary techniques such as Contrastive Flow Matching and Aligning Latent Spaces with Flow Priors improve condition separation and latent-space alignment, respectively. Together, these developments stand to affect a wide range of applications, from image and video generation to image restoration and editing. A minimal sketch of the flow matching mechanics shared by several of these papers follows.
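To make the shared mechanics concrete, the sketch below shows the standard conditional flow matching objective (regressing the constant velocity of a straight noise-to-data path) together with a plain Euler sampler, in PyTorch. The network `VelocityNet`, the helper names, and all hyperparameters are illustrative assumptions, not code from any of the cited papers; solvers such as the one proposed in STORK target the same ODE integral as the Euler loop but with a higher-order scheme.

```python
# Minimal conditional flow matching sketch (PyTorch).
# All names and hyperparameters here are illustrative, not from the cited papers.
import torch
import torch.nn as nn


class VelocityNet(nn.Module):
    """Toy velocity field v_theta(x_t, t) for flat feature vectors."""

    def __init__(self, dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Condition on t by concatenating it as an extra input feature.
        return self.net(torch.cat([x_t, t[:, None]], dim=-1))


def flow_matching_loss(model: VelocityNet, x1: torch.Tensor) -> torch.Tensor:
    """Regress the velocity of the straight path from noise x0 to data x1."""
    x0 = torch.randn_like(x1)                      # source: standard Gaussian
    t = torch.rand(x1.shape[0], device=x1.device)  # t ~ U(0, 1)
    x_t = (1 - t[:, None]) * x0 + t[:, None] * x1  # linear interpolant
    target = x1 - x0                               # constant path velocity
    return ((model(x_t, t) - target) ** 2).mean()


@torch.no_grad()
def euler_sample(model: VelocityNet, n: int, dim: int, steps: int = 50) -> torch.Tensor:
    """Integrate dx/dt = v_theta(x, t) from t=0 (noise) to t=1 (data)."""
    x = torch.randn(n, dim)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((n,), i * dt)
        x = x + dt * model(x, t)  # first-order Euler step
    return x


if __name__ == "__main__":
    model = VelocityNet(dim=32)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x1 = torch.randn(64, 32)  # stand-in for a batch of data
    opt.zero_grad()
    loss = flow_matching_loss(model, x1)
    loss.backward()
    opt.step()
    print(f"flow matching loss: {loss.item():.4f}")
    print(euler_sample(model, n=4, dim=32).shape)  # torch.Size([4, 32])
```

Because the interpolant is a straight line, the regression target is the constant velocity x1 - x0; the relative straightness of the learned paths is one reason flow matching models admit accurate sampling with few integration steps, and why better ODE solvers pay off in the mid-NFE regime.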

Sources

Boosting All-in-One Image Restoration via Self-Improved Privilege Learning

STORK: Improving the Fidelity of Mid-NFE Sampling for Diffusion and Flow Matching Models

IRBridge: Solving Image Restoration Bridge with Pre-trained Generative Diffusion Models

Graph Flow Matching: Enhancing Image Generation with Neighbor-Aware Flow Fields

Inference Acceleration of Autoregressive Normalizing Flows by Selective Jacobi Decoding

An Introduction to Flow Matching and Diffusion Models

Diff2Flow: Training Flow Matching Models via Diffusion Model Alignment

SFBD Flow: A Continuous-Optimization Framework for Training Diffusion Models with Noisy Samples

ControlMambaIR: Conditional Controls with State-Space Model for Image Restoration

Solving Inverse Problems with FLAIR

Interaction Field Matching: Overcoming Limitations of Electrostatic Models

Implicit Regularization of the Deep Inverse Prior Trained with Inertia

Negative-Guided Subject Fidelity Optimization for Zero-Shot Subject-Driven Generation

On the Closed-Form of Flow Matching: Generalization Does Not Arise from Target Stochasticity

Aligning Latent Spaces with Flow Priors

Contrastive Flow Matching
