Advances in Image Editing and Generation

The field of image editing and generation is moving toward more precise and controllable methods. Recent work has focused on improving the quality and fidelity of image editing, particularly for local editing, object removal, and texture transfer. One notable trend is the use of diffusion models, which have shown strong results in producing high-fidelity edits and realistic images. Another line of research explores new forms of randomness and noise schedules to enhance generative diversity and quality. Noteworthy papers include PixPerfect, which introduces a pixel-level refinement framework for seamless local editing; Refaçade, which proposes a method for editing objects with a given reference texture; NeuralRemaster, which introduces a phase-preserving diffusion process for structure-aligned generation; and Highly Efficient Test-Time Scaling, which explores text embedding perturbation to enhance generative diversity and quality.
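
The idea behind test-time text embedding perturbation can be illustrated with a minimal sketch: encode the prompt once, then sample several noisy copies of the embedding to obtain a more diverse set of candidate images. The snippet below uses the Hugging Face diffusers Stable Diffusion pipeline only as a stand-in; the model checkpoint, the Gaussian perturbation, and the scale `sigma` are illustrative assumptions, not details taken from the paper.

```python
import torch
from diffusers import StableDiffusionPipeline

# Stand-in pipeline; the paper's actual model and settings may differ.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "a photo of a red fox in the snow"

# Encode the prompt once with the pipeline's CLIP text encoder.
tokens = pipe.tokenizer(
    prompt,
    padding="max_length",
    max_length=pipe.tokenizer.model_max_length,
    truncation=True,
    return_tensors="pt",
).input_ids.to(pipe.device)
with torch.no_grad():
    prompt_embeds = pipe.text_encoder(tokens)[0]

# Sample several perturbed copies of the text embedding and generate
# one candidate image per copy.
sigma = 0.05  # perturbation scale -- an illustrative choice
candidates = []
for _ in range(4):
    noisy_embeds = prompt_embeds + sigma * torch.randn_like(prompt_embeds)
    image = pipe(prompt_embeds=noisy_embeds, num_inference_steps=30).images[0]
    candidates.append(image)
```

In this kind of test-time scaling, the perturbed candidates would then be ranked by a quality or preference score and the best one kept; the ranking step is omitted here.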

Sources

PixPerfect: Seamless Latent Diffusion Local Editing with Discriminative Pixel-Space Refinement

Highly Efficient Test-Time Scaling for T2I Diffusion Models with Text Embedding Perturbation

Refaçade: Editing Object with Given Reference Texture

NeuralRemaster: Phase-Preserving Diffusion for Structure-Aligned Generation
