The field of person re-identification and identity-preserving generation is moving towards more sophisticated and controllable models. Researchers are exploring new approaches to address the challenges of modality discrepancies, data privacy, and annotation costs. A key direction is the development of unified pipelines that generate high-quality images and videos while preserving identity consistency. Another important trend is the use of debiasing techniques to mitigate modality bias, particularly between visible and infrared imagery, and improve model generalization. Noteworthy papers in this area include:
- OmniPerson, which introduces a unified identity-preserving pedestrian generation pipeline that achieves state-of-the-art results, excelling in both visual fidelity and identity consistency.
- Dual-level Modality Debiasing Learning, which proposes a framework that applies debiasing at both the model and optimization levels to tackle modality discrepancy in unsupervised visible-infrared person re-identification (see the sketch after this list for the general idea).
- Not All Birds Look The Same, which presents a benchmark for evaluating identity-preserving generation of birds and demonstrates the limitations of current models in this domain.
- Identity Clue Refinement and Enhancement, which proposes a novel network to mine and utilize modality-specific identity-aware knowledge for visible-infrared person re-identification.
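
To make the modality-debiasing theme concrete, the sketch below shows one common way a visible-infrared discrepancy can be reduced: a shared encoder is trained with an identity loss plus a term that pulls the per-identity feature centers of the two modalities together. This is a minimal illustration of the general principle, not the dual-level scheme of the paper above; the encoder, classifier, loss weight, and center-alignment term are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def modality_center_alignment(feats, labels, modality):
    """Pull per-identity feature centers of the two modalities together.

    feats:    (N, D) L2-normalized embeddings from a shared encoder
    labels:   (N,)   person-identity labels (pseudo-labels in the unsupervised case)
    modality: (N,)   0 = visible, 1 = infrared
    """
    loss, count = feats.new_zeros(()), 0
    for pid in labels.unique():
        vis_mask = (labels == pid) & (modality == 0)
        ir_mask = (labels == pid) & (modality == 1)
        if vis_mask.any() and ir_mask.any():
            # Distance between the visible and infrared centers of the same identity
            vis_center = feats[vis_mask].mean(dim=0)
            ir_center = feats[ir_mask].mean(dim=0)
            loss = loss + (vis_center - ir_center).pow(2).sum()
            count += 1
    return loss / max(count, 1)

def training_step(encoder, classifier, images, labels, modality, lambda_align=0.5):
    """One step combining an identity loss with the modality-alignment term."""
    feats = F.normalize(encoder(images), dim=1)          # shared embedding space
    id_loss = F.cross_entropy(classifier(feats), labels)  # identity discrimination
    align_loss = modality_center_alignment(feats, labels, modality)
    return id_loss + lambda_align * align_loss
```

In an unsupervised setting such as the one targeted by these re-identification papers, the identity labels would typically be cluster-derived pseudo-labels rather than ground-truth annotations.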