Advancements in Deep Learning with Noisy Labels and Data Augmentation

Introduction

The field of deep learning is continually evolving, with recent developments focusing on improving the robustness and efficiency of models in the presence of noisy labels and diverse data sources.

General Direction

Research is moving toward novel loss functions and data loading strategies that can handle noisy labels and non-stationary conditions. One direction is asymmetric losses, which relax the strict symmetry condition imposed by traditional noise-robust losses while retaining noise tolerance. There is also growing interest in leveraging temporal training dynamics and approaches inspired by cognitive psychology to improve generalization and reduce overfitting.
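The idea of combining an unbounded fitting term with a bounded noise-robust term can be illustrated with a small sketch. The formulation below (cross-entropy paired with a mean-absolute-error term, with hypothetical weights `alpha` and `beta`) is a generic robust-loss combination for illustration, not the exact loss proposed in the cited paper.

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-12):
    # Standard CE: unbounded, so it fits confidently-mislabeled
    # samples aggressively.
    return -np.log(probs[np.arange(len(labels)), labels] + eps)

def mae(probs, labels):
    # Bounded term: caps the loss contribution of any single
    # (possibly noisy) sample.
    one_hot = np.eye(probs.shape[1])[labels]
    return np.abs(one_hot - probs).sum(axis=1)

def joint_loss(probs, labels, alpha=1.0, beta=1.0):
    # Hypothetical weighting of the two terms; the cited paper's
    # formulation may differ.
    return alpha * cross_entropy(probs, labels) + beta * mae(probs, labels)
```

Raising `beta` relative to `alpha` trades fitting speed on clean samples for robustness to mislabeled ones.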

Noteworthy Papers

  • Selective embedding, a new data loading strategy, improves model performance while reducing computational cost.
  • ChronoSelect, a framework built on a four-stage memory architecture that compresses each sample's prediction history into compact temporal distributions, achieves state-of-the-art performance in learning with noisy labels.
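Compressing prediction history into compact temporal distributions can be sketched as follows. The stage count, exponential-moving-average update, and `trajectory` helper here are illustrative assumptions, not ChronoSelect's actual mechanism.

```python
import numpy as np

class TemporalMemory:
    """Sketch: summarize per-epoch predictions into a fixed number of
    stage-wise distributions instead of storing the full history."""

    def __init__(self, num_samples, num_classes, num_stages=4, decay=0.7):
        self.decay = decay
        # One compact distribution per stage per sample: memory is
        # O(stages), not O(epochs).
        self.stages = np.full((num_stages, num_samples, num_classes),
                              1.0 / num_classes)

    def update(self, stage, probs):
        # Blend this epoch's softmax outputs into the running
        # distribution for the given training stage.
        self.stages[stage] = (self.decay * self.stages[stage]
                              + (1 - self.decay) * probs)

    def trajectory(self, i):
        # Compact temporal distribution for sample i: one probability
        # vector per stage. A predicted class that drifts across
        # stages is a plausible noisy-label signal.
        return self.stages[:, i]
```

Because each update is a convex combination of probability vectors, every stored distribution remains a valid distribution.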

Sources

Selective Embedding for Deep Learning

Combined Image Data Augmentations diminish the benefits of Adaptive Label Smoothing

Joint Asymmetric Loss for Learning with Noisy Labels

ChronoSelect: Robust Learning with Noisy Labels via Dynamics Temporal Memory
