Long-Tailed Distribution Research

Research on long-tailed distributions is converging on methods that address class imbalance in real-world datasets. Recent work improves tail-class performance through subsampled model soups, hybrid counterfactual-SMOTE augmentation, and binary cross-entropy based tripartite synergistic learning, with the shared goals of reducing head-class bias, improving feature compactness and separability, and balancing separability among classifier vectors. Notable papers in this area include:

LT-Soups proposes a two-stage model soups framework designed to generalize across diverse long-tailed regimes.

BCE3S achieves state-of-the-art results on several long-tailed benchmarks by combining binary cross-entropy based joint learning, contrastive learning, and uniform learning.

DirPA addresses prior shift in few-shot crop-type classification by proactively simulating an unknown label-distribution skew of the target domain during training.

Improving Long-Tailed Object Detection employs a two-stage Faster R-CNN architecture and proposes enhancements to the Balanced Group Softmax framework to mitigate class imbalance.
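The common operation behind model-soup approaches such as LT-Soups is weight averaging across fine-tuned models. The sketch below is a minimal illustration of that generic operation, assuming PyTorch; the `uniform_soup` helper and the idea of fine-tuning each ingredient on a class-balanced subsample are illustrative assumptions, not the LT-Soups implementation itself.

```python
import copy
import torch

def uniform_soup(models):
    """Uniformly average the parameters of several fine-tuned models that
    share one architecture -- the basic 'model soup' operation. In a
    long-tailed setting, each ingredient model might be fine-tuned on a
    different class-balanced subsample before averaging (an assumption
    made here for illustration)."""
    souped = copy.deepcopy(models[0])
    state_dicts = [m.state_dict() for m in models]
    averaged = {
        # Average each tensor across models, then cast back to its
        # original dtype (e.g. integer buffers such as batch counters).
        key: torch.stack([sd[key].float() for sd in state_dicts])
             .mean(dim=0)
             .to(state_dicts[0][key].dtype)
        for key in state_dicts[0]
    }
    souped.load_state_dict(averaged)
    return souped
```

In a two-stage recipe of the kind the paragraph above describes, such a soup would typically be followed by retraining or rebalancing the classifier head on the full long-tailed data; the details of that second stage are specific to each paper.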

Sources

LT-Soups: Bridging Head and Tail Classes via Subsampled Model Soups

Augmenting The Weather: A Hybrid Counterfactual-SMOTE Algorithm for Improving Crop Growth Prediction When Climate Changes

BCE3S: Binary Cross-Entropy Based Tripartite Synergistic Learning for Long-tailed Recognition

Mind the Gap: Bridging Prior Shift in Realistic Few-Shot Crop-Type Classification

Improving Long-Tailed Object Detection with Balanced Group Softmax and Metric Learning
