Advances in Imbalanced Classification

The field of imbalanced classification is moving toward adaptive, dynamic approaches that adjust to changes in class-wise learning difficulty, letting models focus effort on the classes they currently handle worst. Techniques under active development include adaptive resampling, group-aware threshold calibration, and quantum-inspired oversampling, and they have shown promising results across a range of imbalanced benchmarks and datasets. Noteworthy papers in this area include:

ART proposes an adaptive resampling-based training method that updates the training data distribution based on class-wise performance (a minimal sketch of this idea follows the list).

Extrapolated Markov Chain Oversampling Method introduces a Markov chain-based text oversampling method that expands the minority feature space.

Beyond Synthetic Augmentation demonstrates the effectiveness of group-aware threshold calibration in achieving robust balanced accuracy.

QI-SMOTE leverages quantum principles to generate synthetic instances that preserve complex data structures.

AxelSMOTE implements an agent-based oversampling approach in which data instances act as autonomous agents engaging in complex interactions.
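
The recall-driven resampling idea behind ART can be illustrated with a small sketch. The toy dataset, the number of rounds, and the weight-update rule below are illustrative assumptions, not the paper's algorithm; the sketch only shows the general loop of measuring per-class performance and shifting sampling probability toward underperforming classes.

```python
# Illustrative sketch of class-wise adaptive resampling (not the ART algorithm itself).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy imbalanced dataset: three classes at roughly 85% / 10% / 5%.
X, y = make_classification(n_samples=5000, n_classes=3, n_informative=6,
                           weights=[0.85, 0.10, 0.05], random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, stratify=y, random_state=0)

classes = np.unique(y_tr)
class_weights = np.ones(len(classes))  # start with uniform sampling pressure

for round_ in range(5):
    # Resample the training set with probability proportional to each class's weight.
    p = class_weights[y_tr]
    p /= p.sum()
    idx = rng.choice(len(y_tr), size=len(y_tr), replace=True, p=p)

    clf = LogisticRegression(max_iter=1000).fit(X_tr[idx], y_tr[idx])

    # Per-class recall on held-out data identifies the underperforming classes.
    recalls = recall_score(y_val, clf.predict(X_val), average=None, labels=classes)

    # Shift sampling probability toward classes with low recall for the next round.
    class_weights = 1.0 - recalls + 1e-3
    print(f"round {round_}: per-class recall = {np.round(recalls, 3)}")
```

The key design choice is the feedback loop: each round's per-class validation recall determines how strongly each class is oversampled in the next round, so sampling pressure follows current learning difficulty rather than fixed class frequencies.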

Sources

ART: Adaptive Resampling-based Training for Imbalanced Classification

Extrapolated Markov Chain Oversampling Method for Imbalanced Text Classification

Beyond Synthetic Augmentation: Group-Aware Threshold Calibration for Robust Balanced Accuracy in Imbalanced Learning

Enhancing Machine Learning for Imbalanced Medical Data: A Quantum-Inspired Approach to Synthetic Oversampling (QI-SMOTE)

AxelSMOTE: An Agent-Based Oversampling Algorithm for Imbalanced Classification
