Advances in Probabilistic Regression and Uncertainty Quantification

The field of probabilistic regression and uncertainty quantification is advancing rapidly, with a focus on developing more accurate and reliable models. Recent work highlights the role of transfer learning and normalizing flows in improving predictive performance, alongside a growing recognition that calibration and evaluation metrics need improvement and that predictive models should be aligned more closely with the downstream tasks they serve. Noteworthy papers in this area include:

Probabilistic Pretraining for Neural Regression, which introduces a new model for transfer learning in probabilistic regression.

TabResFlow, which proposes a normalizing spline flow model for univariate tabular regression and demonstrates improvements in both likelihood scores and inference time.

A Novel Framework for Uncertainty Quantification via Proper Scores, which presents a general framework for uncertainty quantification based on proper scores and applies it to a range of tasks, including classification and generative modeling.
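To make the proper-scores idea concrete, the sketch below evaluates Gaussian predictive distributions with the logarithmic score (negative log-likelihood), a strictly proper score: it is minimized in expectation only when the predicted distribution matches the true data-generating distribution. This is a generic illustration, not the specific framework from the paper above; the predictions and targets are made-up example values.

```python
import math

# Hypothetical Gaussian predictive distributions (mean, std) and observed targets.
predictions = [(0.0, 1.0), (1.5, 0.5), (-0.3, 2.0)]
targets = [0.2, 1.4, -1.0]

def gaussian_log_score(mu, sigma, y):
    """Negative log-likelihood of y under N(mu, sigma^2) -- a strictly proper score."""
    return 0.5 * math.log(2 * math.pi * sigma**2) + (y - mu) ** 2 / (2 * sigma**2)

# Lower mean score indicates better-calibrated, sharper predictive distributions.
mean_score = sum(gaussian_log_score(mu, s, y)
                 for (mu, s), y in zip(predictions, targets)) / len(targets)
print(f"mean log score: {mean_score:.3f}")  # → mean log score: 0.953
```

Because the score is proper, a model cannot improve it by hedging (inflating variance) or by overconfidence (shrinking variance); both miscalibrations raise the expected score, which is what makes such scores suitable for both training and evaluation.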

Sources

Probabilistic Pretraining for Neural Regression

TabResFlow: A Normalizing Spline Flow Model for Probabilistic Univariate Tabular Regression

Evaluating the Quality of the Quantified Uncertainty for (Re)Calibration of Data-Driven Regression Models

A Novel Framework for Uncertainty Quantification via Proper Scores for Classification and Beyond

Aligning the Evaluation of Probabilistic Predictions with Downstream Value

Does Calibration Affect Human Actions?

ALSA: Anchors in Logit Space for Out-of-Distribution Accuracy Estimation
