The field of probabilistic regression and uncertainty quantification is advancing rapidly, with a focus on building more accurate and reliable models. Recent work highlights the value of transfer learning and normalizing flows for improving predictive performance, alongside a growing recognition that better calibration and evaluation metrics are needed, as well as more effective methods for aligning predictive models with downstream tasks. Noteworthy papers in this area include:

- Probabilistic Pretraining for Neural Regression, which introduces a new model for transfer learning in probabilistic regression.
- TabResFlow, which proposes a normalizing spline flow model for univariate tabular regression and demonstrates improved likelihood scores and inference time.
- A Novel Framework for Uncertainty Quantification via Proper Scores, which presents a general framework for uncertainty quantification based on proper scores and applies it to tasks including classification and generative modeling.
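To make the notion of a proper score concrete: a scoring rule is (strictly) proper when it is minimized in expectation only by the true predictive distribution, which is what makes such scores trustworthy evaluation metrics for probabilistic regression. The sketch below, which is an illustration of standard proper scores rather than the specific framework of any paper above, evaluates a Gaussian predictive distribution with two common proper scores: the negative log-likelihood and the closed-form CRPS for a Gaussian. The function names are hypothetical.

```python
import math

def gaussian_nll(mu: float, sigma: float, y: float) -> float:
    """Negative log-likelihood of observation y under N(mu, sigma^2).

    The log score is a strictly proper scoring rule: its expectation is
    minimized only when the predicted distribution matches the true one.
    """
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (y - mu) ** 2 / (2 * sigma ** 2)

def gaussian_crps(mu: float, sigma: float, y: float) -> float:
    """Continuous Ranked Probability Score for a Gaussian forecast.

    Uses the known closed form: CRPS = sigma * (z*(2*Phi(z)-1) + 2*phi(z) - 1/sqrt(pi)),
    where z = (y - mu) / sigma. CRPS is also a proper score and reduces to
    the absolute error for a point (zero-variance) forecast.
    """
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # phi(z)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))            # Phi(z)
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

# Scoring a single probabilistic prediction N(0, 1) against an observation:
print(round(gaussian_nll(0.0, 1.0, 0.5), 4))
print(round(gaussian_crps(0.0, 1.0, 0.5), 4))
```

Averaging either score over a held-out set gives a single evaluation number; because both scores are proper, a model cannot improve them by hedging with a miscalibrated distribution.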