The field of machine learning is increasingly focused on the challenges of out-of-distribution (OOD) detection and domain adaptation. Recent research has highlighted the limitations of traditional approaches, such as the assumption that training and test data follow identical distributions, and the need for more robust and flexible methods. Domain feature collapse has been identified as a key issue, whereby models trained on single-domain datasets fail to detect out-of-distribution samples. To address these problems, researchers are exploring new approaches such as Temp-SCONE, which introduces a confidence-driven regularization loss to handle temporal shifts in dynamic environments.

Noteworthy papers include:

- Limitations of Using Identical Distributions for Training and Testing When Learning Boolean Functions, which shows that matching the training and test distributions is not always optimal.
- Open-Set Domain Adaptation Under Background Distribution Shift, which provides a provably efficient solution for open-set recognition under background distribution shift.
- Domain Feature Collapse, which gives a theoretical explanation for the failure of OOD detection methods and proposes a solution based on domain filtering (see the second sketch below).
- Temp-SCONE, which proposes a novel framework for out-of-distribution detection and domain generalization on wild data with temporal shift (see the first sketch below).
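The summary above does not spell out the Temp-SCONE objective, so the following is only a minimal sketch of what a confidence-driven regularization loss on temporally shifted "wild" data could look like: standard cross-entropy on labeled in-distribution samples plus a hinge-style penalty that discourages over-confident predictions on unlabeled wild samples. The margin, the weight `lambda_wild`, and the `x_in`/`x_wild` split are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def confidence_regularized_loss(model, x_in, y_in, x_wild,
                                margin=0.5, lambda_wild=0.1):
    """Hypothetical confidence-driven objective (not the actual Temp-SCONE loss).

    Combines supervised cross-entropy on in-distribution data with a penalty
    on over-confident predictions for unlabeled "wild" samples collected
    under temporal shift.
    """
    # Supervised term on labeled in-distribution data.
    logits_in = model(x_in)
    ce_loss = F.cross_entropy(logits_in, y_in)

    # Confidence on wild data: maximum softmax probability per sample.
    logits_wild = model(x_wild)
    conf_wild = F.softmax(logits_wild, dim=1).max(dim=1).values

    # Hinge penalty: discourage confidence above the margin so shifted or
    # unknown samples are not absorbed into known classes with high confidence.
    reg_loss = F.relu(conf_wild - margin).mean()

    return ce_loss + lambda_wild * reg_loss
```

In practice the margin and the regularizer's weight would be tuned on held-out shifted data; the sketch only illustrates the general shape of a confidence-driven penalty.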
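Similarly, the domain filtering idea from Domain Feature Collapse is only named in the summary, not described. As a loose illustration under assumed details, the sketch below treats the dominant principal directions of the single-domain training features as domain-specific directions, projects them out, and scores test samples by their distance in the filtered feature space; none of these choices should be read as the authors' actual procedure.

```python
import numpy as np

def domain_filtered_ood_score(feats_train, feats_test, n_filter=1):
    """Illustrative 'domain filtering' for OOD scoring (not the paper's algorithm).

    Removes the top principal directions of the training features (assumed
    here to capture domain-specific information), then scores test samples
    by distance in the filtered space. Higher score = more likely OOD.
    """
    mu = feats_train.mean(axis=0)
    centered = feats_train - mu

    # Top principal directions of the single-domain training features.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    domain_dirs = vt[:n_filter]                      # shape (n_filter, D)

    def filter_out(f):
        # Remove each feature's component along the assumed domain directions.
        return f - (f @ domain_dirs.T) @ domain_dirs

    # Distance from the (filtered, centered) training distribution's origin.
    test_f = filter_out(feats_test - mu)
    return np.linalg.norm(test_f, axis=1)
```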