The field of driver monitoring and automated driving systems is evolving rapidly, with a focus on developing more accurate and reliable systems for driving safety and user experience. Recent research highlights the importance of multimodal sensing and human factors in achieving this goal: integrating modalities such as RGB, near-infrared, and mmWave radar data yields a more comprehensive picture of driver behavior and physiology, while human subjective evaluations and eye-tracking data are increasingly recognized as essential for designing effective and safe automated driving systems.

Noteworthy papers include PhysDrive, which presents a large-scale multimodal dataset for contactless in-vehicle physiological sensing; RISEE, which introduces a dataset of human subjective evaluations and eye-tracking data for naturalistic driving trajectories; and Multidimensional Assessment of Takeover Performance in Conditionally Automated Driving, which evaluates drivers' takeover performance across three dimensions and highlights the distinct yet complementary roles of Situational Awareness and Spare Capacity in shaping performance components.
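To make the multimodal-integration idea concrete, the following is a minimal sketch of late fusion for driver-state estimation. It is illustrative only and not taken from any of the cited works: the modality names, scores, and weights are assumptions, and real systems would fuse learned features rather than scalar scores.

```python
import numpy as np

def late_fusion(scores, weights):
    """Combine per-modality driver-state scores (0..1) with a weighted average.

    scores  : dict mapping modality name -> score for one time window
    weights : dict mapping modality name -> non-negative fusion weight
    Modalities absent from `scores` (e.g., a dropped radar frame) are
    skipped, and the remaining weights are renormalized.
    """
    keys = [k for k in scores if k in weights]
    w = np.array([weights[k] for k in keys], dtype=float)
    s = np.array([scores[k] for k in keys], dtype=float)
    return float(np.dot(w, s) / w.sum())

# Hypothetical per-modality drowsiness scores for one time window
scores = {"rgb": 0.8, "nir": 0.6, "mmwave": 0.4}
weights = {"rgb": 0.5, "nir": 0.3, "mmwave": 0.2}
fused = late_fusion(scores, weights)  # 0.5*0.8 + 0.3*0.6 + 0.2*0.4 = 0.66
```

Late fusion is chosen here only because it degrades gracefully when a modality drops out, which matters for in-vehicle sensing where occlusion and lighting changes are common.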