The field of human sensing and activity recognition is advancing rapidly through new methods and models. One key trend is the growing focus on efficient and effective transfer learning, which adapts pre-trained models to new tasks and modalities with limited data. There is also rising interest in multimodal learning approaches that integrate data from multiple sources and sensors to improve the accuracy and robustness of human activity recognition systems. Noteworthy papers include XTransfer, which proposes a resource-efficient, modality-agnostic model transfer method that achieves state-of-the-art performance on human sensing tasks while reducing adaptation cost, and Smooth-Distill, which introduces a self-distillation framework for multitask learning with wearable sensor data, demonstrating improved performance and stability on human activity recognition and sensor placement detection tasks.
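To make the self-distillation idea concrete, the sketch below shows a generic EMA-teacher distillation loss of the kind such frameworks build on. This is an illustration of the general technique, not Smooth-Distill's actual formulation; all function names and hyperparameter values here are assumptions.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def ema_update(teacher_w, student_w, momentum=0.99):
    # The teacher tracks an exponential moving average of the student's
    # weights, giving a smoother, more stable target network.
    return momentum * teacher_w + (1 - momentum) * student_w

def distill_loss(student_logits, teacher_logits, labels, alpha=0.5, tau=2.0):
    # Hard cross-entropy against the ground-truth activity labels.
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(len(labels)), labels]).mean()
    # Soft KL divergence pulling the student toward the
    # temperature-smoothed teacher distribution.
    pt = softmax(teacher_logits / tau)
    ps = softmax(student_logits / tau)
    kl = (pt * (np.log(pt) - np.log(ps))).sum(axis=-1).mean()
    # Weighted combination; tau**2 rescales the soft-target gradient.
    return (1 - alpha) * ce + alpha * (tau ** 2) * kl
```

In a multitask setup such as joint activity recognition and sensor placement detection, a loss of this form would typically be applied per task head, with the EMA teacher shared across tasks.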