AI research is moving toward more empathetic and emotionally intelligent systems. This trend is evident in the creation of datasets and models that recognize and respond to complex emotional cues, such as verbal-visual incongruence, where a speaker's words and facial expressions convey conflicting sentiment. Researchers are also exploring multimodal sentiment perception and fusion to create more immersive and empathetic interactions, and there is growing interest in personalized emotion recognition systems that adapt to individual users. Noteworthy papers in this area include E-THER, which presents a novel dataset for benchmarking empathic AI systems, and Livia, which introduces an emotion-aware AR companion app providing personalized emotional support. AIVA and Talking Spell are also notable for their contributions to emotion-aware interaction and anthropomorphic voice interaction.
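To make the multimodal-fusion idea concrete, here is a minimal illustrative sketch (not drawn from any of the cited papers): per-modality sentiment scores are combined by late fusion, and a verbal-visual incongruence flag is raised when the text and visual modalities disagree strongly. The function name, score range, weight, and threshold are all hypothetical choices for illustration.

```python
def fuse_sentiment(text_score: float, visual_score: float,
                   text_weight: float = 0.5) -> dict:
    """Late-fuse two per-modality sentiment scores in [-1, 1].

    Returns the weighted-average fused score plus an incongruence flag
    (hypothetical rule: opposite signs and a large gap between modalities).
    """
    fused = text_weight * text_score + (1 - text_weight) * visual_score
    incongruent = (text_score * visual_score < 0
                   and abs(text_score - visual_score) > 1.0)
    return {"fused": fused, "incongruent": incongruent}

# e.g. positive words spoken with a clearly negative facial expression
print(fuse_sentiment(0.8, -0.6))
```

In a real system the scores would come from learned text and vision encoders, and fusion would typically happen over feature vectors rather than scalar scores; the scalar version above only shows the basic late-fusion and incongruence-detection logic.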