Emotion-Aware AI Systems

AI research is moving toward more empathic and emotionally intelligent systems. This trend is evident in new datasets and models that recognize and respond to complex emotional cues, such as verbal-visual incongruence, and in work on multimodal sentiment perception and fusion aimed at more immersive, empathetic interactions. There is also growing interest in personalized emotion recognition systems that adapt to individual users. Noteworthy papers in this area include E-THER, a PCT-grounded dataset for benchmarking empathic AI systems; Livia, an emotion-aware AR companion app that provides personalized emotional support; and AIVA and Talking Spell, notable for their contributions to emotion-aware interaction and anthropomorphic voice interaction, respectively.
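To make the ideas of multimodal sentiment fusion and verbal-visual incongruence concrete, here is a minimal sketch. The modality names, weights, and score values are hypothetical placeholders; a real system would obtain per-modality sentiment scores from trained text, audio, and vision encoders rather than hard-coded numbers.

```python
# Illustrative late-fusion of per-modality sentiment scores in [-1, 1].
# All names and values here are assumptions for the sketch, not an API
# from any of the cited papers.

def fuse_sentiment(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-modality sentiment scores."""
    total_weight = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_weight

def detect_incongruence(scores: dict[str, float], threshold: float = 1.0) -> bool:
    """Flag verbal-visual incongruence when text and vision disagree strongly,
    e.g. positive words ("I'm fine") paired with a negative facial expression."""
    return abs(scores["text"] - scores["vision"]) >= threshold

# Hypothetical encoder outputs: mildly positive words, negative face/voice.
scores = {"text": 0.6, "audio": -0.2, "vision": -0.7}
weights = {"text": 0.4, "audio": 0.3, "vision": 0.3}

fused = fuse_sentiment(scores, weights)       # near-neutral overall score
incongruent = detect_incongruence(scores)     # True: words and face conflict
```

A late-fusion weighted average is only one design choice; attention-based or tensor-fusion approaches can capture cross-modal interactions that a simple average misses, which is why incongruence is checked separately here.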

Sources

E-THER: A PCT-Grounded Dataset for Benchmarking Empathic AI

Talking Spell: A Wearable System Enabling Real-Time Anthropomorphic Voice Interaction with Everyday Objects

AIVA: An AI-based Virtual Companion for Emotion-aware Interaction

Discrete Prompt Tuning via Recursive Utilization of Black-box Multimodal Large Language Model for Personalized Visual Emotion Recognition

Livia: An Emotion-Aware AR Companion Powered by Modular AI Agents and Progressive Memory Compression
