Advances in Emotion Recognition and Multimodal Interaction

The field of emotion recognition and multimodal interaction is moving toward more inclusive and accessible solutions, with a particular focus on intelligent educational tools and assistive technologies for people with disabilities. Recent studies explore sign language recognition, affect mining techniques, and multimodal datasets to improve emotion understanding and recognition. Noteworthy papers include a Palestinian sign language recognition system that achieved 97.59% accuracy on mathematical signs, and EmoSign, a multimodal dataset for understanding emotions in American Sign Language. EmotionTalk, an interactive Chinese multimodal emotion dataset with rich annotations, is another significant contribution. Further work on head motion patterns as generalisable depression biomarkers, and on supervised contrastive learning approaches such as EmotionRankCLAP, which bridges natural-language speaking styles and ordinal speech emotion via Rank-N-Contrast, illustrates the breadth of current research in this area.
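Because the summary mentions supervised contrastive learning for ordinal speech emotion (as in EmotionRankCLAP's Rank-N-Contrast objective), the snippet below is a minimal, illustrative PyTorch sketch of a Rank-N-Contrast-style loss over continuous emotion labels. It is not the EmotionRankCLAP implementation; the function name, tensor shapes, and temperature value are assumptions made here for illustration only.

```python
import torch
import torch.nn.functional as F

def rank_n_contrast_loss(embeddings: torch.Tensor,
                         labels: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
    """Illustrative Rank-N-Contrast-style loss for continuous/ordinal labels.

    For each anchor i and candidate positive j, the denominator sums over
    samples k whose label distance to i is at least as large as j's, so the
    embedding space is encouraged to rank samples consistently with labels.
    """
    n = embeddings.size(0)
    z = F.normalize(embeddings, dim=1)                    # unit-norm embeddings
    sim = (z @ z.t()) / temperature                       # pairwise similarities
    label_dist = torch.cdist(labels.view(n, 1).float(),
                             labels.view(n, 1).float())   # |y_i - y_k|
    eye = torch.eye(n, dtype=torch.bool, device=z.device)

    loss, count = 0.0, 0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # candidates at least as far from i in label space as j is
            neg_mask = (label_dist[i] >= label_dist[i, j]) & ~eye[i]
            denom = torch.logsumexp(sim[i][neg_mask], dim=0)
            loss = loss + (denom - sim[i, j])              # -log softmax term
            count += 1
    return loss / max(count, 1)

# Toy usage (hypothetical shapes): 8 utterance embeddings with valence labels.
emb = torch.randn(8, 128)
val = torch.rand(8)
print(rank_n_contrast_loss(emb, val).item())
```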

Sources

Enhancing Mathematics Learning for Hard-of-Hearing Students Through Real-Time Palestinian Sign Language Recognition: A New Dataset

EmoSign: A Multimodal Dataset for Understanding Emotions in American Sign Language

Investigating Affect Mining Techniques for Annotation Sample Selection in the Creation of Finnish Affective Speech Corpus

Developing a Top-tier Framework in Naturalistic Conditions Challenge for Categorized Emotion Prediction: From Speech Foundation Models and Learning Objective to Data Augmentation and Engineering Choices

Predicting Human Depression with Hybrid Data Acquisition utilizing Physical Activity Sensing and Social Media Feeds

EmotionTalk: An Interactive Chinese Multimodal Emotion Dataset With Rich Annotations

On the Validity of Head Motion Patterns as Generalisable Depression Biomarkers

EmotionRankCLAP: Bridging Natural Language Speaking Styles and Ordinal Speech Emotion via Rank-N-Contrast
