Multimodal Approaches in Affective Computing and Human-Computer Interaction

The field of affective computing and human-computer interaction is moving toward more generalizable and robust models that can recognize human emotions, estimate cognitive load, and interpret related physiological signals. Recent research has focused on multimodal approaches, combining modalities such as EEG, eye movement, facial expression, and peripheral physiological signals to improve the accuracy and reliability of emotion recognition and cognitive load detection systems. These approaches have shown promising results, with several studies reporting high accuracy in both tasks. Notably, deep learning models and multimodal fusion techniques have been particularly effective in improving the performance of these systems.
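To make the fusion idea concrete, here is a minimal sketch of feature-level (early) fusion: features extracted from each modality are normalized separately and concatenated into one vector before classification. The function name, feature dimensions, and the use of pre-extracted EEG band-power and eye-movement features are illustrative assumptions, not taken from any of the papers below.

```python
import numpy as np

def fuse_features(eeg_feats: np.ndarray, eye_feats: np.ndarray) -> np.ndarray:
    """Feature-level fusion: z-score each modality separately, then concatenate.

    Per-modality normalization keeps one modality's scale from dominating
    the fused representation fed to a downstream classifier.
    """
    def zscore(x):
        return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)
    return np.concatenate([zscore(eeg_feats), zscore(eye_feats)], axis=1)

# Toy example: 4 trials, 8 EEG band-power features, 3 eye-movement features
rng = np.random.default_rng(0)
eeg = rng.normal(size=(4, 8))
eye = rng.normal(size=(4, 3))
fused = fuse_features(eeg, eye)
print(fused.shape)  # (4, 11)
```

In practice, deep multimodal models often learn such fused representations end-to-end (e.g., via cross-modal attention) rather than concatenating hand-crafted features, but the principle of aligning and combining per-modality representations is the same.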

Some noteworthy papers in this area include: REVELIO, which introduces a new multimodal dataset for task load detection and evaluates state-of-the-art models across multiple modalities and application domains; MuMTAffect, which presents a multimodal multitask affective framework for personality and emotion recognition from physiological signals; and ADHDeepNet, which proposes a deep learning model that improves the precision and timeliness of ADHD diagnosis from raw EEG signals.

Sources

REVELIO -- Universal Multimodal Task Load Estimation for Cross-Domain Generalization

MuMTAffect: A Multimodal Multitask Affective Framework for Personality and Emotion Recognition from Physiological Signals

Facial Emotion Recognition does not detect feeling unsafe in automated driving

Combine Virtual Reality and Machine-Learning to Identify the Presence of Dyslexia: A Cross-Linguistic Approach

An Emotion Recognition Framework via Cross-modal Alignment of EEG and Eye Movement Data

Detecting Blinks in Healthy and Parkinson's EEG: A Deep Learning Perspective

MVRS: The Multimodal Virtual Reality Stimuli-based Emotion Recognition Dataset

Systematic Evaluation of Multi-modal Approaches to Complex Player Profile Classification

BEAM: Brainwave Empathy Assessment Model for Early Childhood

ADHDeepNet From Raw EEG to Diagnosis: Improving ADHD Diagnosis through Temporal-Spatial Processing, Adaptive Attention Mechanisms, and Explainability in Raw EEG Signals
