Advances in Human-Robot Interaction and Accessibility

The field of human-robot interaction is moving toward more personalized and inclusive experiences. Researchers are exploring social and telepresence robots to improve accessibility in settings such as museums, and are developing emotionally intelligent robots that can understand and respond to human emotions. There is also growing interest in empowering children to create their own AI-enabled augmented reality experiences. Noteworthy papers include E-React, which introduces the novel task of generating diverse reaction motions in response to different emotional cues; Capybara, an AR-based, AI-powered visual programming environment that enables children to create and customize 3D characters; and AZRA, an augmented reality framework that extends the affective capabilities of zoomorphic robots.

Sources

Social and Telepresence Robots for Accessibility and Inclusion in Small Museums

E-React: Towards Emotionally Controlled Synthesis of Human Reactions

Empowering Children to Create AI-Enabled Augmented Reality Experiences

AZRA: Extending the Affective Capabilities of Zoomorphic Robots using Augmented Reality

Generation of Real-time Robotic Emotional Expressions Learning from Human Demonstration in Mixed Reality
