Advancements in Haptic Interfaces and Robotic Manipulation

The field of haptic interfaces and robotic manipulation is moving towards increased personalization and adaptability. Researchers are focusing on developing modular and reconfigurable systems that can adapt to individual users and tasks, enabling more effective human-machine interaction.

Noteworthy papers include:

A Modular Haptic Display with Reconfigurable Signals for Personalized Information Transfer, which presents a customizable soft haptic system for personalized feedback.

TensorTouch, which integrates finite element analysis with deep learning to extract comprehensive contact information from optical tactile sensors, enabling advanced dexterous manipulation.

In-Hand Object Pose Estimation via Visual-Tactile Fusion, which combines visual and tactile information to accurately determine the position and orientation of objects grasped by a robotic hand.

eFlesh, which introduces a highly customizable magnetic tactile sensor for robotic manipulation, making versatile tactile sensing more accessible.

Vib2Move, which uses fingertip micro-vibrations and gravity to precisely reposition planar objects, demonstrating reliable, high-precision manipulation.

Sources

A Modular Haptic Display with Reconfigurable Signals for Personalized Information Transfer

TensorTouch: Calibration of Tactile Sensors for High Resolution Stress Tensor and Deformation for Dexterous Manipulation

EMG-Driven Stiffness-Modulating Palpation for Telerehabilitation

Investigating the Perception of Translational Shape-Changing Haptic Interfaces

eFlesh: Highly customizable Magnetic Touch Sensing using Cut-Cell Microstructures

In-Hand Object Pose Estimation via Visual-Tactile Fusion

Occlusion-Aware 3D Hand-Object Pose Estimation with Masked AutoEncoders

Vib2Move: In-Hand Object Reconfiguration via Fingertip Micro-Vibrations
