The field of haptic interfaces and robotic manipulation is moving toward greater personalization and adaptability. Researchers are developing modular, reconfigurable systems that adapt to individual users and tasks, enabling more effective human-machine interaction.
Noteworthy papers include:

- A Modular Haptic Display with Reconfigurable Signals for Personalized Information Transfer: presents a customizable soft haptic system for personalized feedback.
- TensorTouch: integrates finite element analysis with deep learning to extract comprehensive contact information from optical tactile sensors, enabling advanced dexterous manipulation.
- In-Hand Object Pose Estimation via Visual-Tactile Fusion: combines visual and tactile information to accurately estimate the position and orientation of objects grasped by a robotic hand.
- eFlesh: introduces a highly customizable magnetic tactile sensor for robotic manipulation, making versatile tactile sensing more accessible.
- Vib2Move: uses fingertip micro-vibrations and gravity to reposition planar objects, demonstrating reliable, high-precision manipulation.