Robotic systems research is moving toward human-like perception and interaction, with a focus on multimodal sensing and processing. This includes integrating tactile, proprioceptive, and thermal signals to enable comprehensive perception of, and effective interaction with, the environment. Recent work has also emphasized neuromorphic approaches, which mimic the biological mechanisms of the human somatosensory system to achieve efficient and robust perception. In addition, there is growing interest in multimodal classification architectures that combine visual and tactile sensory streams to improve surface understanding and material classification. Noteworthy papers in this area include:
- A paper on a human-inspired soft anthropomorphic hand system that achieves 97.14% accuracy in object recognition across varying poses.
- A paper on Surformer v2, a multimodal classifier that integrates visual and tactile sensory streams through a late decision-level fusion mechanism, demonstrating competitive performance on the Touch and Go dataset.
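To make the late decision-level fusion idea concrete, the sketch below shows a generic version of the pattern: each modality is classified independently, and only the resulting class probabilities are combined. This is a minimal illustration, not Surformer v2's actual architecture; the function names, the weighted-average combination rule, and the toy logits are all assumptions for demonstration.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def late_fusion_predict(visual_logits, tactile_logits, w_visual=0.5):
    """Decision-level (late) fusion of two unimodal classifiers.

    Each modality's logits are converted to class probabilities
    independently; only these final outputs are combined, here by a
    weighted average (a common, simple fusion rule -- hypothetical,
    not taken from the paper).
    """
    p_visual = softmax(visual_logits)
    p_tactile = softmax(tactile_logits)
    fused = w_visual * p_visual + (1.0 - w_visual) * p_tactile
    return fused.argmax(axis=-1), fused

# Toy example: visual classifier favors class 0, tactile favors class 1.
visual = np.array([[2.0, 0.5, 0.1]])
tactile = np.array([[0.2, 1.5, 0.1]])
preds, probs = late_fusion_predict(visual, tactile)
```

Because fusion happens only at the decision level, each branch can use an architecture suited to its modality (e.g. a vision transformer and a tactile encoder) and be trained or swapped independently; the trade-off is that the fusion step cannot exploit low-level cross-modal correlations the way early or feature-level fusion can.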