The field of dexterous manipulation is moving towards increased autonomy and adaptability in robotic hands. This is being achieved through the development of new sensors, such as tactile skin, and improved methods for fusing multi-sensory data. Self-supervised learning is emerging as a key technique for training encoders on raw sensor data, enabling robots to learn from their interactions with the environment. Noteworthy papers include:
- A paper introducing Sparsh-skin, a pre-trained encoder for magnetic skin sensors that achieves state-of-the-art results on downstream tasks.
- A paper proposing a force-guided attention fusion module that adaptively adjusts the weights of visual and tactile features, achieving an average success rate of 93% across three fine-grained tasks.
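The force-guided fusion idea above can be illustrated with a minimal sketch: a force reading is mapped to a pair of attention weights that softly gate visual and tactile feature vectors before they are combined. All names, dimensions, and the single linear gating layer here are hypothetical simplifications, not the paper's actual architecture; a trained module would learn the gating parameters end to end.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

rng = np.random.default_rng(0)

d_feat = 8    # shared feature dimension (hypothetical)
d_force = 3   # force/torque reading dimension (hypothetical)

# Toy gating parameters; in a real module these would be learned.
W_gate = rng.normal(size=(2, d_force))

def force_guided_fusion(vis_feat, tac_feat, force):
    """Fuse visual and tactile features using attention weights
    derived from the current force signal."""
    logits = W_gate @ force       # (2,) one logit per modality
    w = softmax(logits)           # non-negative weights summing to 1
    fused = w[0] * vis_feat + w[1] * tac_feat
    return fused, w

vis = rng.normal(size=d_feat)
tac = rng.normal(size=d_feat)
force = np.array([0.1, 0.0, 2.5])  # e.g. strong normal contact force

fused, w = force_guided_fusion(vis, tac, force)
```

The key design point is that the modality weights are recomputed at every step from the force signal, so the module can lean on tactile features during contact-rich phases and on vision otherwise.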