Advancements in Robotic Perception and Sensing

Research in robotic perception and sensing is converging on more accurate and efficient ways to represent and process sensor data. One line of work transforms tactile sensor data into structured formats so that high-level computational methods from other domains, such as computer vision, can be applied directly. In parallel, new sensing hardware is emerging, including wireless sensing of temperature, strain, and crack growth in 3D-printed metal structures via magnetoelastic inclusions, and multi-material 3D-printed tactile sensors with hierarchical infill structures for pressure sensing. These advances stand to improve the accuracy and reliability of robotic systems in applications such as object recognition, legged locomotion, and structural maintenance. Noteworthy papers include:

  • Simultaneous Calibration of Noise Covariance and Kinematics for State Estimation of Legged Robots via Bi-level Optimization, which introduces a bi-level optimization framework that jointly calibrates the noise covariance matrices and the kinematic parameters used by a legged robot's state estimator (a toy illustration of the bi-level structure follows this list).
  • Proprioceptive Image: An Image Representation of Proprioceptive Data from Quadruped Robots for Contact Estimation Learning, which represents proprioceptive time-series data as structured two-dimensional images so that image-based learning methods can be applied to contact estimation (see the second sketch below).
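
The bi-level idea in the calibration paper can be illustrated with a toy example: an inner state estimator (here a plain Kalman filter on a synthetic 1-D constant-velocity system, standing in for the paper's legged-robot estimator) runs with candidate noise covariances and a kinematic correction parameter, while an outer optimizer adjusts those parameters to minimize estimation error against ground truth. The model, parameterization, and loss below are illustrative assumptions, not the paper's formulation.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    dt, T = 0.01, 400
    true_scale = 1.02                                       # hypothetical kinematic error to recover

    vel = 1.0 + np.cumsum(rng.normal(0.0, 0.05, T))         # synthetic ground-truth velocity
    pos = np.cumsum(vel) * dt                               # synthetic ground-truth position
    meas = pos / true_scale + rng.normal(0.0, 0.01, T)      # leg-odometry-like measurements

    def inner_estimate(params):
        """Inner level: Kalman filter run with candidate noise and kinematic parameters."""
        log_q, log_r, scale = params
        Q = np.diag([1e-8, np.exp(log_q)])                  # process noise covariance
        R = np.exp(log_r)                                   # measurement noise variance
        F = np.array([[1.0, dt], [0.0, 1.0]])               # constant-velocity model
        H = np.array([[1.0, 0.0]])
        x, P, est = np.zeros(2), np.eye(2), np.empty(T)
        for k in range(T):
            x, P = F @ x, F @ P @ F.T + Q                   # predict
            z = scale * meas[k]                             # kinematics-corrected measurement
            S = float(H @ P @ H.T) + R
            K = (P @ H.T) / S                               # Kalman gain, shape (2, 1)
            x = x + K[:, 0] * (z - x[0])                    # update
            P = (np.eye(2) - K @ H) @ P
            est[k] = x[0]
        return est

    def outer_loss(params):
        """Outer level: mean squared estimation error against ground truth."""
        return float(np.mean((inner_estimate(params) - pos) ** 2))

    res = minimize(outer_loss, x0=[np.log(1e-3), np.log(1e-2), 1.0], method="Nelder-Mead")
    print("calibrated kinematic scale:", res.x[2])          # should approach true_scale

The key design point is that the covariance and kinematic parameters are never tuned in isolation: the outer loss is evaluated only through the inner estimator that actually uses them.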
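
The proprioceptive-image idea, reduced to its core, is to arrange a sliding window of proprioceptive channels as a 2-D array and normalize it so that standard image-based networks can consume it. The channel layout, window length, and per-channel min-max normalization below are assumptions made for illustration, not the paper's exact recipe.

    import numpy as np

    def proprioceptive_image(window):
        """Map a (channels, timesteps) window of proprioceptive signals
        (e.g. joint positions, velocities, torques, IMU) to an 8-bit image.
        Each channel (row) is min-max normalized independently."""
        lo = window.min(axis=1, keepdims=True)
        hi = window.max(axis=1, keepdims=True)
        span = np.where(hi - lo > 1e-9, hi - lo, 1.0)       # guard against flat channels
        img = (window - lo) / span
        return (img * 255.0).astype(np.uint8)

    # Example: 24 proprioceptive channels over a 64-step window.
    window = np.random.randn(24, 64)
    image = proprioceptive_image(window)                    # shape (24, 64), dtype uint8
    # `image` can now be fed to an ordinary 2-D CNN for contact estimation.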

Sources

Representing Data in Robotic Tactile Perception -- A Review

Simultaneous Calibration of Noise Covariance and Kinematics for State Estimation of Legged Robots via Bi-level Optimization

Wireless Sensing of Temperature, Strain and Crack Growth in 3D-Printed Metal Structures via Magnetoelastic and Thermomagnetic Inclusions

M3D-skin: Multi-material 3D-printed Tactile Sensor with Hierarchical Infill Structures for Pressure Sensing

Two-stream network-driven vision-based tactile sensor for object feature extraction and fusion perception

Proprioceptive Image: An Image Representation of Proprioceptive Data from Quadruped Robots for Contact Estimation Learning
