Advancements in Robotic Localization and Navigation

The field of robotic localization and navigation is advancing rapidly, driven by new methods and techniques. One notable direction is the fusion of multiple sensors, such as LiDAR, radar, and cameras, to improve the accuracy and robustness of pose estimation and mapping. Researchers are also exploring adaptive fusion approaches that combine the strengths of different sensors and down-weight those degraded by challenging environmental conditions. There is also growing interest in perception-aware trajectory planning frameworks that steer robots away from areas where localization degrades. Noteworthy papers include:

  • SaWa-ML, which proposes a visual-inertial-range multi-robot localization method that combines geometric structure-aware pose correction with weight adaptation for robustness.
  • All-UWB SLAM Using UWB Radar and UWB AOA, which improves the accuracy and scalability of SLAM in feature-deficient environments by incorporating UWB Angle of Arrival (AOA) measurements.
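The adaptive-fusion idea above can be illustrated with a minimal sketch: inverse-variance weighting of independent sensor estimates, so that a sensor whose measurements are currently noisy (e.g. a camera in low light) contributes less to the fused pose. The function name and numbers below are illustrative assumptions, not taken from any of the listed papers:

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of independent 1-D estimates.

    Sensors with lower variance receive higher weight; degraded
    sensors are automatically down-weighted. Returns the fused
    estimate and its (reduced) variance.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances          # precision of each sensor
    weights /= weights.sum()           # normalize to sum to 1
    fused = weights @ estimates        # weighted mean
    fused_var = 1.0 / (1.0 / variances).sum()
    return fused, fused_var

# Hypothetical x-position estimates from LiDAR, radar, and camera,
# with per-sensor variances (camera currently least reliable here).
fused, fused_var = fuse_estimates([2.0, 2.4, 1.8], [0.01, 0.25, 0.09])
# fused is pulled toward the low-variance LiDAR estimate (~1.99),
# and fused_var is smaller than any single sensor's variance.
```

This is the simplest static form; the adaptive-fusion methods in the papers above additionally estimate sensor reliability online from the environment rather than assuming fixed variances.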

Sources

SaWa-ML: Structure-Aware Pose Correction and Weight Adaptation-Based Robust Multi-Robot Localization

All-UWB SLAM Using UWB Radar and UWB AOA

Robots for Kiwifruit Harvesting and Pollination

Dense-depth map guided deep Lidar-Visual Odometry with Sparse Point Clouds and Images

A Comprehensive Evaluation of LiDAR Odometry Techniques

GFM-Planner: Perception-Aware Trajectory Planning with Geometric Feature Metric

When and Where Localization Fails: An Analysis of the Iterative Closest Point in Evolving Environment

Online Submission and Evaluation System Design for Competition Operations

Modular Robot and Landmark Localisation Using Relative Bearing Measurements

AF-RLIO: Adaptive Fusion of Radar-LiDAR-Inertial Information for Robust Odometry in Challenging Environments
