The field of robotic localization and navigation is advancing rapidly. One notable direction is the integration of multiple sensors, such as LiDAR, radar, and cameras, to improve the accuracy and robustness of pose estimation and mapping. Researchers are also exploring adaptive fusion approaches that combine the strengths of different sensors and mitigate the effects of environmental challenges such as smoke, rain, or poor lighting. There is also growing interest in perception-aware trajectory planning frameworks that steer robots away from perceptually degraded areas to preserve localization accuracy. Noteworthy papers include:
- SaWa-ML, which proposes a visual-inertial-range-based multi-robot localization method featuring geometric structure-aware pose correction and weight-adaptation-based robustness.
- All-UWB SLAM Using UWB Radar and UWB AOA, which improves the accuracy and scalability of SLAM in feature-deficient environments by incorporating UWB Angle of Arrival (AOA) measurements alongside UWB radar.
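The adaptive fusion idea mentioned above can be illustrated with the simplest possible scheme: inverse-variance weighting, where a sensor reporting higher uncertainty is automatically down-weighted. This is a generic sketch, not the weighting used by any of the papers listed; the function name and the example variances are our own.

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of scalar estimates.

    estimates: list of (value, variance) pairs, one per sensor.
    A sensor with a large variance (e.g. a camera in low light)
    contributes less to the fused value -- a minimal form of
    adaptive sensor fusion.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    fused_var = 1.0 / total  # fused estimate is more certain than any input
    return fused_value, fused_var

# Hypothetical range to a wall: LiDAR says 2.0 m (var 0.01),
# camera depth says 2.3 m (var 0.09); fusion leans toward the LiDAR.
value, var = fuse_estimates([(2.0, 0.01), (2.3, 0.09)])
```

Real systems fuse full poses with a filter or factor graph rather than scalars, but the down-weighting principle is the same.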
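The benefit of adding AOA to UWB ranging, as in the last item, comes from basic geometry: a range alone constrains a landmark to a circle around the antenna, while range plus bearing pins it to a single point. A minimal sketch of that projection, assuming a 2D world frame and a body-frame AOA measurement (the function name and frames are illustrative, not from the paper):

```python
import math

def landmark_from_range_aoa(robot_x, robot_y, robot_heading, rng, aoa):
    """Project a UWB range + angle-of-arrival measurement into the world frame.

    rng: measured distance to the landmark (meters).
    aoa: bearing to the landmark in the robot's body frame (radians).
    With both measurements, one anchor yields a point estimate instead
    of the circle that range-only UWB would give.
    """
    bearing = robot_heading + aoa  # rotate body-frame bearing into world frame
    return (robot_x + rng * math.cos(bearing),
            robot_y + rng * math.sin(bearing))

# Robot at the origin facing +x; landmark 5 m away, 90 degrees to its left.
lx, ly = landmark_from_range_aoa(0.0, 0.0, 0.0, 5.0, math.pi / 2)
```

In a full SLAM pipeline this projection would feed a measurement model whose residuals are minimized jointly with odometry, but the single-anchor observability gain is already visible here.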