Advancements in Sensor Integration and Robotics

The field of robotics and sensor integration is moving toward more sophisticated and robust systems, with a focus on improving accuracy and reliability in challenging environments. Recent work integrates multiple sensor modalities, such as visual, inertial, and acoustic sensors, to improve navigation and mapping. There is also a push toward greater autonomy, with novel control frameworks and algorithms that let robots operate effectively in complex, dynamic environments. Noteworthy papers in this area include:

- Underwater Visual-Inertial-Acoustic-Depth SLAM with DVL Preintegration for Degraded Environments, which proposes a SLAM system that integrates multiple sensor modalities for robust underwater navigation.
- EndoSfM3D: Learning to 3D Reconstruct Any Endoscopic Surgery Scene using Self-supervised Foundation Model, which presents a self-supervised monocular depth estimation framework for 3D reconstruction of endoscopic surgery scenes.
- Localising under the drape: proprioception in the era of distributed surgical robotic system, which introduces a marker-free proprioception method for precisely localizing surgical robots under their sterile draping.
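To give a sense of the sensor-fusion idea behind these navigation systems, the sketch below fuses an IMU-propagated velocity estimate with a DVL-style velocity measurement using a minimal one-dimensional Kalman filter. This is an illustrative toy, not the method of any paper listed here: real underwater SLAM systems (such as the DVL-preintegration work above) estimate full 3-D poses with preintegrated factors, and all function names and noise values below are assumptions chosen for clarity.

```python
# Minimal 1-D Kalman-style fusion: an IMU prediction step followed by a
# DVL measurement update. Illustrative only; assumed names and values.

def predict(v, P, accel, dt, q):
    """Propagate the velocity estimate with an IMU acceleration reading."""
    v = v + accel * dt   # dead-reckoned state propagation
    P = P + q            # process noise inflates the uncertainty
    return v, P

def update(v, P, z, r):
    """Correct the estimate with a DVL velocity measurement z (noise r)."""
    K = P / (P + r)      # Kalman gain: how much to trust the measurement
    v = v + K * (z - v)  # blend prediction and measurement
    P = (1.0 - K) * P    # posterior uncertainty shrinks after the update
    return v, P

# Example: IMU dead reckoning drifts; the DVL measurement pulls the
# estimate back and reduces its covariance.
v, P = 0.0, 1.0
v, P = predict(v, P, accel=0.5, dt=0.1, q=0.01)
v, P = update(v, P, z=0.2, r=0.05)
print(v, P)
```

The same predict/update pattern generalizes to the multi-modal case: each sensor (camera, IMU, DVL, depth, acoustics) contributes either a propagation model or a measurement update on a shared state.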

Sources

Underwater Visual-Inertial-Acoustic-Depth SLAM with DVL Preintegration for Degraded Environments

Force-Displacement Profiling for Robot-Assisted Deployment of a Left Atrial Appendage Occluder Using FBG-EM Distal Sensing

EndoSfM3D: Learning to 3D Reconstruct Any Endoscopic Surgery Scene using Self-supervised Foundation Model

Precise Time Delay Measurement and Compensation for Tightly Coupled Underwater SINS/piUSBL Navigation

PlanarTrack: A high-quality and challenging benchmark for large-scale planar object tracking

Localising under the drape: proprioception in the era of distributed surgical robotic system

InFlux: A Benchmark for Self-Calibration of Dynamic Intrinsics of Video Cameras

Hybrid Vision Servoing with Deep Alignment and GRU-Based Occlusion Recovery

Data-Enabled Predictive Control and Guidance for Autonomous Underwater Vehicles

Seeing Clearly and Deeply: An RGBD Imaging Approach with a Bio-inspired Monocentric Design

SPADE: Sparsity Adaptive Depth Estimator for Zero-Shot, Real-Time, Monocular Depth Estimation in Underwater Environments
