Journal article in Advanced Robotics, 2024

Real-time, Dense UAV Mapping by Leveraging Monocular Depth Prediction with Monocular-Inertial SLAM

Abstract

We present a dense, metric 3D mapping pipeline designed for embedded operation on board UAVs, obtained by loosely coupling deep neural networks trained to infer dense depth from single images with a SLAM system that restores metric scale from sparse depth. In contrast to computationally demanding approaches that leverage multiple views, we propose a highly efficient single-view approach that does not sacrifice 3D mapping performance. This enables real-time construction of a global 3D voxel map by iteratively fusing the rescaled dense depth maps via raycasting from the estimated camera poses. Quantitative and qualitative experiments with our framework in challenging environmental conditions show comparable or superior performance with respect to state-of-the-art approaches, with a better effectiveness-efficiency trade-off.
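To illustrate the scale-recovery step described in the abstract, the following is a minimal sketch, assuming a simple median-ratio alignment between the network's relative dense depth and the sparse metric depths provided by SLAM landmarks; the function name, the NumPy-based interface, and the median-ratio strategy are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): rescale a relative dense depth map to
# metric units using sparse metric depths from monocular-inertial SLAM.
# The median of per-landmark depth ratios is used as a robust global scale.
import numpy as np

def rescale_dense_depth(dense_depth, sparse_uv, sparse_depth):
    """Rescale a predicted dense depth map to metric scale.

    dense_depth : (H, W) float array, network-predicted depth (arbitrary scale).
    sparse_uv   : (N, 2) int array of (u, v) pixel coordinates of SLAM landmarks.
    sparse_depth: (N,) float array of metric depths of those landmarks.
    """
    u, v = sparse_uv[:, 0], sparse_uv[:, 1]
    predicted = dense_depth[v, u]                 # predicted depth at landmark pixels
    valid = (predicted > 0) & (sparse_depth > 0)  # discard invalid samples
    scale = np.median(sparse_depth[valid] / predicted[valid])
    return scale * dense_depth

if __name__ == "__main__":
    # Toy example: a synthetic depth map whose scale is off by a factor of 2.5.
    rng = np.random.default_rng(0)
    true_depth = 1.0 + rng.random((120, 160)) * 4.0
    predicted = true_depth / 2.5
    uv = np.stack([rng.integers(0, 160, 50), rng.integers(0, 120, 50)], axis=1)
    metric = true_depth[uv[:, 1], uv[:, 0]]
    rescaled = rescale_dense_depth(predicted, uv, metric)
    print("median abs error:", np.median(np.abs(rescaled - true_depth)))
```

In practice, the rescaled depth map would then be fused into the global voxel map by raycasting from the estimated camera pose, as described in the abstract.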
Main file
Habib-AR24-Real-time-Dense-UAV-Mapping-by-Leveraging-Monocular-Depth-Prediction-with-Monocular-Inertial-slam.pdf (5.28 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04612405 , version 1 (14-06-2024)
hal-04612405 , version 2 (08-08-2024)


Cite

Yassine Habib, Panagiotis Papadakis, Cédric Le Barz, Antoine Fagette, Tiago Gonçalves, et al. Real-time, Dense UAV Mapping by Leveraging Monocular Depth Prediction with Monocular-Inertial SLAM. Advanced Robotics, 2024, ⟨10.1080/01691864.2024.2415084⟩. ⟨hal-04612405v2⟩