
Survey and Experimental Comparison of RGB-D Indoor Robot Navigation Methods Supported by ROS and Their Expansion via Fusion with Wheel Odometry and IMU Data

Florian Spiess 1, Jonas Friesslich 1, Tobias Kaupp 1, Samuel Kounev 2, and Norbert Strobel 3
1. Faculty of Electrical Engineering and Information Technology, University of Applied Sciences Wuerzburg-Schweinfurt, 97421 Schweinfurt, Germany
2. Faculty of Mathematics and Computer Science, Julius-Maximilians-University of Wuerzburg, 97074 Wuerzburg, Germany
3. Institute of Medical Engineering, University of Applied Sciences Wuerzburg-Schweinfurt, 97421 Schweinfurt, Germany

Abstract—This paper presents an experimental evaluation and comparison of selected Visual Odometry (VO) and Visual SLAM (V-SLAM) algorithms for indoor mobile robot navigation supported by the Robot Operating System (ROS). The focus is on algorithms involving RGB-D cameras. Since RGB-D cameras integrate color and depth information, they output coherent measurement data and facilitate an efficient processing pipeline. The various underlying methods of vision-based algorithms are described and evaluated on two datasets covering different indoor situations as well as various lighting and movement conditions. In general, V-SLAM algorithms yielded better results. They were superior with respect to handling drift, in particular when loop closures were involved. However, the results confirmed that VO algorithms can outperform V-SLAM methods when an algorithm's design objectives closely match the situation at hand. While the experiments showed that there is no single best algorithm for every scenario, ORB-SLAM2 is recommended as a robust stand-alone RGB-D based localization method available under ROS. Furthermore, we observed that the position estimation error could be reduced by around 67% on average when combining vision-based position estimates with sensor data obtained from wheel odometry and an inertial measurement unit (IMU). This clearly demonstrates the potential of sensor fusion techniques. The best sensor fusion results were obtained with RGBDSLAMv2.
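The paper's own fusion pipeline is not reproduced here; under ROS, this kind of loosely coupled fusion of visual pose estimates with wheel odometry and IMU data is commonly realized with the robot_localization EKF node. The Python sketch below is only a minimal standalone illustration of the idea, assuming a planar unicycle motion model; the class name, noise covariances, and interfaces are hypothetical and not the authors' implementation.

# Illustrative sketch (not from the paper): loosely coupled EKF that
# predicts with wheel odometry and IMU yaw rate and corrects with
# absolute poses from a visual method such as ORB-SLAM2 or RGBDSLAMv2.
import numpy as np

class PoseFusionEKF:
    def __init__(self):
        self.x = np.zeros(3)                   # state: [x, y, yaw]
        self.P = np.eye(3) * 1e-3              # state covariance
        self.Q = np.diag([1e-4, 1e-4, 1e-5])   # process noise (assumed values)
        self.R = np.diag([1e-2, 1e-2, 1e-3])   # visual measurement noise (assumed)

    def predict(self, v, yaw_rate, dt):
        """v: forward speed from wheel odometry [m/s];
        yaw_rate: angular rate from the IMU gyroscope [rad/s]."""
        yaw = self.x[2]
        self.x += np.array([v * np.cos(yaw) * dt,
                            v * np.sin(yaw) * dt,
                            yaw_rate * dt])
        # Jacobian of the unicycle motion model w.r.t. the state
        F = np.array([[1.0, 0.0, -v * np.sin(yaw) * dt],
                      [0.0, 1.0,  v * np.cos(yaw) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """z: [x, y, yaw] pose estimate from the visual algorithm."""
        y = z - self.x                               # innovation (H = identity)
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi  # wrap yaw residual to [-pi, pi]
        S = self.P + self.R
        K = self.P @ np.linalg.inv(S)                # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K) @ self.P

# Example usage with made-up sensor values:
ekf = PoseFusionEKF()
ekf.predict(v=0.2, yaw_rate=0.05, dt=0.02)      # high-rate odometry/IMU step
ekf.update(np.array([0.004, 0.0, 0.001]))       # lower-rate visual pose correction

Wheel odometry and the gyro drive the high-rate prediction, while the lower-rate visual poses bound the accumulated drift, mirroring the complementary behavior the abstract attributes to sensor fusion.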

Index Terms—mobile robots, multisensor data fusion, data sets for robotic vision, RGB-D perception

Cite: Florian Spiess, Jonas Friesslich, Tobias Kaupp, Samuel Kounev, and Norbert Strobel, "Survey and Experimental Comparison of RGB-D Indoor Robot Navigation Methods Supported by ROS and Their Expansion via Fusion with Wheel Odometry and IMU Data," International Journal of Mechanical Engineering and Robotics Research, Vol. 9, No. 12, pp. 1532-1540, December 2020. DOI: 10.18178/ijmerr.9.12.1532-1540

Copyright © 2020 by the authors. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0), which permits use, distribution, and reproduction in any medium, provided that the article is properly cited, the use is non-commercial, and no modifications or adaptations are made.