Exploring the Performance of a Sensor-Fusion-based Navigation System for Human Following Companion Robots

Mark Tee Kit Tsun, Bee Theng Lau, and Hudyjaya Siswoyo Jo
Swinburne University of Technology Sarawak, Malaysia
Abstract— One of the biggest challenges in implementing assistive companion robots is navigating around obstacles while remaining visually tethered to a human subject. The task is further complicated when advanced hardware and computation-heavy algorithms, such as Light Detection and Ranging (LiDAR) modules or Simultaneous Localization and Mapping (SLAM), are not readily available. This research aims to validate a robot navigation model that relies on multi-sensor fusion of a depth camera, a proximity sensor array, and an active IR marker tracking system, all built from commercial off-the-shelf (COTS) components. Common indoor robot navigation solutions rely on prior environmental mapping to plot routes beyond obstacles in the immediate vicinity. This model differentiates itself by considering the general direction of the target person and the mid-range depth landscape in addition to the robot's immediate vicinity. To examine its performance, three scenarios were created to emulate the testing conditions of several similar robot navigation studies in the existing literature. The simulation results show that the implemented navigation system maintains a consistent distance from the target while traversing a route that is shorter and less impeded by obstructions than those of the benchmark studies.

Index Terms— human-robot interaction, human-following, indoor navigation, sensor fusion, vision-based

Cite: Mark Tee Kit Tsun, Bee Theng Lau, and Hudyjaya Siswoyo Jo, "Exploring the Performance of a Sensor-Fusion-based Navigation System for Human Following Companion Robots," International Journal of Mechanical Engineering and Robotics Research, Vol. 7, No. 6, pp. 590-598, November 2018. DOI: 10.18178/ijmerr.7.6.590-598