IJMERR 2026 Vol.15(1):102-113
doi: 10.18178/ijmerr.15.1.102-113

Developing a Vision-Guided Tracked Robot for Fire Emergency Missions

Pham Thuc Anh Nguyen 1, An Hai Nguyen 2, and Son Hoang 3,*
1. Department of Automation Engineering, School of Electrical and Electronic Engineering, Hanoi University of Science and Technology, Hanoi, Vietnam
2. Control, Automation in Production and Improvement of Technology Institute, Academy of Military Science and Technology, Hanoi, Vietnam
3. Faculty of Electronics Engineering 1, Posts and Telecommunications Institute of Technology, Hanoi, Vietnam
Email: anh.nguyenphamthuc@hust.edu.vn (P.T.A.N.); anhaicpt@gmail.com (A.H.N.); hoangson@ptit.edu.vn (S.H.)
*Corresponding author

Manuscript received August 22, 2025; revised September 12, 2025; accepted October 20, 2025; published February 28, 2026

Abstract—Emergency fire suppression subjects rescue personnel to severe thermal conditions, hazardous fumes, and blast risks. Rapid urban development has increased the frequency of fire emergencies, motivating the deployment of advanced autonomous firefighting platforms. This study presents a tracked firefighting robot designed to navigate complex terrain and autonomously detect and approach fire sources. The system integrates a You Only Look Once version 8 (YOLOv8)-based deep learning model for real-time fire detection and employs depth imaging to compute the angular deviation and distance to the fire. These measurements are transmitted to a Programmable Logic Controller (PLC)-based control unit over a Modbus RS485 interface for responsive control. For autonomous navigation, the robot combines an enhanced Bug-2 pathfinding algorithm with LiDAR-based environmental mapping and Hector Simultaneous Localization and Mapping (SLAM) for real-time localization. The core innovation lies in integrating YOLOv8-based fire detection with deviation-angle-optimized Bug-2 navigation and a PLC-Robot Operating System (ROS) control architecture, enabling precise fire localization and obstacle avoidance in dynamic environments. Experimental validation confirms the effectiveness of the proposed robot in identifying fire sources and navigating around obstacles, demonstrating its potential as a reliable solution for autonomous firefighting in hazardous scenarios.

Keywords—autonomous firefighting systems, robotic fire suppression, You Only Look Once version 8 (YOLOv8) neural networks, obstacle navigation, flame detection

Cite: Pham Thuc Anh Nguyen, An Hai Nguyen, and Son Hoang, "Developing a Vision-Guided Tracked Robot for Fire Emergency Missions," International Journal of Mechanical Engineering and Robotics Research, Vol. 15, No. 1, pp. 102-113, 2026. doi: 10.18178/ijmerr.15.1.102-113

Copyright © 2026 by the authors. This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).
