
Real-Time Multi Target Capturing Using Partitioning in Robot Vision

Davood Pour Yousefian Barfeh 1, Patrice Xandria Mari A. Delos Reyes 2, and Myrna A. Coliat 3
1. College of Engineering and Computer Studies, Lyceum of the Philippines University-Laguna, Calamba, Philippines
College of Informatics and Computing Sciences, Batangas State University, Batangas, Philippines
2. Graduate School, University of the Philippines-Los Baños, Philippines
3. College of Informatics and Computing Sciences, Batangas State University, Batangas, Philippines

Abstract— In this study, the authors design and implement a real-time system, an autonomous robot camera, that captures multiple targets in a scene. The robot has only one camera, yet it can capture more than one moving object through appropriate movement. The system uses Gaussian filtering for motion detection and then partitions the scene to obtain the locations of all targets. The partitioning divides the scene into three major regions, each of which contains several sub-regions. Based on the partitioning and the positions of all targets, the system can be in one of three states: unsafe, safe, or over-safe. In each state, with respect to the relevant regions or sub-regions, the system selects an appropriate movement not only to capture all moving objects, but also to give an equal chance of being captured to new targets entering the scene from different directions. The system was tested both indoors and outdoors with different values for parameters such as resolution, frame rate (fps), minimum number of motion frames, and minimum motion area.
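The pipeline sketched in the abstract (motion detection, scene partitioning, state classification) can be illustrated as follows. This is a minimal sketch, not the authors' implementation: it substitutes plain frame differencing for the paper's Gaussian-filtered detection, assumes three vertical regions, and maps the number of occupied regions onto the paper's three state names; the thresholds and region layout are assumptions for illustration only.

```python
import numpy as np

def detect_motion(prev_frame, frame, threshold=25, min_area=5):
    """Flag pixels whose intensity changed by more than `threshold`.
    A stand-in for the paper's Gaussian-filtered motion detection;
    `min_area` plays the role of the 'minimum motion area' parameter."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    mask = diff > threshold
    # Ignore detections smaller than the minimum motion area.
    return mask if mask.sum() >= min_area else np.zeros_like(mask)

def partition_state(mask, width):
    """Partition the scene into three vertical regions (an assumed
    layout) and classify the state by how spread the targets are."""
    cols = np.where(mask.any(axis=0))[0]       # columns with motion
    thirds = (width // 3, 2 * width // 3)
    regions = sorted({0 if c < thirds[0] else 1 if c < thirds[1] else 2
                      for c in cols})
    n = len(regions)
    # Assumed mapping: no targets -> over-safe (free to re-centre),
    # targets confined to one region -> safe, spread out -> unsafe.
    state = "over-safe" if n == 0 else "safe" if n == 1 else "unsafe"
    return state, regions
```

For example, a single moving blob on the left third of a 30-pixel-wide frame would yield the state "safe" with region 0, while blobs in both the left and right thirds would yield "unsafe", prompting the robot to move so that all targets stay capturable.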

Index Terms— Motion detection, Gaussian filtering, multi target tracking, moving objects, partitioning, digital image processing

Cite: Davood Pour Yousefian Barfeh, Patrice Xandria Mari A. Delos Reyes, and Myrna A. Coliat, "Real-Time Multi Target Capturing Using Partitioning in Robot Vision," International Journal of Mechanical Engineering and Robotics Research, Vol. 9, No. 1, pp. 117-121, January 2020. DOI: 10.18178/ijmerr.9.1.117-121

Copyright © 2020 by the authors. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0), which permits use, distribution, and reproduction in any medium, provided that the article is properly cited, the use is non-commercial, and no modifications or adaptations are made.