Short Title: Int. J. Mech. Eng. Robot. Res.
Frequency: Bimonthly
Manuscript received July 21, 2023; revised September 7, 2023; accepted October 10, 2023; published March 22, 2024.
Abstract—Intelligent robotics is gaining significance in Maintenance, Repair, and Overhaul (MRO) hangar operations, where mobile robots navigate complex and dynamic environments for aircraft visual inspection. Aircraft hangars are typically busy and changing, with objects of varying shapes and sizes presenting obstacles and hazardous conditions that can lead to collisions and safety risks. This makes obstacle detection and avoidance critical for safe and efficient robot navigation. Conventional methods suffer from high computational cost, while learning-based approaches are limited in detection accuracy. This paper proposes a vision-based navigation model that integrates a pre-trained YOLOv5 object detection model into the Robot Operating System (ROS) navigation stack to optimise obstacle detection and avoidance in a complex environment. The approach is validated and evaluated in ROS-Gazebo simulation and on a TurtleBot3 Waffle Pi robot platform. The results show that the robot can reliably detect and avoid obstacles without colliding while navigating through different checkpoints to the target location.

Keywords—autonomous navigation, object detection, obstacle avoidance, mobile robot, deep learning

Cite: Ndidiamaka Adiuku, Nicolas P. Avdelidis, Gilbert Tang, Angelos Plastropoulos, and Yanis Diallo, "Mobile Robot Obstacle Detection and Avoidance with NAV-YOLO," International Journal of Mechanical Engineering and Robotics Research, Vol. 13, No. 2, pp. 219-226, 2024.

Copyright © 2024 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY-NC-ND 4.0), which permits use, distribution and reproduction in any medium, provided that the article is properly cited, the use is non-commercial and no modifications or adaptations are made.