Dimitrios Chatziparaschis, "Machine learning for enhancing robotic perception and control", Master Thesis, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2020
https://doi.org/10.26233/heallink.tuc.87411
In recent years, there has been a growing need to use robotic systems to facilitate human missions, especially in Search-and-Rescue scenarios. Such systems may operate in cluttered and human-unfriendly environments, in which conditions for establishing a remote-control connection may be far from ideal and delays may be detrimental given the urgency of such scenarios. Hence, the most essential trait of these systems is their ability to deal with the uncertainty of the operating environment, in order to make appropriate decisions and accomplish their objectives autonomously. In this thesis, we utilize Machine Learning approaches to enhance robotic perception and control, namely vision and navigation, of a simulated Unmanned Aerial Vehicle (UAV), so that it can act fully autonomously in reconnaissance and rescue procedures. On the perception side, we use a custom Deconvolutional Neural Network trained with tailor-made loss functions to achieve autonomous visual target detection. On the control side, we apply Deep Reinforcement Learning using Deep Deterministic Policy Gradient, based on a custom lightweight training simulator, to obtain the appropriate autonomous navigation behavior in unknown worlds. The enhanced UAV system can navigate safely through an unknown environment, search for and detect any humans in its surroundings with its onboard gimbal system, engage and take distance measurements from the acquired target, and georeference it to the global coordinate system. Subsequently, the UAV pinpoints the positioned target in the generated map, shares it with the responding team, and proceeds with the exploration of the unmapped area to locate other individuals who may be in need. Throughout this study, each developed autonomous behavior of the UAV was thoroughly evaluated, with experimental results demonstrated in various custom worlds within the Gazebo robot simulator. The proposed system has been developed as a Robot Operating System (ROS) package and is deployable to both simulated and real UAV systems, as long as they meet the minimum proposed software and sensory requirements.
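To make the control component more concrete, the following is a minimal sketch of a DDPG-style actor-critic update of the kind the abstract refers to for autonomous navigation. It is written in Python with PyTorch under assumed state/action dimensions, network sizes, and hyperparameters; these are illustrative placeholders and not the thesis's actual implementation.

# Hedged sketch: one DDPG update step (critic regression, actor gradient,
# soft target updates). All dimensions and constants below are assumptions.
import copy
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 16, 2   # assumed: e.g. range readings + goal offset, velocity command
GAMMA, TAU = 0.99, 0.005        # assumed discount factor and soft-update rate

class Actor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM), nn.Tanh())   # continuous actions in [-1, 1]
    def forward(self, s):
        return self.net(s)

class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + ACTION_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1))
    def forward(self, s, a):
        return self.net(torch.cat([s, a], dim=-1))

actor, critic = Actor(), Critic()
actor_t, critic_t = copy.deepcopy(actor), copy.deepcopy(critic)   # target networks
actor_opt = torch.optim.Adam(actor.parameters(), lr=1e-4)
critic_opt = torch.optim.Adam(critic.parameters(), lr=1e-3)

def ddpg_update(s, a, r, s2, done):
    """One gradient step from a replay-buffer minibatch (s, a, r, s2, done)."""
    # Critic: regress Q(s, a) toward the bootstrapped target value.
    with torch.no_grad():
        q_target = r + GAMMA * (1 - done) * critic_t(s2, actor_t(s2))
    critic_loss = nn.functional.mse_loss(critic(s, a), q_target)
    critic_opt.zero_grad(); critic_loss.backward(); critic_opt.step()

    # Actor: maximize the critic's value of its own deterministic actions.
    actor_loss = -critic(s, actor(s)).mean()
    actor_opt.zero_grad(); actor_loss.backward(); actor_opt.step()

    # Soft-update the target networks toward the online networks.
    for tgt, src in ((actor_t, actor), (critic_t, critic)):
        for pt, p in zip(tgt.parameters(), src.parameters()):
            pt.data.mul_(1 - TAU).add_(TAU * p.data)

In practice such an update loop would be driven by transitions collected from the training simulator, with exploration noise added to the actor's output; those details are omitted here.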