Implementation of Deep Learning and Motion Control Using a Drone
Keywords: hand gesture recognition, deep learning, convolutional neural network
Drones are now widely used in many sectors, such as transportation, photography, the military, and agriculture; however, their control systems have remained largely unchanged. The Leap Motion is a sensor that uses infrared light and cameras to detect hand signals and gestures. In this project, a Leap Motion sensor is applied as a controller to replace the conventional drone control system. The DJI Tello EDU drone was chosen as the Unmanned Aerial Vehicle (UAV), and a computer serves as the main system that connects the Leap sensor to the drone via Wi-Fi. The control program is written in Python. The main purpose of this project is to control the drone using the Leap Motion, implementing all feasible gestures for every drone movement, and to evaluate system performance when retrieving video from the drone with object detection. Three object detection methods are applied in this project: Haar cascade face detection, YOLOv3-tiny object detection, and edge detection. The computer vision system is used to assess output video quality by comparing the frame rate of each detection method. The results show that the Leap sensor successfully controls the drone with greater stability and that almost all hand gestures and movements can be applied to control the drone. For object detection, edge detection produces a higher frame rate than YOLOv3-tiny and Haar cascade detection. Overall, the drone produces a steady frame rate when idle; when flying, the frame rate decreases. The system still has pros and cons, and some suggestions are given for further improvement of the project.
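The gesture-to-drone pipeline described above can be sketched in Python. The command strings and the UDP transport (port 8889 on the drone's address) follow the published Tello SDK; the gesture labels and the mapping itself are illustrative assumptions, since the abstract does not specify which gesture triggers which movement:

```python
import socket

# Hypothetical mapping from recognized hand gestures to Tello SDK
# command strings. The command strings ("takeoff", "up 20", ...) are
# real Tello SDK commands; the gesture labels are assumed for
# illustration and would come from the Leap Motion recognizer.
GESTURE_COMMANDS = {
    "open_palm":   "takeoff",
    "fist":        "land",
    "swipe_up":    "up 20",      # distances in cm
    "swipe_down":  "down 20",
    "swipe_left":  "left 20",
    "swipe_right": "right 20",
    "circle_cw":   "cw 90",      # clockwise rotation in degrees
}

def gesture_to_command(gesture: str) -> str:
    """Translate a gesture label into a Tello SDK command string.

    Unknown gestures fall back to "stop", which makes a Tello EDU
    (SDK 2.0) hover in place.
    """
    return GESTURE_COMMANDS.get(gesture, "stop")

def send_command(cmd: str, addr=("192.168.10.1", 8889)) -> None:
    """Send one command string to the drone over UDP, as the Tello
    SDK expects (UTF-8 text datagrams)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(cmd.encode("utf-8"), addr)
```

In a real session the Tello SDK also requires sending the literal string `"command"` first to enter SDK mode, and the recognizer would call `send_command(gesture_to_command(g))` for each detected gesture.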
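The frame-rate comparison between detection methods can be sketched with a simple timing helper. This is a minimal sketch, assuming each method is exposed as a callable applied to one frame; the function name `measure_fps` and its interface are not from the original work:

```python
import time

def measure_fps(detect, frames):
    """Run the `detect` callable on every frame and return the average
    frames per second, mirroring how the detection methods (Haar
    cascade, YOLOv3-tiny, edge detection) could be compared.

    `detect` stands in for any per-frame detector; `frames` is any
    iterable of frames (e.g. images decoded from the drone's stream).
    """
    frames = list(frames)
    start = time.perf_counter()
    for frame in frames:
        detect(frame)
    elapsed = time.perf_counter() - start
    # Guard against a zero-length timing window on trivial workloads.
    return len(frames) / elapsed if elapsed > 0 else float("inf")
```

Running this once with the drone idle and once while flying would reproduce the comparison reported in the results, where the in-flight frame rate drops.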