Dances with Drones: Using Google’s TFLite for Autonomous Control of Aerial Drones by Gesture Recognition

Erin Linebarger, University of Utah (US) and SFB 1294, University of Potsdam
2.9., 10:15 - 11:45

Scientific app development for robotics controllers and machine learning algorithms has become much easier with the introduction of tools such as Google's TFLite. With this package, a gesture recognition controller can be built to command an autonomous aerial vehicle, making it fly upward or perform a flip, for example. The current work implements gesture recognition for the Parrot Bebop, an aerial drone that can be controlled from a smartphone app. Because Parrot restricts access to the drone's onboard graphics processor, the image processing must run on the phone's processor instead. Gesture recognition here starts from an existing model, a MobileNet convolutional neural network trained to classify the 1000 image categories of the ImageNet dataset, and retrains it using transfer learning. The training runs on a computer, but the final model must run on a mobile phone to be useful with the Parrot Bebop. This is accomplished with Google's TFLite, which converts trained TensorFlow models into a lightweight format suitable for on-device inference, making machine learning based apps practical for scientific app developers.
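The transfer-learning and conversion workflow described above can be sketched as follows. This is a minimal illustration, not the author's actual code: the number of gesture classes, the input resolution, and the width multiplier are hypothetical choices, and `weights=None` is used here only to keep the sketch self-contained (in practice one would load `weights="imagenet"` and retrain the new head on gesture images).

```python
import numpy as np
import tensorflow as tf

NUM_GESTURES = 4  # hypothetical number of gesture classes

# Load MobileNet without its 1000-class ImageNet head.
# weights=None avoids a network download in this sketch;
# real transfer learning would start from weights="imagenet".
base = tf.keras.applications.MobileNet(
    input_shape=(96, 96, 3), alpha=0.25,
    include_top=False, pooling="avg", weights=None)
base.trainable = False  # freeze the pretrained feature extractor

# Attach a small classification head for the gesture classes.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(gesture_images, gesture_labels, epochs=10)  # retrain head on a computer

# Convert the Keras model to a TFLite flatbuffer for the phone.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("gestures.tflite", "wb") as f:
    f.write(tflite_model)

# Sanity-check the converted model with the TFLite interpreter,
# the same runtime that executes the model on the phone.
interp = tf.lite.Interpreter(model_content=tflite_model)
interp.allocate_tensors()
inp = interp.get_input_details()[0]
out = interp.get_output_details()[0]
interp.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interp.invoke()
probs = interp.get_tensor(out["index"])  # one softmax score per gesture
```

On Android, the same `.tflite` file would be loaded by the TFLite interpreter inside the controller app, which maps the predicted gesture class to a flight command for the drone.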