Keywords

Deep Learning in Robotics and Automation, Aerial Systems, Perception and Autonomy

Abstract

Human remote-control (RC) pilots can perceive the position and orientation of an aircraft using only third-person-perspective visual sensing. Although novice pilots often struggle to control RC aircraft, they can sense the aircraft's orientation with relative ease. In this paper, we hypothesize and demonstrate that deep learning methods can mimic the human ability to perceive the orientation of an aircraft from monocular imagery.

This work uses a neural network to sense the aircraft's attitude directly. The network is combined with more conventional image-processing methods for visual tracking of the aircraft. The aircraft track and the attitude measurements from the convolutional neural network (CNN) are fused in a particle filter that provides a complete state estimate of the aircraft. The network topology, training, and testing results are presented, along with the filter's development and results. The proposed method was tested in simulation and in hardware flight demonstrations.
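To illustrate the fusion step described in the abstract, the sketch below runs a minimal particle filter that treats noisy attitude readings (standing in for the paper's CNN outputs) as measurements of a single roll angle. This is a simplified, hypothetical example for intuition only: the state dimension, noise levels, and variable names are assumptions, not the paper's actual filter, which estimates the full aircraft state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D setup: estimate a roll angle (rad) from noisy
# attitude measurements, mimicking CNN outputs (values are illustrative).
N = 500                      # number of particles
true_roll = 0.4              # ground-truth roll angle
meas_std = 0.1               # assumed std of the attitude measurement
proc_std = 0.02              # assumed process (motion) noise

particles = rng.uniform(-np.pi, np.pi, N)   # initialize uniformly over angles
weights = np.ones(N) / N

for _ in range(30):
    # Predict: propagate particles through a random-walk motion model.
    particles += rng.normal(0.0, proc_std, N)

    # Update: weight each particle by the Gaussian likelihood of the
    # latest (simulated) attitude measurement.
    z = true_roll + rng.normal(0.0, meas_std)
    weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    weights /= weights.sum()

    # Resample: draw particles proportional to weight, then reset weights.
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.ones(N) / N

estimate = particles.mean()   # posterior mean approximates the roll angle
```

After a few dozen measurement updates the particle cloud concentrates near the true roll angle, which is the same predict/update/resample cycle the paper's filter applies to full aircraft state with CNN attitude measurements.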

Document Type

Peer-Reviewed Article

Publication Date

2018-09-19

Language

English

College

Ira A. Fulton College of Engineering and Technology

Department

Mechanical Engineering

University Standing at Time of Publication

Graduate Student
