Keywords

unmanned aircraft, UAV, deep learning, attitude estimation

Abstract

This paper demonstrates a feasible method for using a deep neural network as a sensor to estimate the attitude of a flying vehicle from flight video alone. A dataset of still images and associated gravity vectors was collected and used for supervised learning. The network builds on a previously trained network and was trained to approximate the attitude of the camera with an average error of about 8 degrees. Flight test video was recorded and processed with a relatively simple visual odometry method. The aircraft attitude is then estimated in an extended Kalman filter, with the visual odometry providing the state propagation and the network providing the attitude measurement. Results show that having the neural network supply a gravity-vector attitude measurement from the flight imagery reduces the standard deviation of the attitude error by approximately a factor of 12 compared to a baseline approach.
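
The paper's own filter formulation is not reproduced in this listing; as a rough illustration of the fusion the abstract describes, the sketch below implements a minimal two-state (roll, pitch) extended Kalman filter in Python. A body-rate estimate, such as one derived from visual odometry, drives the propagation, and a unit gravity vector, such as one predicted by the network, provides the measurement update. The function names, state parameterization, and noise values are assumptions made for illustration, not the paper's implementation.

```python
import numpy as np

# Illustrative roll/pitch EKF: visual-odometry-style rates propagate the
# state; a network-style gravity-vector observation corrects it. Yaw is
# omitted because a gravity measurement cannot observe it.

def propagate(att, omega, dt, P, Q):
    """Propagate roll/pitch using body rates (e.g., from visual odometry)."""
    phi, theta = att
    # Euler-angle kinematics for roll and pitch.
    phi_dot = omega[0] + np.sin(phi) * np.tan(theta) * omega[1] \
        + np.cos(phi) * np.tan(theta) * omega[2]
    theta_dot = np.cos(phi) * omega[1] - np.sin(phi) * omega[2]
    att = att + dt * np.array([phi_dot, theta_dot])
    P = P + Q * dt  # simplified covariance growth (identity Jacobian assumed)
    return att, P

def update(att, P, g_meas, R):
    """Correct roll/pitch with a unit gravity vector (e.g., from the network)."""
    phi, theta = att
    # Predicted gravity direction in the body frame for a z-down world frame.
    h = np.array([-np.sin(theta),
                  np.sin(phi) * np.cos(theta),
                  np.cos(phi) * np.cos(theta)])
    # Jacobian of h with respect to (phi, theta).
    H = np.array([[0.0, -np.cos(theta)],
                  [np.cos(phi) * np.cos(theta), -np.sin(phi) * np.sin(theta)],
                  [-np.sin(phi) * np.cos(theta), -np.cos(phi) * np.sin(theta)]])
    y = g_meas / np.linalg.norm(g_meas) - h  # innovation
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    att = att + K @ y
    P = (np.eye(2) - K @ H) @ P
    return att, P

if __name__ == "__main__":
    att = np.zeros(2)      # roll, pitch (rad)
    P = 0.1 * np.eye(2)    # state covariance
    Q = 1e-4 * np.eye(2)   # assumed process noise
    R = 1e-2 * np.eye(3)   # assumed measurement noise
    att, P = propagate(att, np.array([0.01, 0.02, 0.0]), 0.05, P, Q)
    att, P = update(att, P, np.array([0.0, 0.05, 0.995]), R)
    print(np.degrees(att))
```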

Original Publication Citation

2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), September 2017, Vancouver, British Columbia, Canada.

Document Type

Peer-Reviewed Article

Publication Date

2017-09

Permanent URL

http://hdl.lib.byu.edu/1877/3925

Publisher

IEEE

Language

English

College

Ira A. Fulton College of Engineering and Technology

Department

Mechanical Engineering

University Standing at Time of Publication

Graduate Student
