unmanned aircraft, UAV, deep learning, attitude estimation
This paper demonstrates a feasible method for using a deep neural network as a sensor that estimates the attitude of a flying vehicle from flight video alone. A dataset of still images and associated gravity vectors was collected and used for supervised learning. The network builds on a previously trained network and was trained to approximate the attitude of the camera with an average error of about 8 degrees. Flight test video was recorded and processed with a relatively simple visual odometry method. The aircraft attitude is then estimated in an extended Kalman filter, with the visual odometry providing the state propagation and the network providing the attitude measurement. Results show that having the neural network provide a gravity-vector attitude measurement from the flight imagery reduces the standard deviation of the attitude error by a factor of approximately 12 compared to a baseline approach.
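The abstract's fusion scheme (visual odometry for propagation, a network-predicted gravity vector as the measurement) can be illustrated with a minimal extended Kalman filter. The sketch below is an assumption-laden simplification, not the paper's actual filter: it tracks only roll and pitch, uses Euler-angle kinematics driven by hypothetical body rates from visual odometry, and linearizes the gravity measurement model numerically. All function names and tuning values are illustrative.

```python
import numpy as np

def propagate(x, P, omega, Q, dt):
    """Propagate the 2-state attitude x = [roll, pitch] with body rates
    omega = [p, q, r] (assumed to come from visual odometry)."""
    roll, pitch = x
    p, q, r = omega
    # Euler-angle kinematics (valid away from pitch = +/-90 degrees)
    roll_dot = p + np.tan(pitch) * (q * np.sin(roll) + r * np.cos(roll))
    pitch_dot = q * np.cos(roll) - r * np.sin(roll)
    x_new = x + dt * np.array([roll_dot, pitch_dot])
    # Crude linearization F ~= I, adequate for a short-timestep sketch
    F = np.eye(2)
    P_new = F @ P @ F.T + Q
    return x_new, P_new

def gravity_measurement_model(x):
    """Predicted unit gravity direction in the body frame for [roll, pitch]."""
    roll, pitch = x
    return np.array([
        -np.sin(pitch),
        np.sin(roll) * np.cos(pitch),
        np.cos(roll) * np.cos(pitch),
    ])

def update(x, P, z, R):
    """EKF measurement update; z is the gravity vector a network would
    predict from one video frame (here supplied directly)."""
    h = gravity_measurement_model(x)
    # Numerical Jacobian of h with respect to x
    eps = 1e-6
    H = np.zeros((3, 2))
    for i in range(2):
        dx = np.zeros(2)
        dx[i] = eps
        H[:, i] = (gravity_measurement_model(x + dx) - h) / eps
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - h)
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new
```

With a confident measurement (small R) and an uncertain prior (large P), a single update pulls the estimate toward the attitude implied by the gravity vector, which is the role the network's output plays in the paper's filter.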
Original Publication Citation
2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, September 2017, Vancouver, British Columbia.
BYU ScholarsArchive Citation
Ellingson, Gary J.; Wingate, David; and McLain, Tim, "Deep Visual Gravity Vector Detection for Unmanned Aircraft Attitude Estimation" (2017). All Faculty Publications. 1971.
Ira A. Fulton College of Engineering and Technology