Keywords

sensor fusion, vision-aided inertial navigation, multiplicative extended Kalman filter, aerial robotics

Abstract

This work presents a multiplicative extended Kalman filter for estimating the relative state of a multirotor vehicle operating in a GPS-denied environment. The filter fuses data from an inertial measurement unit and altimeter with relative-pose updates from a keyframe-based visual odometry or laser scan-matching algorithm. Because the global position and heading states of the vehicle are unobservable in the absence of global measurements such as GPS, the filter in this paper estimates the state with respect to a local frame that is colocated with the odometry keyframe. As a result, the odometry update provides nearly direct measurements of the relative vehicle pose, making those states observable. The derivation in this paper is intended to provide a complete, self-contained presentation that is tutorial in nature and that clearly defines all quaternion conventions and properties used. This paper also contains several improvements and extensions to the filter and presents new, useful properties relating to error quaternions and their Euler-angle decomposition. Finally, this paper derives the filter both for traditional dynamics defined with respect to an inertial frame and for robocentric dynamics defined with respect to the vehicle's body frame, and provides insights into the subtle differences that arise between the two formulations.
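The defining feature of a multiplicative EKF is that attitude error is tracked as a small three-vector and folded back into the nominal quaternion by quaternion multiplication rather than by vector addition. As a rough illustration of that step only (not the paper's full filter), the following sketch applies a first-order multiplicative correction to a nominal attitude; the scalar-first Hamilton convention used here is an assumption, not necessarily the paper's convention:

```python
import numpy as np

def quat_mul(q, p):
    """Hamilton product of two quaternions in scalar-first order [w, x, y, z]."""
    qw, qx, qy, qz = q
    pw, px, py, pz = p
    return np.array([
        qw*pw - qx*px - qy*py - qz*pz,
        qw*px + qx*pw + qy*pz - qz*py,
        qw*py - qx*pz + qy*pw + qz*px,
        qw*pz + qx*py - qy*px + qz*pw,
    ])

def mekf_attitude_correction(q_nominal, delta_theta):
    """Fold a small attitude-error vector back into the nominal quaternion.

    delta_theta is the 3-vector attitude error produced by the EKF
    measurement update; to first order the error quaternion is
    [1, delta_theta/2], and the correction is multiplicative.
    """
    dq = np.concatenate(([1.0], 0.5 * np.asarray(delta_theta, dtype=float)))
    q = quat_mul(q_nominal, dq)
    return q / np.linalg.norm(q)  # renormalize after the first-order correction

# Example: identity attitude corrected by a small rotation about the body z-axis
q0 = np.array([1.0, 0.0, 0.0, 0.0])
q1 = mekf_attitude_correction(q0, [0.0, 0.0, 0.01])
```

After the correction, the error state is reset to zero and the covariance is carried forward; the renormalization keeps the quaternion on the unit sphere despite the first-order approximation.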

Document Type

Working Paper

Publication Date

2017-08-18

Permanent URL

http://hdl.lib.byu.edu/1877/3919

Language

English

College

Ira A. Fulton College of Engineering and Technology

Department

Mechanical Engineering

University Standing at Time of Publication

Graduate Student
