Video acquired by a camera mounted on a mini Unmanned Air Vehicle (mUAV) can be very helpful in Wilderness Search and Rescue and many other applications, but it is commonly plagued by limited spatial and temporal fields of view, distracting jittery motion, disorienting rotations, and noisy, distorted images. Together these problems make it very difficult for human viewers to identify objects of interest and to infer correct orientations throughout the video. To expand the temporal and spatial field of view, stabilize the video, and better orient its users, a method is proposed that estimates, in software and in real time, the relative motion from each frame to the next by tracking a small subset of features between frames. Using these relative motions, a local Euclidean mosaic of the video can be created, and a curve can be fit to the video's cumulative motion path to stabilize the presentation of both the video and the local Euclidean mosaic. The resulting increase in users' ability to perform the common search-and-rescue task of identifying objects of interest in the stabilized, locally mosaiced mUAV video is then evaluated. Finally, remaining limitations are discussed along with possibilities for future work.
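The core computation the abstract describes — estimating a rigid (Euclidean) frame-to-frame transform from tracked feature points, accumulating those transforms into a motion path, and smoothing that path to stabilize the presentation — can be sketched in a few lines. This is not the thesis's implementation; it is a minimal NumPy sketch that assumes feature correspondences are already available and substitutes a simple moving average for the curve fit:

```python
import numpy as np

def estimate_euclidean(src, dst):
    """Least-squares rigid transform (rotation R, translation t) such that
    dst ~ src @ R.T + t, via the Kabsch/Procrustes method.
    src, dst: (N, 2) arrays of matched feature positions in two frames."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def smooth_path(path, radius=2):
    """Moving-average smoothing of the accumulated (dx, dy, dtheta) motion
    path; a stand-in for the curve fit described in the abstract. The
    difference between the raw and smoothed paths gives the per-frame
    correction that removes jitter while keeping intentional motion."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    padded = np.pad(path, ((radius, radius), (0, 0)), mode="edge")
    return np.column_stack([np.convolve(padded[:, i], kernel, mode="valid")
                            for i in range(path.shape[1])])
```

In a full pipeline, `src`/`dst` would come from a feature tracker, outlier matches would be filtered (e.g., with RANSAC, as the keywords suggest), and the per-frame corrections would also place each frame into the local Euclidean mosaic.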
College and Department
Physical and Mathematical Sciences; Computer Science
BYU ScholarsArchive Citation
Gerhardt, Damon Dyck, "Feature-based Mini Unmanned Air Vehicle Video Euclidean Stabilization with Local Mosaics" (2007). All Theses and Dissertations. 1056.
Keywords
feature-based, mini unmanned air vehicle, mUAV, UAV, mUAV vision, UAV vision, video stabilization, video mosaic, Euclidean mosaic, Euclidean stabilization, local mosaic, aerial video, E-mosaic, stable-E, stable-E-mosaic, homography RANSAC filter, short circuit, jitter