Abstract

This thesis is concerned with characterizing the quality of Hidden Markov modeling when learning from limited data. It introduces a new perspective on the sources of error in order to describe the impact of undermodeling. Our view is that modeling error can be decomposed into two primary components: the approximation error and the estimation error. This thesis takes a first step toward exploring the approximation error of low-order HMMs that best approximate the true system, itself an HMM. We introduce the notion of minimality and show that best approximations whose order is greater than or equal to that of a minimal realization of the true system are in fact equivalent realizations of it. Building on this understanding, we explore integer lumping and present a new method, called weighted lumping, for finding such realizations. We also show that best approximations of order strictly less than that of a minimal realization are true approximations; they are incapable of mimicking the true system exactly. Our work then proves that the resulting approximation error is non-decreasing as the model order decreases, verifying the intuitive idea that increasingly simplified models are less and less descriptive of the true system.
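
To give a concrete sense of the aggregation idea behind integer lumping, the sketch below merges states of a Markov transition matrix according to a chosen partition and checks that the aggregated transition probabilities are consistent within each block (strong lumpability). This is a generic illustration under assumed data, not the thesis's weighted-lumping construction; the matrix A, the partition, and the helper name lump are hypothetical.

    # A minimal sketch of integer (hard) lumping for a Markov transition matrix.
    # The matrix A, the partition, and the tolerance are illustrative assumptions.
    import numpy as np

    def lump(A, partition, tol=1e-9):
        """Aggregate the states of transition matrix A according to `partition`,
        a list of lists of state indices. Returns the lumped transition matrix
        if the chain is strongly lumpable with respect to the partition."""
        blocks = [np.asarray(b) for b in partition]
        k = len(blocks)
        lumped = np.zeros((k, k))
        for i, Bi in enumerate(blocks):
            for j, Bj in enumerate(blocks):
                # Probability of moving from each state in block i into block j.
                row_sums = A[Bi][:, Bj].sum(axis=1)
                # Strong lumpability: these probabilities must agree across block i.
                if not np.allclose(row_sums, row_sums[0], atol=tol):
                    raise ValueError("partition is not lumpable")
                lumped[i, j] = row_sums[0]
        return lumped

    # A 4-state chain whose states {0,1} and {2,3} can be merged exactly.
    A = np.array([[0.1, 0.3, 0.4, 0.2],
                  [0.2, 0.2, 0.1, 0.5],
                  [0.3, 0.3, 0.2, 0.2],
                  [0.5, 0.1, 0.3, 0.1]])
    print(lump(A, [[0, 1], [2, 3]]))

When the lumpability condition holds, the aggregated two-state model reproduces the block-level dynamics of the original four-state chain exactly; when it fails, only an approximation of lower order is possible, which is the situation the thesis studies.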

Degree

MS

College and Department

Physical and Mathematical Sciences; Computer Science

Rights

http://lib.byu.edu/about/copyright/

Date Submitted

2006-07-04

Document Type

Thesis

Handle

http://hdl.lib.byu.edu/1877/etd1371

Keywords

Hidden Markov model, dynamic system, aggregation, lumping, realization, approximation, minimal system

Language

English
