Inference in Hidden Markov Models with Explicit State Duration Distributions

Michael Dewar, Chris Wiggins, Frank Wood

arXiv.org Machine Learning 

Hidden Markov models (HMMs) are a fundamental tool for data analysis and exploration. Many variants of the basic HMM have been developed in response to shortcomings in the original HMM formulation [9]. In this paper we address inference in the explicit state duration HMM (EDHMM). By state duration we mean the amount of time an HMM dwells in a state. In the standard HMM specification a state's duration is implicit and, a priori, geometrically distributed. The EDHMM (or, equivalently, the hidden semi-Markov model [12]) was developed to allow explicit parameterization and direct inference of state duration distributions. EDHMM estimation and inference can be performed using the forward-backward algorithm, but only if the sequence is short or a tight "allowable" duration interval for each state is hard-coded a priori [13]. If the sequence is short, forward-backward can be run on a state representation that allows for all possible durations up to the observed sequence length. If the sequence is long, forward-backward remains computationally tractable only when transitions are restricted to durations that lie within the pre-specified allowable intervals.
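The geometric-duration property mentioned above can be seen directly by simulation. The sketch below (an illustrative assumption, not code from the paper) samples dwell times from a standard HMM with self-transition probability `p_self`: the chain stays in its state with probability `p_self` each step, so the dwell time `D` satisfies `P(D = d) = p_self^(d-1) * (1 - p_self)`, i.e. `D ~ Geometric(1 - p_self)` with mean `1 / (1 - p_self)`.

```python
import random

def sample_dwell_time(p_self, rng):
    """Sample one state-dwell time from a standard HMM:
    remain in the current state with probability p_self,
    leave otherwise. The result is geometrically distributed."""
    d = 1
    while rng.random() < p_self:
        d += 1
    return d

rng = random.Random(0)
p_self = 0.9  # illustrative self-transition probability
samples = [sample_dwell_time(p_self, rng) for _ in range(100_000)]
mean_dwell = sum(samples) / len(samples)
# Theoretical mean duration is 1 / (1 - p_self) = 10;
# the empirical mean should be close to that.
print(round(mean_dwell, 2))
```

An EDHMM replaces this implicit geometric prior with an arbitrary, explicitly parameterized duration distribution per state, which is exactly what makes naive forward-backward expensive: the state space must be augmented with a duration counter.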
