Higher-order asymptotics for the parametric complexity
The minimum description length (MDL) principle provides a general information-theoretic approach to model selection and other forms of statistical inference [5, 17]. The MDL criterion for model selection is consistent, meaning that it will select the data-generating model from a countable set of competing parametric models with probability approaching 1 as the sample size n goes to infinity [4]. For example, if each of the parametric models is a logistic regression model with predictor variables taken from a fixed set of potential predictors, then the MDL model-selection criterion will choose the correct combination of predictors with probability approaching 1 as n → ∞. The MDL model-selection criterion also has a number of strong optimality properties, which greatly extend Shannon's noiseless coding theorem [5, §III.E]. In its simplest form, the MDL principle advocates choosing the model for which the observed data has the shortest message length under a particular prefix code defined by a minimax condition [11, §2.4.3]. Shtarkov [19] showed that this is equivalent to choosing the model with the largest normalized maximum likelihood (NML) for the observed data.
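As a concrete illustration of the NML criterion, the following is a minimal Python sketch for the Bernoulli model on binary sequences of length n. The normalizer is the Shtarkov sum over all possible sequences, here reduced to a sum over the sufficient statistic k (the number of ones); the function names are illustrative, not from the paper.

```python
from math import comb

def bernoulli_complexity(n):
    """Shtarkov sum (parametric complexity) for the Bernoulli model:
    C_n = sum_{k=0}^{n} C(n,k) * (k/n)^k * ((n-k)/n)^(n-k).
    Note: in Python 0**0 == 1, which handles the k = 0 and k = n terms."""
    total = 0.0
    for k in range(n + 1):
        p_hat = k / n  # maximum-likelihood estimate for sequences with k ones
        total += comb(n, k) * (p_hat ** k) * ((1 - p_hat) ** (n - k))
    return total

def nml(x):
    """Normalized maximum likelihood of a binary sequence x:
    the maximized likelihood divided by the Shtarkov sum."""
    n = len(x)
    k = sum(x)
    p_hat = k / n
    max_likelihood = (p_hat ** k) * ((1 - p_hat) ** (n - k))
    return max_likelihood / bernoulli_complexity(n)
```

Under the MDL criterion, one would compute the NML of the observed data under each candidate model and select the model with the largest value (equivalently, the shortest code length, -log NML). The parametric complexity log C_n is the quantity whose higher-order asymptotics the paper studies.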