RKL: a general, invariant Bayes solution for Neyman-Scott

Brand, Michael

arXiv.org Machine Learning 

Neyman-Scott is a classic example of an estimation problem with a partially-consistent posterior, for which standard estimation methods tend to produce inconsistent results. Past attempts to create consistent estimators for Neyman-Scott have led to ad-hoc solutions, to estimators that do not satisfy representation invariance, to restrictions over the choice of prior, and more. We present a simple construction for a general-purpose Bayes estimator, invariant to representation, which satisfies consistency on Neyman-Scott over any nondegenerate prior. We argue that the good attributes of the estimator are due to its intrinsic properties, and generalise beyond Neyman-Scott as well.

Keywords: Neyman-Scott, consistent estimation, minEKL, Kullback-Leibler, Bayes estimation, invariance

1. Introduction

In [24], Neyman and Scott introduced a problem in consistent estimation that has since been studied extensively in many fields (see [18] for a review).
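As a minimal illustration of the inconsistency at issue (a standard textbook fact, not a construction from this paper): in the Neyman-Scott setup, each of n units contributes two observations with its own nuisance mean but a shared variance σ², and the joint maximum-likelihood estimate of σ² converges to σ²/2 rather than σ² as n grows. The simulation below sketches this under assumed parameter choices.

```python
import numpy as np

# Neyman-Scott setup: n units, each with its own nuisance mean mu_i,
# two observations per unit, and a common variance sigma^2.
rng = np.random.default_rng(0)
n = 100_000          # number of units (assumed, for illustration)
sigma2 = 4.0         # true common variance (assumed)
mu = rng.normal(0.0, 10.0, size=n)                      # nuisance means
x = rng.normal(mu[:, None], np.sqrt(sigma2), size=(n, 2))

# Joint MLE: plug in each unit's sample mean for mu_i, then average
# the squared residuals over all 2n observations.
xbar = x.mean(axis=1, keepdims=True)
sigma2_mle = np.mean((x - xbar) ** 2)

# As n grows, sigma2_mle concentrates near sigma2 / 2 = 2.0, not 4.0,
# because each unit's mean is estimated from only two points.
print(sigma2_mle)
```

The fixed number of observations per nuisance parameter is what prevents the usual asymptotics from rescuing the MLE, which is the failure mode the estimator discussed in this paper is built to avoid.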
