Algebraic Information Geometry for Learning Machines with Singularities

Neural Information Processing Systems 

Algebraic geometry is essential to learning theory. In hierarchical learning machines such as layered neural networks and Gaussian mixtures, asymptotic normality does not hold, since their Fisher information matrices are singular. In this paper, the rigorous asymptotic form of the stochastic complexity is clarified based on resolution of singularities, and two different problems are studied. The result is useful for model selection, but not for generalization.
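As a sketch of the asymptotic form referred to above: in singular learning theory, resolution of singularities yields a rational number λ and a multiplicity m (these symbols are introduced here for illustration, not taken from the abstract) such that the stochastic complexity F(n) for n training samples behaves as

```latex
F(n) = \lambda \log n - (m - 1) \log \log n + O(1),
```

where λ ≤ d/2 with d the parameter dimension, recovering the regular-model term (d/2) log n as a special case. Because the leading coefficient λ differs across models, this form can rank candidate models (model selection) even when the usual Fisher-information-based criteria do not apply.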