Phase transitions in Restricted Boltzmann Machines with generic priors

Adriano Barra, Giuseppe Genovese, Peter Sollich, Daniele Tantari

arXiv.org Machine Learning 

We present a complete analysis of the replica symmetric phase diagram of Restricted Boltzmann Machines, which can be regarded as Generalised Hopfield models. We underline the role of the retrieval phase for both inference and learning processes, and we show that retrieval is robust for a large class of weight and unit priors, beyond the standard Hopfield scenario. Furthermore, we show how the paramagnetic phase boundary is directly related to the optimal size of the training set necessary for good generalisation in a teacher-student scenario of unsupervised learning.

In recent years, supervised machine learning with neural networks has found renewed interest owing to the practical success of so-called deep networks in solving several difficult problems, ranging from image classification to speech recognition and video segmentation [1]. Despite this remarkable progress, unsupervised learning with neural networks, in which the structure of data is learned without a priori knowledge of a specific task, still lacks a solid theoretical scaffold. Such learning of hidden features of complex data in high-dimensional spaces by fitting a generative probabilistic model is used for de-noising, completion and data generation, but also as a dimensionality-reduction pre-training step in supervised methods [7, 8].
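To make the model class concrete, the following is a minimal sketch of a binary Restricted Boltzmann Machine trained by one-step contrastive divergence (CD-1) on toy data. This is a generic illustration of the unsupervised fitting described above, not the paper's replica analysis; all dimensions, hyper-parameters and the toy "pattern" data are arbitrary assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# An RBM with N visible and P hidden units; the weight matrix plays the
# role of the stored "patterns" in the Generalised Hopfield picture.
n_visible, n_hidden = 16, 8
W = 0.01 * rng.standard_normal((n_visible, n_hidden))  # visible-hidden couplings
b = np.zeros(n_visible)                                # visible biases
c = np.zeros(n_hidden)                                 # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    """Conditional activation and Bernoulli sample of hidden units given visibles."""
    p = sigmoid(v @ W + c)
    return p, (rng.random(p.shape) < p).astype(float)

def sample_visible(h):
    """Conditional activation and Bernoulli sample of visible units given hiddens."""
    p = sigmoid(h @ W.T + b)
    return p, (rng.random(p.shape) < p).astype(float)

def cd1_step(v0, lr=0.1):
    """One CD-1 update: contrast data statistics with one-step reconstructions."""
    global W, b, c
    ph0, h0 = sample_hidden(v0)
    _, v1 = sample_visible(h0)
    ph1, _ = sample_hidden(v1)
    batch = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

# Toy training set: noisy copies of two random binary patterns.
patterns = rng.integers(0, 2, size=(2, n_visible)).astype(float)
data = patterns[rng.integers(0, 2, size=64)]
data = np.abs(data - (rng.random(data.shape) < 0.05))  # 5% bit flips

for _ in range(200):
    cd1_step(data)

# After training, one up-down pass should reconstruct the data well
# (mean squared error clearly below the ~0.25 of an untrained model).
_, h = sample_hidden(data)
pv, _ = sample_visible(h)
recon_err = np.mean((pv - data) ** 2)
```

The CD-1 update only approximates the gradient of the log-likelihood, but it suffices here to show the model extracting the two hidden "features" (patterns) from the noisy samples.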
