Reviews: Fast structure learning with modular regularization

Neural Information Processing Systems 

The manuscript proposes a new objective function for learning Gaussian latent factor models. The objective is based on an information-theoretic characterization of modular latent factor models, for which the objective attains its optimal value. The derivation carefully avoids matrix inversion, improving computational complexity over traditional methods. The authors point out that the proposed model enjoys a 'blessing of dimensionality', in that model performance improves as the dimension of the observed variables increases while the dimension of the latent variables remains constant. This is demonstrated both by simulation and by an information-theoretic lower bound on the sample size.