The TAP free energy for high-dimensional linear regression

Qiu, Jiaze, Sen, Subhabrata

arXiv.org Machine Learning

The analysis of high-dimensional probability distributions is a central challenge in modern Statistics and Machine Learning. This is particularly true in the context of Bayesian Statistics, where scientists carry out inference based on the posterior distribution. In modern applications, the posterior distribution is typically high-dimensional and analytically intractable. Variational Inference (VI) has emerged as an attractive option to approximate these intractable distributions, facilitating fast, parallel computations in state-of-the-art applications [32, 10]. In this approach, the distribution of interest is approximated (in KL divergence) by distributions from a pre-specified, more tractable collection. The simplest version of VI is the Naive Mean-field approximation (NMF), where the distribution of interest is approximated by a product distribution.
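As a toy illustration of the NMF idea described above (not of the paper's TAP free energy analysis), consider approximating a 2-D Gaussian target by a product distribution. For a Gaussian target N(mu, Sigma), the product measure minimizing KL(q || p) has the true coordinate means and variances 1 / Lambda_ii, where Lambda = Sigma^{-1}; the specific numbers below are invented for the example.

```python
import numpy as np

# Target: a 2-D Gaussian with correlated coordinates (values are illustrative).
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

# Naive Mean-field (NMF) approximation q(x) = q1(x1) q2(x2) minimizing KL(q || p):
# each factor is Gaussian with the true mean and variance 1 / Lambda_ii,
# where Lambda is the precision matrix Sigma^{-1}.
Lambda = np.linalg.inv(Sigma)
q_means = mu.copy()            # NMF means coincide with the target means
q_vars = 1.0 / np.diag(Lambda) # NMF variances come from the precision diagonal

# A well-known consequence: NMF underestimates the true marginal variances.
assert np.all(q_vars <= np.diag(Sigma))
```

The final assertion checks the classic failure mode of the NMF approximation: because it ignores correlations, its marginal variances (here 1.36 and 0.68) are smaller than the target's (2.0 and 1.0).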


Dual IV: A Single Stage Instrumental Variable Regression

Muandet, Krikamol, Mehrjou, Arash, Lee, Si Kai, Raj, Anant

arXiv.org Machine Learning

We present a novel single-stage procedure for instrumental variable (IV) regression called DualIV, which simplifies traditional two-stage regression via a dual formulation. We show that the common two-stage procedure can alternatively be solved via generalized least squares. Our formulation circumvents the first-stage regression, which can be a bottleneck in modern two-stage procedures for IV regression. We also show that our framework is closely related to the generalized method of moments (GMM) under specific assumptions. This highlights the fundamental connection between GMM and two-stage procedures in the IV literature. Using the proposed framework, we develop a simple kernel-based algorithm with consistency guarantees. Lastly, we give empirical results illustrating the advantages of our method over existing two-stage algorithms.
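For context, here is a minimal sketch of the classical two-stage least squares (2SLS) procedure that DualIV aims to bypass, on a scalar linear model; the data-generating process and coefficients are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                        # instrument: affects x, not y directly
u = rng.normal(size=n)                        # unobserved confounder
x = 0.9 * z + u + 0.1 * rng.normal(size=n)    # endogenous regressor
y = 2.0 * x + u + 0.1 * rng.normal(size=n)    # outcome; true causal effect is 2.0

# Naive OLS is biased upward because x is correlated with the confounder u.
ols = (x @ y) / (x @ x)

# Stage 1: regress x on the instrument z to get fitted values x_hat.
x_hat = z * ((z @ x) / (z @ z))
# Stage 2: regress y on x_hat; this recovers the causal coefficient.
tsls = (x_hat @ y) / (x_hat @ x_hat)
```

The first-stage regression producing `x_hat` is exactly the step the abstract identifies as a bottleneck at scale, which motivates single-stage formulations such as DualIV.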