MixLasso: Generalized Mixed Regression via Convex Atomic-Norm Regularization
We consider a generalization of mixed regression where the response is an additive combination of several mixture components. Standard mixed regression is a special case in which each response is generated from exactly one component. Typical approaches to the mixed regression problem employ local search methods such as Expectation Maximization (EM) that are prone to spurious local optima. On the other hand, a number of recent theoretically motivated \emph{tensor-based methods} either have high sample complexity or require knowledge of the input distribution, which is unavailable in most practical situations. In this work, we study a novel convex estimator, \emph{MixLasso}, for the estimation of generalized mixed regression, based on an atomic norm specifically constructed to regularize the number of mixture components. Our estimator admits a risk bound that trades off between prediction accuracy and model sparsity without imposing stringent assumptions on the input/output distribution, and it can easily be adapted to non-linear functions. In numerical experiments on mixtures of linear as well as non-linear regressions, the proposed method yields high-quality solutions in a wider range of settings than existing approaches.
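The generative model described in the abstract can be made concrete with a small simulation. The sketch below contrasts the generalized setting (each response sums a subset of components) with standard mixed regression (exactly one component per response), for linear components. All names and parameter choices here (K, the weight matrix W, the binary indicators Z) are illustrative assumptions, not the paper's notation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, K = 200, 5, 3            # samples, input dimension, components

W = rng.normal(size=(K, d))    # hypothetical linear components w_k
X = rng.normal(size=(n, d))

# Generalized mixed regression: each response is an additive
# combination of components, selected by binary indicators Z.
Z = rng.integers(0, 2, size=(n, K))
y_general = np.einsum('nk,kd,nd->n', Z, W, X)

# Standard mixed regression as a special case: exactly one
# component generates each response.
z = rng.integers(0, K, size=n)
y_standard = np.einsum('nd,nd->n', W[z], X)

print(y_general.shape, y_standard.shape)  # both (200,)
```

The only difference between the two settings is the indicator structure: a general 0/1 vector per sample versus a one-hot assignment.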
Reviews: MixLasso: Generalized Mixed Regression via Convex Atomic-Norm Regularization
This paper considers a generalized version of the mixed regression problem, in which we observe a collection of input-output samples whose outputs correspond to an additive combination of several mixture components/functions, and the goal is to find a collection of K functions that minimise the risk. The corresponding ERM problem is NP-hard to solve due to combinatorial constraints. The authors propose to relax these constraints by replacing them with an atomic-norm regularizer that they introduce as an "approximation" of the number of components. They solve the resulting convex problem using a greedy algorithm. Their analysis shows that the solutions obtained by their approach achieve an epsilon-optimal risk using a linear number of samples (in both K and the dimension D) and O(K/epsilon) components, thus improving over the state of the art in terms of sample complexity.
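The greedy strategy the reviewer refers to can be pictured, very schematically, as a loop that adds one atom (candidate component) per round and refits over the active set. The sketch below uses a generic matching-pursuit-style selection over an assumed finite dictionary of unit-norm atoms; it is an illustration of greedy atom selection in general, not the paper's actual algorithm or its atomic norm:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k_true = 50, 20, 3

# Hypothetical dictionary of unit-norm atoms (columns) and a signal
# built from a small number of them.
D = rng.normal(size=(n, d))
D /= np.linalg.norm(D, axis=0)
support = rng.choice(d, k_true, replace=False)
y = D[:, support] @ rng.normal(size=k_true)

# Greedy atom selection: each round picks the atom most correlated
# with the current residual, then refits coefficients over the
# active set (as in orthogonal matching pursuit).
active, r = [], y.copy()
for _ in range(k_true):
    j = int(np.argmax(np.abs(D.T @ r)))
    active.append(j)
    w, *_ = np.linalg.lstsq(D[:, active], y, rcond=None)
    r = y - D[:, active] @ w

print(len(active), float(np.linalg.norm(r)))
```

The appeal of this style of algorithm, and presumably of the greedy solver discussed in the review, is that the number of atoms grows one per iteration, so sparsity is controlled directly by the iteration count.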
Yen, Ian En-Hsu, Lee, Wei-Cheng, Zhong, Kai, Chang, Sung-En, Ravikumar, Pradeep K., Lin, Shou-De