
Collaborating Authors

 Cong Han Lim


k-Support and Ordered Weighted Sparsity for Overlapping Groups: Hardness and Algorithms

Neural Information Processing Systems

We study the norms obtained from extending the k-support norm and OWL norms to the setting in which there are overlapping groups. The resulting norms are in general NP-hard to compute, but they are tractable for certain collections of groups. To demonstrate this fact, we develop a dynamic program for the problem of projecting onto the set of vectors supported by a fixed number of groups.
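For intuition, below is a minimal Python sketch of the underlying projection problem in the easy special case of disjoint groups, where keeping the k groups with the largest squared norm is optimal. The paper's dynamic program targets certain collections of overlapping groups, which this sketch does not handle; the function name and the toy data are illustrative only.

import numpy as np

def project_k_group_support(x, groups, k):
    # Euclidean projection of x onto {v : supp(v) is covered by at most k groups},
    # assuming the groups are *disjoint*: keep the k groups with the largest
    # squared norm and zero out everything else. With overlapping groups this
    # problem is NP-hard in general; the paper's dynamic program (not shown)
    # handles certain tractable overlapping collections.
    scores = [np.sum(x[np.array(g)] ** 2) for g in groups]
    keep = np.argsort(scores)[-k:]            # indices of the k best-scoring groups
    mask = np.zeros_like(x, dtype=bool)
    for gi in keep:
        mask[np.array(groups[gi])] = True
    return np.where(mask, x, 0.0)

# toy usage: three disjoint groups over a length-6 vector, keep 2 groups
x = np.array([3.0, -1.0, 0.5, 0.2, 2.0, 2.0])
groups = [[0, 1], [2, 3], [4, 5]]
print(project_k_group_support(x, groups, 2))   # -> [ 3. -1.  0.  0.  2.  2.]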


An Efficient Pruning Algorithm for Robust Isotonic Regression

Neural Information Processing Systems

We study a generalization of the classic isotonic regression problem where we allow separable nonconvex objective functions, focusing on the case where the functions are estimators used in robust regression. One can solve this problem to within ɛ-accuracy (of the global minimum) in O(n/ɛ) time using a simple dynamic program, and the complexity of this approach is independent of the underlying functions. We introduce an algorithm that combines techniques from the convex case with branch-and-bound ideas, exploiting the shape of the functions. Our algorithm achieves the best known bounds for both the convex case (O(n log(1/ɛ))) and the general nonconvex case. Experiments show that this algorithm can perform much faster than the dynamic programming approach on robust estimators, especially as the desired accuracy increases.
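For reference, here is a minimal Python sketch of the generic grid dynamic program the abstract alludes to (the baseline, not the paper's pruning algorithm): fitted values are restricted to an ɛ-spaced grid and monotonicity is enforced with a running prefix-minimum, giving O(n · |grid|) time. The loss argument and the Huber loss used in the toy example are hypothetical choices, not taken from the paper.

import numpy as np

def isotonic_regression_dp(y, loss, eps):
    # Grid dynamic program for isotonic regression with a separable
    # (possibly nonconvex) per-point loss. Fitted values are restricted
    # to a grid of spacing eps spanning [min(y), max(y)], which yields an
    # eps-accurate solution in O(n * |grid|) time.
    y = np.asarray(y, dtype=float)
    grid = np.arange(y.min(), y.max() + eps, eps)
    n, m = len(y), len(grid)

    best = np.zeros(m)                      # best[j]: min cost so far with last value <= grid[j]
    argbest = np.zeros((n, m), dtype=int)   # grid index achieving that minimum, for backtracking

    for i in range(n):
        # cost[j]: set point i exactly to grid[j] on top of the best prefix ending <= grid[j]
        cost = np.array([loss(y[i], z) for z in grid]) + best
        # prefix-minimum over the grid enforces z_1 <= z_2 <= ... <= z_n
        jstar = 0
        for j in range(m):
            if cost[j] < cost[jstar]:
                jstar = j
            argbest[i, j] = jstar
            best[j] = cost[jstar]

    # backtrack the grid levels attaining the optimum
    fit = np.empty(n)
    j = m - 1
    for i in range(n - 1, -1, -1):
        j = argbest[i, j]
        fit[i] = grid[j]
    return fit

# toy usage with a (hypothetical) Huber loss, delta = 1
def huber(yi, z, delta=1.0):
    r = abs(yi - z)
    return 0.5 * r * r if r <= delta else delta * (r - 0.5 * delta)

print(isotonic_regression_dp([1.0, 3.0, 2.0, 5.0, 4.0], huber, eps=0.1))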

