



Lifting Weak Supervision to Structured Prediction

Neural Information Processing Systems

For labels taking values in a finite metric space, we introduce techniques new to weak supervision based on pseudo-Euclidean embeddings and tensor decompositions, providing a nearly-consistent noise rate estimator.



24389bfe4fe2eba8bf9aa9203a44cdad-Paper.pdf

Neural Information Processing Systems

We also provide provable error bounds for different norms for reconstructing noisy observations. Our empirical validation demonstrates that we obtain better reconstructions when the latent dimension is large.



First order expansion of convex regularized estimators

Neural Information Processing Systems

Such a first order expansion implies that the risk of β̂ is asymptotically the same as the risk of η, which leads to a precise characterization of the MSE of β̂; this characterization takes a particularly simple form for isotropic design. Such a first order expansion also leads to inference results based on β̂. We provide sufficient conditions for the existence of such a first order expansion for three regularizers: the Lasso in its constrained form, the Lasso in its penalized form, and the Group-Lasso. The results apply to general loss functions under some conditions, and those conditions are satisfied for the squared loss in linear regression and for the logistic loss in the logistic model.
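For reference, the penalized Lasso mentioned above is conventionally written as follows (a sketch of the standard formulation; the 1/(2n) scaling and the tuning parameter λ are the usual conventions, not taken from this abstract):

```latex
\hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p}
  \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
  + \lambda \lVert \beta \rVert_1
```

The constrained form instead minimizes the squared loss subject to ‖β‖₁ ≤ R, and the Group-Lasso replaces the ℓ₁ penalty with a sum of ℓ₂ norms over predefined groups of coefficients.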




ac73001b1d44f4925449ce09d9f5d5ca-Paper.pdf

Neural Information Processing Systems

For deterministic feedback, we additionally present a gap-independent algorithm that identifies a Condorcet winning team within O(nk log(k) + k^5) duels.



A Comprehensive Analysis on the Learning Curve in Kernel Ridge Regression

Neural Information Processing Systems

Kernel ridge regression (KRR) is a central tool in machine learning due to its ability to provide a flexible and efficient framework for capturing intricate patterns within data.
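As a quick illustration of the method the abstract refers to, kernel ridge regression fits coefficients α = (K + nλI)⁻¹y on the training Gram matrix K and predicts with the cross-kernel. The following is a minimal NumPy sketch using an RBF kernel; the function name, the bandwidth `gamma`, and the regularization `lam` are illustrative choices, not from the paper:

```python
import numpy as np

def krr_fit_predict(X_train, y_train, X_test, lam=1e-2, gamma=1.0):
    """Kernel ridge regression with an RBF kernel.

    Solves (K + lam * n * I) alpha = y on the training Gram matrix K,
    then predicts via the cross-kernel between test and training points.
    """
    def rbf(A, B):
        # Squared Euclidean distances between all pairs of rows.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    n = X_train.shape[0]
    K = rbf(X_train, X_train)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y_train)
    return rbf(X_test, X_train) @ alpha
```

With a small `lam` the predictor nearly interpolates the training data; larger values smooth the fit, which is the bias–variance trade-off that learning-curve analyses of KRR study.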