
Computational Learning Theory




Derandomizing Multi-Distribution Learning

Neural Information Processing Systems

Multi-distribution learning, also known as collaborative learning, is the problem of learning a single predictor that performs well across multiple data distributions, using samples from each distribution during training.
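The setup can be illustrated with a minimal sketch (my own toy illustration, not the paper's algorithm): given sample sets from several distributions, pick the single predictor whose worst-case error over all of them is smallest. Here the predictor class is hypothetical simple thresholds on one feature.

```python
# Toy multi-distribution learning: one predictor, worst-case error criterion.

def error(threshold, samples):
    # samples: list of (x, label) pairs; predict label 1 iff x >= threshold.
    wrong = sum(1 for x, y in samples if (x >= threshold) != (y == 1))
    return wrong / len(samples)

def multi_dist_learn(sample_sets, candidate_thresholds):
    # Choose the threshold whose WORST error over all sample sets is smallest,
    # rather than the average-case best on any single one.
    return min(candidate_thresholds,
               key=lambda t: max(error(t, s) for s in sample_sets))

# Two toy "distributions" (sample sets) with slightly shifted decision regions.
d1 = [(0.1, 0), (0.2, 0), (0.6, 1), (0.9, 1)]
d2 = [(0.3, 0), (0.4, 0), (0.7, 1), (0.8, 1)]
best = multi_dist_learn([d1, d2], [i / 10 for i in range(11)])
worst_err = max(error(best, s) for s in [d1, d2])
```

A threshold tuned only to `d1` could sit anywhere in (0.2, 0.6]; the worst-case criterion forces it into the narrower region that also separates `d2`.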



Neural Information Processing Systems

Can a physicist make only a finite number of errors in the eternal quest to uncover the laws of nature? This millennia-old philosophical problem, known as inductive inference, lies at the heart of epistemology.





The Bayesian Stability Zoo

Neural Information Processing Systems

Algorithmic stability is a major theme in learning theory, where seminal results have firmly established its close relationship with generalization. Recent research has further highlighted the intricate interplay between stability and additional properties of interest beyond statistical generalization.
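The notion can be made concrete with a minimal sketch (illustrative only, not drawn from the paper): an algorithm is replace-one stable if swapping a single training point changes its output only slightly. The empirical mean on data in [0, 1] is an assumed simple example, with sensitivity at most 1/n.

```python
# Replace-one stability of the empirical mean on n points in [0, 1].

def mean(xs):
    return sum(xs) / len(xs)

def replace_one_sensitivity(xs, replacement):
    # Largest change in the output when any single point is swapped out
    # for `replacement`.
    base = mean(xs)
    return max(abs(mean(xs[:i] + [replacement] + xs[i + 1:]) - base)
               for i in range(len(xs)))

n = 100
data = [i / n for i in range(n)]           # points in [0, 1)
sens = replace_one_sensitivity(data, 1.0)  # worst-case swap-in value
# The mean moves by at most (1 - min(data)) / n, so stability
# improves as the sample size n grows.
```

Stability of this kind is what links an algorithm's output to generalization: a predictor that barely depends on any one training point cannot overfit to it.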