Efficient Second Order Online Learning by Sketching

Neural Information Processing Systems

We propose Sketched Online Newton (SON), an online second order learning algorithm that enjoys substantially improved regret guarantees for ill-conditioned data. SON is an enhanced version of the Online Newton Step which, via sketching techniques, enjoys a running time linear in the dimension and the sketch size. We further develop sparse forms of the sketching methods (such as Oja's rule), making the computation linear in the sparsity of the features. Together, these improvements eliminate all computational obstacles in previous second order online learning approaches.
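The Oja's-rule sketching the abstract mentions can be illustrated with a minimal sketch (not the paper's actual SON implementation; the function name `oja_update` and all parameters here are hypothetical): a d x m matrix of orthonormal columns is nudged toward the top eigenspace of the streaming covariance at O(dm) cost per example.

```python
import numpy as np

def oja_update(V, x, lr):
    """One Oja's-rule step: move the sketch V (d x m, orthonormal columns)
    toward the top-m eigenspace of the running covariance, then
    re-orthonormalize. Cost per step is O(d*m) plus a QR on a d x m matrix."""
    V = V + lr * np.outer(x, x @ V)   # stochastic ascent on the Rayleigh quotient
    Q, _ = np.linalg.qr(V)            # keep the columns orthonormal
    return Q

rng = np.random.default_rng(0)
d, m, T = 20, 2, 2000
# Synthetic stream whose covariance has two dominant directions.
basis = np.linalg.qr(rng.normal(size=(d, d)))[0]
scales = np.array([5.0, 3.0] + [0.1] * (d - 2))
V = np.linalg.qr(rng.normal(size=(d, m)))[0]
for t in range(T):
    x = basis @ (scales * rng.normal(size=d))
    V = oja_update(V, x, lr=1.0 / (t + 10))

# Columns of V should now span the top-2 eigenspace (basis[:, :2]);
# the overlap below approaches sqrt(2) when the subspaces align.
overlap = np.linalg.norm(basis[:, :2].T @ V)
```

The decaying step size 1/(t + 10) is an arbitrary illustrative choice; SON additionally uses the sketch inside a Newton-style update rather than for subspace tracking alone.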


Reviews: Efficient Second Order Online Learning by Sketching

Neural Information Processing Systems

The present work takes a significant step toward addressing this. The primary contributions of the paper are variations of the Online Newton Step that remove this drawback using a sketching approximation to the scaling matrix and a clever implementation of sparse updates. The primary theoretical contributions are the analyses of the RP and FD versions of the algorithm. For RP they show a regret bound that holds when the matrix G_T (the matrix of observed gradients) is actually low-rank. Given the assumed structure of the loss functions, f_t(w) = \ell(\langle w, x_t \rangle), gradients will always be in the direction of the examples x_t, and so I think this theorem only holds when the data is actually low-rank.
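The reviewer's rank observation can be made explicit with a one-line derivation in the same notation (assuming the loss form stated above):

```latex
% With losses of the assumed form f_t(w) = \ell(\langle w, x_t \rangle),
% the chain rule gives
\nabla f_t(w) = \ell'(\langle w, x_t \rangle)\, x_t,
% so each gradient is a scalar multiple of its example, and hence
\operatorname{rank}(G_T) \le \operatorname{rank}\big([x_1, \dots, x_T]\big).
```

So G_T is low-rank exactly when the data matrix is, which is the reviewer's point about the scope of the RP regret bound.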



Efficient Second Order Online Learning by Sketching

Luo, Haipeng, Agarwal, Alekh, Cesa-Bianchi, Nicolò, Langford, John

Neural Information Processing Systems


