Learning From Non-iid Data: Fast Rates for the One-vs-All Multiclass Plug-in Classifiers
Vu Dinh, Lam Si Tung Ho, Nguyen Viet Cuong, Duy Nguyen, Binh T. Nguyen
We prove new fast learning rates for the one-vs-all multiclass plug-in classifiers trained either from exponentially strongly mixing data or from data generated by a converging drifting distribution. These are two typical scenarios where training data are not iid. The learning rates are obtained under a multiclass version of Tsybakov's margin assumption, a type of low-noise assumption, and do not depend on the number of classes. Our results are general and include a previous result for binary-class plug-in classifiers with iid data as a special case. In contrast to previous works for least squares SVMs under the binary-class setting, our results retain the optimal learning rate in the iid case.
Jan-24-2015
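As an illustration of the one-vs-all plug-in rule studied in the paper, the sketch below fits one "class k vs. rest" estimate of the conditional probability eta_k(x) = P(Y = k | X = x) per class and predicts the class with the largest estimate. This is a minimal sketch only: the base estimator (scikit-learn's LogisticRegression) is a stand-in assumption, and the sketch does not model the paper's non-iid (mixing or drifting) sampling schemes or its rate analysis.

```python
# Minimal sketch of a one-vs-all plug-in classifier (illustrative only; not the
# paper's estimator or analysis). For each class k, a separate binary estimator
# approximates eta_k(x) = P(Y = k | X = x); the plug-in rule predicts argmax_k.
import numpy as np
from sklearn.linear_model import LogisticRegression  # stand-in base estimator


class OneVsAllPlugIn:
    def __init__(self, base_estimator=LogisticRegression):
        self.base_estimator = base_estimator

    def fit(self, X, y):
        y = np.asarray(y)
        self.classes_ = np.unique(y)
        # One "class k vs. rest" probability estimate per class.
        self.estimators_ = [
            self.base_estimator().fit(X, (y == k).astype(int))
            for k in self.classes_
        ]
        return self

    def predict(self, X):
        # Plug-in rule: choose the class whose estimated eta_k(x) is largest.
        scores = np.column_stack(
            [est.predict_proba(X)[:, 1] for est in self.estimators_]
        )
        return self.classes_[np.argmax(scores, axis=1)]
```

Usage would be, e.g., `OneVsAllPlugIn().fit(X_train, y_train).predict(X_test)` on hypothetical training and test arrays; per the abstract, the learning rates obtained for this type of rule do not depend on the number of classes.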
- Country:
  - North America > United States
    - California (0.14)
    - Wisconsin (0.14)
- Genre:
  - Research Report > New Finding (0.55)