multiclass logistic regression



Mixability made efficient: Fast online multiclass logistic regression

Neural Information Processing Systems

Mixability has been shown to be a powerful tool to obtain algorithms with optimal regret. However, the resulting methods often suffer from high computational complexity, which has reduced their practical applicability. For example, in the case of multiclass logistic regression, the aggregating forecaster (Foster et al., 2018) achieves a regret of $O(\log(Bn))$ whereas Online Newton Step achieves $O(e^B\log(n))$, obtaining a double exponential gain in $B$ (a bound on the norm of comparative functions). However, this high statistical performance comes at the price of a prohibitive computational complexity of $O(n^{37})$. In this paper, we use quadratic surrogates to make aggregating forecasters more efficient. We show that the resulting algorithm still has high statistical performance for a large class of losses. In particular, we derive an algorithm for multiclass regression with a regret bounded by $O(B\log(n))$ and a computational complexity of only $O(n^4)$.
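Online Newton Step, the efficient baseline the abstract compares against, can be sketched as follows. This is a minimal illustration for the binary logistic loss, not the authors' algorithm; the function names are ours, and it uses a simple Euclidean projection onto the radius-$B$ ball where the standard ONS analysis projects in the norm induced by the matrix $A_t$.

```python
import numpy as np

def logistic_loss_grad(w, x, y):
    # Gradient of log(1 + exp(-y <w, x>)) with respect to w, for y in {-1, +1}.
    z = y * (w @ x)
    return -y * x / (1.0 + np.exp(z))

def online_newton_step(stream, d, gamma=0.5, eps=1.0, B=1.0):
    """One pass of Online Newton Step over a stream of (x, y) pairs.

    Maintains A_t = eps*I + sum of outer products of past gradients and
    takes a Newton-like step w <- w - (1/gamma) A_t^{-1} g_t.
    """
    w = np.zeros(d)
    A = eps * np.eye(d)
    for x, y in stream:
        g = logistic_loss_grad(w, x, y)
        A += np.outer(g, g)
        w = w - (1.0 / gamma) * np.linalg.solve(A, g)
        # Simplified projection back onto the comparator ball of radius B.
        norm = np.linalg.norm(w)
        if norm > B:
            w *= B / norm
    return w
```

Each iteration costs a $d \times d$ linear solve, which is what makes ONS cheap per round; the price, per the abstract, is the $O(e^B\log(n))$ regret rather than the aggregating forecaster's $O(\log(Bn))$.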


Efficient Methods for Online Multiclass Logistic Regression

Agarwal, Naman, Kale, Satyen, Zimmert, Julian

arXiv.org Machine Learning

Multiclass logistic regression is a fundamental task in machine learning with applications in classification and boosting. Previous work (Foster et al., 2018) has highlighted the importance of improper predictors for achieving "fast rates" in the online multiclass logistic regression problem without suffering exponentially from secondary problem parameters, such as the norm of the predictors in the comparison class. While Foster et al. (2018) introduced a statistically optimal algorithm, it is in practice computationally intractable due to its run-time complexity being a large polynomial in the time horizon and dimension of input feature vectors. In this paper, we develop a new algorithm, FOLKLORE, for the problem which runs significantly faster than the algorithm of Foster et al. (2018) - the running time per iteration scales quadratically in the dimension - at the cost of a linear dependence on the norm of the predictors in the regret bound. This yields the first practical algorithm for online multiclass logistic regression, resolving an open problem of Foster et al. (2018). Furthermore, we show that our algorithm can be applied to online bandit multiclass prediction and online multiclass boosting, yielding more practical algorithms for both problems compared to the ones in (Foster et al., 2018) with similar performance guarantees. Finally, we also provide an online-to-batch conversion result for our algorithm.
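The loss at issue in these works is the multiclass logistic (softmax) log-loss. As a point of reference, a proper linear predictor of the kind these algorithms compete against can be sketched as below; the function name is ours, and improper predictors (as in Foster et al., 2018) are allowed to predict outside this linear-softmax class.

```python
import numpy as np

def multiclass_logistic_loss(W, x, y):
    """Log-loss of a linear softmax predictor.

    W: (K, d) weight matrix, x: (d,) feature vector, y: class index in {0..K-1}.
    Returns -log p(y | x) under p = softmax(W @ x).
    """
    scores = W @ x
    scores = scores - scores.max()  # shift for numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())
    return -log_probs[y]
```

With `W` all zeros the predictor is uniform over the `K` classes, so the loss is `log(K)` regardless of the input.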


Mixability made efficient: Fast online multiclass logistic regression

Jézéquel, Rémi, Gaillard, Pierre, Rudi, Alessandro

arXiv.org Machine Learning

Mixability has been shown to be a powerful tool to obtain algorithms with optimal regret. However, the resulting methods often suffer from high computational complexity which has reduced their practical applicability. For example, in the case of multiclass logistic regression, the aggregating forecaster (Foster et al. (2018)) achieves a regret of $O(\log(Bn))$ whereas Online Newton Step achieves $O(e^B\log(n))$ obtaining a double exponential gain in $B$ (a bound on the norm of comparative functions). However, this high statistical performance is at the price of a prohibitive computational complexity $O(n^{37})$.


A Hidden Variables Approach to Multilabel Logistic Regression

Lee, Jaemoon, Shajari, Hoda

arXiv.org Machine Learning

Multilabel classification is an important problem in a wide range of domains such as text categorization and music annotation. In this paper, we present a probabilistic model, Multilabel Logistic Regression with Hidden variables (MLRH), which extends the standard logistic regression by introducing hidden variables. Hidden variables make it possible to go beyond the conventional multiclass logistic regression by relaxing the one-hot-encoding constraint. We define a new joint distribution of labels and hidden variables which enables us to obtain one classifier for multilabel classification. Our experimental studies on a set of benchmark datasets demonstrate that the probabilistic model can achieve competitive performance compared with other multilabel learning algorithms.
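The MLRH model itself (its hidden variables and joint distribution) is not reproduced in this abstract. As a hedged illustration of the one-hot constraint it relaxes, compare a softmax head, where exactly one class fires and probabilities sum to one, with an independent-sigmoid head, where labels need not be mutually exclusive; both function names are ours.

```python
import numpy as np

def softmax_probs(W, x):
    # Multiclass head: one label per example, probabilities sum to 1.
    s = W @ x
    s = s - s.max()  # shift for numerical stability
    p = np.exp(s)
    return p / p.sum()

def multilabel_probs(W, x):
    # Multilabel head: each label fires independently (sigmoid per row),
    # so the outputs are in (0, 1) but need not sum to 1.
    return 1.0 / (1.0 + np.exp(-(W @ x)))
```

The softmax head encodes the one-hot assumption of conventional multiclass logistic regression; the sigmoid head is the usual baseline relaxation for multilabel problems.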