Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms
Mathieu Blondel, André F. T. Martins, Vlad Niculae
In this paper, we study Fenchel-Young losses, a generic way to construct convex loss functions from a convex regularizer. We provide an in-depth study of their properties in a broad setting and show that they unify many well-known loss functions. When constructed from a generalized entropy, a family that includes well-known entropies such as the Shannon and Tsallis entropies, we show that Fenchel-Young losses induce a predictive probability distribution, and we develop an efficient algorithm to compute that distribution for separable entropies. We derive conditions under which generalized entropies yield a distribution with sparse support and losses with a separation margin. Finally, we present both primal and dual algorithms to learn predictive models with generic Fenchel-Young losses.
May 24, 2018
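To make the construction concrete, the Fenchel-Young loss generated by a regularizer Ω is L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩, where Ω* is the convex conjugate. The sketch below, a minimal illustration with hypothetical function names, instantiates this with Ω equal to the negative Shannon entropy, in which case Ω* is the log-sum-exp function and the loss coincides with the multinomial logistic (cross-entropy) loss:

```python
import numpy as np

def neg_shannon_entropy(p):
    # Ω(p) = Σ_i p_i log p_i (negative Shannon entropy), with 0 log 0 = 0
    p = np.asarray(p, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask])))

def log_sum_exp(theta):
    # Convex conjugate of the negative Shannon entropy: Ω*(θ) = log Σ_i exp(θ_i)
    m = np.max(theta)
    return float(m + np.log(np.sum(np.exp(theta - m))))

def fenchel_young_loss(theta, y):
    # L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩
    return log_sum_exp(theta) + neg_shannon_entropy(y) - float(np.dot(theta, y))

theta = np.array([2.0, 0.5, -1.0])   # scores
y = np.array([1.0, 0.0, 0.0])        # one-hot true label

loss = fenchel_young_loss(theta, y)
# For a one-hot y, this equals the cross-entropy −log softmax(θ)[true class]
cross_entropy = -np.log(np.exp(theta[0]) / np.sum(np.exp(theta)))
```

With other choices of Ω, such as Tsallis entropies, the same construction recovers losses whose induced predictive distributions can have sparse support.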