Online normalizer calculation for softmax
Maxim Milakov, Natalia Gimelshein
arXiv.org Artificial Intelligence
The Softmax function is ubiquitous in machine learning, and multiple previous works have suggested faster alternatives for it. In this paper we propose a way to compute classical Softmax with fewer memory accesses and hypothesize that this reduction in memory accesses should improve Softmax performance on actual hardware. The benchmarks confirm this hypothesis: Softmax accelerates by up to 1.3x, and Softmax combined with TopK by up to 5x.
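The abstract does not spell out the algorithm, but the "online normalizer" idea in the title can be sketched as follows: keep a running maximum and rescale the partial normalizer whenever the maximum grows, so the numerically safe (max-subtracted) softmax needs a single read pass to build both statistics instead of one pass for the max and another for the sum. A minimal Python sketch under that assumption (names are illustrative, not from the paper):

```python
import math

def online_softmax(x):
    """Softmax over a sequence of floats, building the running maximum
    and normalizer in one pass, then writing outputs in a second pass.
    A sketch of the single-pass normalizer idea; not the paper's code."""
    m = float("-inf")  # running maximum
    d = 0.0            # running normalizer: sum of exp(x_j - m)
    for v in x:
        m_new = max(m, v)
        # When the maximum changes, rescale the accumulated sum to the
        # new reference point, then fold in the current element.
        d = d * math.exp(m - m_new) + math.exp(v - m_new)
        m = m_new
    return [math.exp(v - m) / d for v in x]

print(online_softmax([1.0, 2.0, 3.0]))  # ~[0.090, 0.245, 0.665]
```

The payoff is in memory traffic: a naive safe softmax reads the input once for the max, once for the normalizer, and once to produce outputs, while the fused recurrence above drops one of those read passes, which is plausibly the source of the reported speedup on bandwidth-bound hardware.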
May 8, 2018