Self-Normalizing Neural Networks
Klambauer, Günter, Unterthiner, Thomas, Mayr, Andreas, Hochreiter, Sepp
Neural Information Processing Systems
Deep Learning has revolutionized vision via convolutional neural networks (CNNs) and natural language processing via recurrent neural networks (RNNs). However, success stories of Deep Learning with standard feed-forward neural networks (FNNs) are rare. FNNs that perform well are typically shallow and therefore cannot exploit many levels of abstract representations. We introduce self-normalizing neural networks (SNNs) to enable high-level abstract representations. While batch normalization requires explicit normalization, neuron activations of SNNs automatically converge towards zero mean and unit variance.
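The self-normalizing property described above can be illustrated with a small numerical sketch. The paper's SELU activation uses fixed constants λ ≈ 1.0507 and α ≈ 1.6733; the simulation below (layer width, depth, and initialization scheme are illustrative choices, not taken from this abstract) passes unit-variance inputs through a stack of randomly initialized dense layers and checks that activation statistics stay near zero mean and unit variance:

```python
import numpy as np

# SELU constants from the SNN paper
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    # scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

rng = np.random.default_rng(0)
width = 256
x = rng.normal(0.0, 1.0, size=(10_000, width))  # zero-mean, unit-variance input

for _ in range(20):  # 20 hypothetical dense layers
    # Weights with mean 0 and variance 1/fan_in, as the fixed-point analysis assumes
    w = rng.normal(0.0, np.sqrt(1.0 / width), size=(width, width))
    x = selu(x @ w)

print(f"mean={x.mean():.3f}, var={x.var():.3f}")  # both stay close to 0 and 1
```

Without the SELU nonlinearity (e.g. with ReLU in its place), the mean and variance of the activations drift with depth; the fixed point at (0, 1) is what lets SNNs dispense with explicit normalization layers.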