Online Normalization for Training Neural Networks
Vitaliy Chiley, Ilya Sharapov, Atli Kosson, Urs Koster, Ryan Reece, Sofia Samaniego de la Fuente, Vishal Subbiah, Michael James
Online Normalization is a new technique for normalizing the hidden activations of a neural network. Like Batch Normalization, it normalizes the sample dimension. While Online Normalization does not use batches, it is as accurate as Batch Normalization. We resolve a theoretical limitation of Batch Normalization by introducing an unbiased technique for computing the gradient of normalized activations. Online Normalization works with automatic differentiation by adding statistical normalization as a primitive. This technique can be used in cases not covered by some other normalizers, such as recurrent networks, fully connected networks, and networks with activation memory requirements prohibitive for batching. We show its applications to image classification, image segmentation, and language modeling. We present formal proofs and experimental results on ImageNet, CIFAR, and PTB datasets.
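To make the core idea concrete, here is a minimal sketch of batch-free, per-sample normalization using exponentially decaying running estimates of mean and variance. The class name, decay rate `alpha`, and `eps` value are illustrative assumptions, and the unbiased backward pass the paper derives is omitted; this is not the authors' exact algorithm.

```python
import numpy as np

class OnlineNorm1d:
    """Sketch of a forward pass that normalizes one sample at a time
    using running statistics instead of batch statistics.
    (Simplified illustration; the paper's unbiased gradient computation
    for the backward pass is not shown.)"""

    def __init__(self, num_features, alpha=0.999, eps=1e-5):
        self.mu = np.zeros(num_features)   # running mean estimate
        self.var = np.ones(num_features)   # running variance estimate
        self.alpha = alpha                 # decay rate (assumed value)
        self.eps = eps                     # numerical-stability constant

    def forward(self, x):
        # Normalize the incoming sample with the current estimates;
        # no batch of activations needs to be held in memory.
        y = (x - self.mu) / np.sqrt(self.var + self.eps)
        # Update the running statistics after using them.
        self.var = self.alpha * self.var + (1 - self.alpha) * (x - self.mu) ** 2
        self.mu = self.alpha * self.mu + (1 - self.alpha) * x
        return y
```

Because each step touches only a single sample, this style of normalizer fits the settings the abstract mentions: recurrent networks, fully connected networks, and models whose activation memory makes batching prohibitive.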
May 28, 2019