How to Initialize your Network? Robust Initialization for WeightNorm & ResNets
Devansh Arpit, Víctor Campos, Yoshua Bengio
Neural Information Processing Systems
Residual networks (ResNets) and weight normalization play an important role in various deep learning applications. However, parameter initialization strategies have not been studied previously for weight-normalized networks and, in practice, initialization methods designed for un-normalized networks are used as a proxy. Similarly, initialization for ResNets has also been studied only for un-normalized networks, and often under simplified settings that ignore the shortcut connection. To address these issues, we propose a novel parameter initialization strategy that avoids explosion/vanishing of information across layers for weight-normalized networks with and without residual connections. The proposed strategy is based on a theoretical analysis using a mean-field approximation.
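For background on the technique the abstract targets (this is not the paper's proposed initialization, which the abstract does not specify): weight normalization reparameterizes each weight vector as w = g · v / ‖v‖, decoupling the direction v from the scale g. A minimal NumPy sketch of this reparameterization, with illustrative names:

```python
import numpy as np

def weight_norm_forward(x, v, g):
    """Forward pass of a weight-normalized linear layer.

    v: (fan_out, fan_in) direction parameters
    g: (fan_out,) scale parameters
    The effective weight row i is g[i] * v[i] / ||v[i]||, so each
    row of the effective weight matrix has norm exactly g[i].
    """
    w = g[:, None] * v / np.linalg.norm(v, axis=1, keepdims=True)
    return x @ w.T

rng = np.random.default_rng(0)
fan_in, fan_out = 64, 64

# A common default initialization (illustrative, not the paper's
# scheme): random directions v and unit scales g. The abstract's
# point is that the choice of these initial values governs whether
# signal norms explode or vanish across many layers.
v = rng.normal(size=(fan_out, fan_in))
g = np.ones(fan_out)

x = rng.normal(size=(8, fan_in))
y = weight_norm_forward(x, v, g)
```

Because the scale is carried entirely by g, analyses of signal propagation at initialization can reason about g directly, independently of the magnitude of v.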