Swapout: Learning an ensemble of deep architectures
Saurabh Singh, Derek Hoiem, David Forsyth
Neural Information Processing Systems
We describe Swapout, a new stochastic training method that outperforms ResNets of identical network structure, yielding impressive results on CIFAR-10 and CIFAR-100. When viewed as a regularization method, swapout inhibits co-adaptation of units not only within a layer, similar to dropout, but also across network layers. We conjecture that swapout achieves strong regularization by implicitly tying the parameters across layers. When viewed as an ensemble training method, it samples a much richer set of architectures than existing methods such as dropout or stochastic depth. We propose a parameterization that reveals connections to existing architectures and suggests a much richer set of architectures to be explored.
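As a rough illustration of the idea the abstract describes, the swapout rule for a residual unit can be written as y = Θ1 ⊙ x + Θ2 ⊙ F(x), where Θ1 and Θ2 are independent per-unit Bernoulli masks: each unit stochastically keeps the identity path, the residual path, both, or neither. The sketch below is a minimal numpy illustration under that assumption; the function name, signature, and retain probabilities are hypothetical, not the authors' implementation.

```python
import numpy as np

def swapout_layer(x, f_x, p1=0.5, p2=0.5, rng=None):
    # Hypothetical sketch of a swapout unit:
    #   y = theta1 * x + theta2 * f(x),  theta1, theta2 ~ Bernoulli (per unit).
    # Each unit independently keeps x, f(x), both, or neither.
    rng = np.random.default_rng() if rng is None else rng
    theta1 = rng.random(x.shape) < p1  # per-unit gate on the identity path
    theta2 = rng.random(x.shape) < p2  # per-unit gate on the residual path
    return theta1 * x + theta2 * f_x

x = np.ones(4)
f_x = 2 * np.ones(4)       # stand-in for a residual branch F(x)
y = swapout_layer(x, f_x)  # stochastic mix of x and F(x)
```

Special cases of the masks recover familiar architectures: Θ1 = 1, Θ2 = 1 is a plain residual unit, Θ1 = 1 with stochastic Θ2 resembles stochastic depth, and Θ1 = 0 with stochastic Θ2 resembles dropout on the branch output, which is the sense in which swapout samples a richer architecture family.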