Mode Normalization
Lucas Deecke, Iain Murray, Hakan Bilen
Normalization methods are a central building block in the deep learning toolbox. They accelerate and stabilize training, while decreasing the dependence on manually tuned learning rate schedules. When learning from multi-modal distributions, the effectiveness of batch normalization (BN), arguably the most prominent normalization method, is reduced. As a remedy, we propose a more flexible approach: by extending the normalization to more than a single mean and variance, we detect modes of data on-the-fly, jointly normalizing samples that share common features. We demonstrate that our method outperforms BN and other widely used normalization techniques in several experiments, including single and multi-task datasets.
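A minimal sketch of the idea in PyTorch (the abstract does not specify the implementation; the gating mechanism, parameter names, and layer details below are illustrative assumptions, not the authors' code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModeNorm(nn.Module):
    """Sketch of a mode-normalization layer for 2D feature maps.

    A small gating function softly assigns each sample to one of K modes;
    each mode keeps its own mean and variance, and the normalized outputs
    are recombined with the gating weights. Details are assumptions made
    for illustration only.
    """

    def __init__(self, num_features: int, num_modes: int = 2, eps: float = 1e-5):
        super().__init__()
        self.k = num_modes
        self.eps = eps
        # Gating: average-pool the feature map, then a linear layer + softmax.
        self.gate = nn.Linear(num_features, num_modes)
        # Shared affine parameters, as in standard batch normalization.
        self.weight = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_features, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Soft mode assignment per sample: shape (N, K).
        g = F.softmax(self.gate(x.mean(dim=(2, 3))), dim=1)
        out = torch.zeros_like(x)
        for k in range(self.k):
            gk = g[:, k].view(n, 1, 1, 1)                      # per-sample gate
            denom = gk.sum() * h * w + self.eps
            # Mode-specific statistics, weighted by the gate values.
            mu = (gk * x).sum(dim=(0, 2, 3), keepdim=True) / denom
            var = (gk * (x - mu) ** 2).sum(dim=(0, 2, 3), keepdim=True) / denom
            # Each sample contributes to mode k in proportion to its gate.
            out = out + gk * (x - mu) / torch.sqrt(var + self.eps)
        return self.weight * out + self.bias
```

With `num_modes = 1` this reduces to ordinary batch normalization; larger K lets samples that share common features be normalized together by their own statistics.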
Oct-12-2018