A Study on Binary Neural Networks Initialization

Eyyüb Sari, Mouloud Belbahri, Vahid Partovi Nia

arXiv.org Machine Learning 

Initialization plays a crucial role in training neural models. Binary Neural Networks (BNNs) represent the most extreme form of quantization and often suffer from a drop in accuracy. Most neural network initialization schemes are studied in the full-precision setting, in which the variance of the random initialization decreases with the number of parameters per layer. We show that, contrary to common belief, such popular initialization schemes are meaningless for BNNs. We analyze binary networks analytically and propose to initialize binary weights with the same variance across different layers. We perform experiments to show the accuracy gain of this straightforward heuristic.
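To make the contrast concrete, here is a minimal sketch of a constant-variance initializer, assuming the heuristic is applied to the latent real-valued weights that are later binarized. The function name `init_constant_variance` and the standard deviation value are illustrative choices, not taken from the paper; classical schemes such as Glorot or He instead scale the standard deviation down as a layer's fan-in grows.

```python
import torch.nn as nn


def init_constant_variance(model: nn.Module, std: float = 0.1) -> None:
    """Initialize all linear/conv weights with the same variance,
    independent of layer width (fan-in/fan-out).

    `std` is a hypothetical hyperparameter; the paper's exact
    value is not given in this abstract.
    """
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            # Fixed std for every layer, unlike Glorot/He schemes
            # whose std shrinks as the layer's fan-in grows.
            nn.init.normal_(module.weight, mean=0.0, std=std)
            if module.bias is not None:
                nn.init.zeros_(module.bias)


# Usage: every layer gets the same weight variance regardless of width.
model = nn.Sequential(nn.Linear(784, 2048), nn.ReLU(), nn.Linear(2048, 10))
init_constant_variance(model, std=0.1)
```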
