

Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

Neural Information Processing Systems

Optimization of Binarized Neural Networks (BNNs) currently relies on real-valued latent weights to accumulate small update steps. In this paper, we argue that these latent weights cannot be treated analogously to weights in real-valued networks. Instead, their main role is to provide inertia during training. We interpret current methods in terms of inertia and provide novel insights into the optimization of BNNs. We subsequently introduce the first optimizer specifically designed for BNNs, the Binary Optimizer (Bop), and demonstrate its performance on CIFAR-10 and ImageNet. Together, the redefinition of latent weights as inertia and the introduction of Bop enable a better understanding of BNN optimization and open the way for further improvements in training methodologies for BNNs.
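The flip rule described in the abstract can be sketched as follows: instead of updating latent real-valued weights, the optimizer accumulates an exponential moving average of gradients and flips a binary weight only when that signal is strong enough and opposes the weight's current sign. This is a minimal illustration, not the authors' implementation; the function name `bop_step` and the hyperparameter values are chosen here for demonstration.

```python
import numpy as np

def bop_step(w, grad, m, gamma=1e-3, tau=1e-6):
    """One Bop-style update on a vector of binary weights.

    w     : array of +/-1 binary weights
    grad  : gradient of the loss w.r.t. w
    m     : exponential moving average of past gradients
    gamma : adaptivity rate of the moving average (illustrative value)
    tau   : flip threshold (illustrative value)
    """
    # Accumulate gradient information rather than latent weight values.
    m = (1 - gamma) * m + gamma * grad
    # Flip a weight only when the accumulated signal exceeds the
    # threshold AND points in the same direction as the current sign
    # (i.e., the gradient says this sign increases the loss).
    flip = (np.abs(m) > tau) & (np.sign(m) == np.sign(w))
    w = np.where(flip, -w, w)
    return w, m
```

The threshold tau provides the "inertia" the paper emphasizes: small or noisy gradient signals leave the binary weights untouched, and only a consistent accumulated signal triggers a flip.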


Reviews: Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

Neural Information Processing Systems

This paper addresses the optimization of BNNs and provides a novel latent-weight-free optimizer, challenging the existing practice of relying on latent weights. This is an interesting and original idea. Specifically, while a common way to view BNN training is to treat the binary weights as an approximation of a real-valued weight vector, this paper argues that the latent weights used in previous methods are not in fact weights at all, and supports this claim by introducing the concept of inertia. Motivated by this new insight, a novel optimizer called Bop is introduced.


Reviews: Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

Neural Information Processing Systems

This paper proposes a new training method for neural networks with binary weights. The main idea is to abandon the existing "latent weights" approach, which treats the weights as continuous, in favor of a new method that relies on the sign of the weights. The proposed approach is based on momentum. Before the rebuttal, the reviewers found the paper to be original, novel, and simpler than existing methods, though they had some concerns regarding the experiments and a few other minor points.



Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization

Helwegen, Koen, Widdicombe, James, Geiger, Lukas, Liu, Zechun, Cheng, Kwang-Ting, Nusselder, Roeland

Neural Information Processing Systems