BinGAN: Learning Compact Binary Descriptors with a Regularized GAN
In this paper, we propose a novel regularization method for Generative Adversarial Networks that allows the model to learn discriminative yet compact binary representations of image patches (image descriptors). We exploit the dimensionality reduction that takes place in the intermediate layers of the discriminator network and train the binarized, low-dimensional representation of the penultimate layer to mimic the distribution of the higher-dimensional preceding layers. To achieve this, we introduce two loss terms that aim at: (i) reducing the correlation between the dimensions of the binarized low-dimensional representation of the penultimate layer and (ii) propagating the relations between the dimensions of the higher-dimensional preceding layers to the low-dimensional binarized representation.
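To make the first loss term concrete, the sketch below shows one common way to penalize correlation between the bit dimensions of a batch of binary codes. This is a minimal NumPy illustration under our own assumptions (the function name and the exact mean-squared off-diagonal penalty are illustrative), not the paper's implementation.

```python
import numpy as np

def decorrelation_loss(b):
    """Penalize correlation between the bit dimensions of binary codes.

    b: (n_samples, n_bits) array with entries in {-1, +1}.
    Returns the mean squared off-diagonal entry of the normalized Gram
    matrix b.T @ b / n_samples, which is 0 when bits are uncorrelated.
    (Illustrative penalty, not the paper's exact loss.)
    """
    n, _ = b.shape
    gram = b.T @ b / n                      # (n_bits, n_bits); diagonal is 1 for +/-1 codes
    off = gram - np.diag(np.diag(gram))     # keep only off-diagonal (cross-bit) terms
    return float(np.mean(off ** 2))

# All four 2-bit patterns: the two bits are perfectly decorrelated.
codes = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)
print(decorrelation_loss(codes))  # 0.0
```

Driving this penalty to zero pushes the bits toward independence, so each of the few available bits carries distinct information.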
Reviews: BinGAN: Learning Compact Binary Descriptors with a Regularized GAN
Summary: This paper proposes a variant of GAN to learn compact binary descriptors for image patch matching. The authors introduce two novel regularizers that propagate Hamming distances between two layers of the discriminator and encourage diversity in the learned descriptors. The presentation is easy to follow, and the method is validated on benchmark datasets. Major concerns: [Motivation and Presentation] First of all, it is not clear why adversarial training helps to learn compact binary descriptors. In addition, the motivation for DMR is also not fully addressed, in my view. In my understanding, the discriminator has two binary representation layers; one has a larger number of bits and the other is used as the compact binary descriptor.
Maciej Zieba, Piotr Semberecki, Tarek El-Gaaly, Tomasz Trzcinski
We evaluate the resulting binary image descriptors on two challenging applications, image matching and retrieval, where they achieve state-of-the-art results. Papers published at the Neural Information Processing Systems Conference.