Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies

Neural Information Processing Systems

Deep metric learning plays a key role in various machine learning tasks. Most previous works have been confined to sampling from a mini-batch, which cannot precisely characterize the global geometry of the embedding space. Although researchers have developed proxy- and classification-based methods to tackle the sampling issue, those methods inevitably incur a redundant computational cost. In this paper, we propose a novel Proxy-based deep Graph Metric Learning (ProxyGML) approach from the perspective of graph classification, which uses fewer proxies yet achieves better comprehensive performance. Specifically, multiple global proxies are leveraged to collectively approximate the original data points for each class. To efficiently capture local neighbor relationships, a small number of such proxies are adaptively selected to construct similarity subgraphs between these proxies and each data point. Further, we design a novel reverse label propagation algorithm, by which the neighbor relationships are adjusted according to ground-truth labels, so that a discriminative metric space can be learned during the process of subgraph classification. Extensive experiments carried out on the widely used CUB-200-2011, Cars196, and Stanford Online Products datasets demonstrate the superiority of the proposed ProxyGML over state-of-the-art methods in terms of both effectiveness and efficiency.
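The pipeline described in the abstract — class proxies, a sparse similarity subgraph over the k nearest proxies, and label propagation from proxies back to the sample — can be pictured with a minimal sketch. All function and variable names here are assumptions for illustration, not the paper's implementation, and the normalization at the end is a simplification:

```python
import numpy as np

def proxygml_scores(x, proxies, proxy_labels, n_classes, k):
    """Illustrative forward pass in the spirit of ProxyGML (a sketch,
    not the authors' code).

    x            : (d,) embedded sample
    proxies      : (P, d) learnable proxies, several per class
    proxy_labels : (P,) class index of each proxy
    k            : number of nearest proxies kept in the subgraph
    """
    # Cosine similarity between the sample and every proxy.
    xn = x / np.linalg.norm(x)
    pn = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)
    sim = pn @ xn                              # (P,)

    # Keep only the k most similar proxies: a sparse similarity subgraph.
    mask = np.zeros_like(sim)
    mask[np.argsort(sim)[-k:]] = 1.0
    sim = sim * mask

    # Clamp negative similarities to zero, so only positive edges vote.
    sim = np.maximum(sim, 0.0)

    # "Reverse" label propagation: push proxy labels to the sample
    # through the subgraph edges, yielding per-class scores.
    Y = np.eye(n_classes)[proxy_labels]        # (P, C) one-hot proxy labels
    scores = sim @ Y                           # (C,)
    total = scores.sum()
    return scores / total if total > 0 else scores
```

In training, these scores would be fed to a classification loss so that gradients shape both the embedding network and the proxies; the sketch only shows the score computation for a single sample.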





ce016f59ecc2366a43e1c96a4774d167-Supplemental.pdf

Neural Information Processing Systems

Supplementary Material for "Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies". Figure 1: Recall@1 values of ProxyGML on Cars196 under different combinations of N and r; note that this is only a preliminary experiment. Two special cases are then considered; in one of them, the negative elements in the prediction scores will all be zeros (cf.




ce016f59ecc2366a43e1c96a4774d167-AuthorFeedback.pdf

Neural Information Processing Systems

We thank the reviewers for their valuable comments and their recognition of the novelty and results of our method. We respond to the major comments below, but will address all feedback in the revised version. Proxies are globally learnable "cluster centers", while Clustering [13] directly regards ... There are actually two types of constraints among proxies in our method, including a "soft" constraint that encourages proxies to be close to their anchor samples (...). In practice, similar proxies tend to be sufficiently close to each other in the later training stage. ... (Eq. (5)) proxies for each sample during back-propagation, and we use a small batch size. As future work, we will focus more on addressing datasets with huge inter-class variance.
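The "soft" constraint mentioned in the feedback — encouraging proxies to stay close to their anchor samples — can be sketched as a simple pull term. The squared-distance form, the function name, and the averaging are assumptions for illustration; the exact constraint in ProxyGML may differ:

```python
import numpy as np

def soft_proxy_pull(proxies, proxy_labels, anchor, anchor_label):
    """Mean squared distance between an anchor sample and the proxies of
    its own class; minimizing it pulls those proxies toward the anchor.
    (Illustrative only -- not the paper's exact loss term.)"""
    same = proxy_labels == anchor_label
    if not same.any():
        return 0.0
    diffs = proxies[same] - anchor
    return float((diffs ** 2).sum(axis=1).mean())
```

Because the pull is averaged rather than enforced as a hard equality, proxies of the same class can still spread out to cover multi-modal classes, which matches the feedback's description of a "soft" rather than hard constraint.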

