A Convergent Single-Loop Algorithm for Relaxation of Gromov-Wasserstein in Graph Data
Li, Jiajin, Tang, Jianheng, Kong, Lemin, Liu, Huikang, Li, Jia, So, Anthony Man-Cho, Blanchet, Jose
In this work, we present the Bregman Alternating Projected Gradient (BAPG) method, a single-loop algorithm that computes an approximate solution to the Gromov-Wasserstein (GW) distance. We introduce a novel relaxation technique that balances accuracy and computational efficiency, at the cost of some infeasibility in the coupling map. Our analysis rests on the observation that the GW problem satisfies the Luo-Tseng error bound condition, which bounds the distance from a point to the critical point set of the GW problem by its optimality residual. This observation allows us to derive an approximation bound on the distance between the fixed-point set of BAPG and the critical point set of GW. Moreover, under a mild technical assumption, we show that BAPG converges to its fixed-point set. The effectiveness of BAPG has been validated through comprehensive numerical experiments on graph alignment and partition tasks, where it outperforms existing methods in both solution quality and wall-clock time.

The GW distance provides a flexible way to compare and couple probability distributions supported on different metric spaces. Although it has attracted considerable attention in the machine learning and data science communities, most existing algorithms for computing the GW distance are double-loop algorithms that invoke another iterative algorithm as a subroutine, which makes them less practical. Recently, an entropy-regularized iterative Sinkhorn projection algorithm called eBPG was proposed by Solomon et al. (2016), which has been proven to converge under the Kurdyka-Łojasiewicz framework.
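The abstract does not spell out the BAPG updates, so the following NumPy sketch only illustrates what a single-loop Bregman (KL) alternating projected gradient iteration for the squared-loss GW objective can look like. The gradient formula follows from the standard expansion of the GW objective; the function names (gw_gradient, bapg_style_gw), the step size rho, the initialization, and the exact update order are illustrative assumptions rather than the authors' precise algorithm.

    import numpy as np

    def gw_gradient(pi, D1, D2):
        # Gradient of the squared-loss GW objective
        #   sum_{i,j,k,l} (D1[i,k] - D2[j,l])**2 * pi[i,j] * pi[k,l]
        # obtained by expanding the square and differentiating in pi.
        p = pi.sum(axis=1)                      # row marginals of the coupling
        q = pi.sum(axis=0)                      # column marginals
        const = (D1 ** 2) @ p[:, None] + q[None, :] @ (D2 ** 2)
        return 2.0 * (const - 2.0 * D1 @ pi @ D2)

    def bapg_style_gw(D1, D2, mu, nu, rho=0.1, n_iter=500):
        # Single-loop sketch: each iteration takes a mirror-descent (KL) step on
        # the GW gradient and then rescales rows (resp. columns) so that one
        # marginal constraint holds exactly in closed form.
        pi = np.outer(mu, nu)                   # independent coupling as a start
        for _ in range(n_iter):
            # KL step + projection onto {pi @ 1 = mu}: row rescaling
            pi = pi * np.exp(-rho * gw_gradient(pi, D1, D2))
            pi *= (mu / pi.sum(axis=1))[:, None]
            # KL step + projection onto {pi.T @ 1 = nu}: column rescaling
            pi = pi * np.exp(-rho * gw_gradient(pi, D1, D2))
            pi *= (nu / pi.sum(axis=0))[None, :]
        return pi

Only one marginal constraint is enforced exactly in each half-step, which mirrors the feasibility compromise in the coupling map that the abstract refers to.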
arXiv.org Artificial Intelligence
Mar-12-2023