Reviews: Variational PDEs for Acceleration on Manifolds and Application to Diffeomorphisms
Neural Information Processing Systems
Summary: The main contribution of this paper is the derivation of an "accelerated" gradient descent scheme for computing stationary points of a potential function on the space of diffeomorphisms, inspired by the variational formulation of Nesterov's accelerated gradient methods [1]. The authors first derive a continuous-time, continuous-space analogue of the Bregman Lagrangian [1] for diffeomorphisms, then discretize the resulting variational PDEs to solve image registration problems, empirically showing faster and better convergence than gradient descent.

Pros: The paper is well written. To the best of my knowledge, the proposed scheme of solving diffeomorphic registration by discretizing a variational formulation similar to [1] is a novel contribution. The authors also provide strong empirical support for the proposed method versus gradient descent.
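For readers unfamiliar with the acceleration the review refers to, here is a minimal sketch of Nesterov's accelerated gradient method against plain gradient descent on a toy ill-conditioned quadratic potential. The 2-D potential, step size, and iteration count are illustrative assumptions for this sketch, not details from the paper under review:

```python
import numpy as np

# Toy potential U(x) = 0.5 * x^T A x (illustrative, not from the paper)
A = np.diag([1.0, 100.0])        # ill-conditioned quadratic
x0 = np.array([1.0, 1.0])
step = 1.0 / 100.0               # 1 / L, with L the largest eigenvalue of A

def grad(x):
    # Gradient of U at x
    return A @ x

def gd(x, iters):
    # Plain gradient descent
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def nesterov(x, iters):
    # Nesterov's accelerated gradient with the (k-1)/(k+2) momentum
    # schedule used in the variational/ODE analysis of [1]
    y, x_prev = x.copy(), x.copy()
    for k in range(1, iters + 1):
        x_next = y - step * grad(y)
        y = x_next + (k - 1) / (k + 2) * (x_next - x_prev)
        x_prev = x_next
    return x_prev

print("GD residual: ", np.linalg.norm(gd(x0, 200)))
print("NAG residual:", np.linalg.norm(nesterov(x0, 200)))
```

On the slow eigendirection (eigenvalue 1 with step 1/100), gradient descent contracts only by a factor 0.99 per iteration, while the momentum term lets the accelerated iterates reach a much smaller residual in the same 200 steps; this mirrors, in finite dimensions, the speedup the paper demonstrates for registration.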
Oct-7-2024, 11:48:20 GMT