Review for NeurIPS paper: A Decentralized Parallel Algorithm for Training Generative Adversarial Nets
–Neural Information Processing Systems
The paper presents a decentralized version of the Parallel Optimistic Stochastic Gradient algorithm. A non-asymptotic convergence theorem is given, and the algorithm applies to generic smooth min-max optimization problems, including GAN training. Concerns remained about whether the theory recovers the single-machine case, and a more precise discussion was needed of the requirements on the communication graphs (spectral gap versus maximum degree) and of the restricted-communication setting. In their response, the authors confirmed that the communication complexity is not logarithmic for general graphs.
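For context, the single-machine building block of the reviewed algorithm is the optimistic stochastic gradient step for smooth min-max problems. A minimal sketch of the deterministic optimistic update on the toy bilinear problem min_x max_y f(x, y) = xy (the step size, iteration count, and initial point below are illustrative choices, not values from the paper):

```python
# Optimistic gradient descent-ascent (OGDA) sketch on f(x, y) = x * y.
# The "optimistic" update extrapolates with 2*g_t - g_{t-1}, which
# converges on this bilinear game where plain simultaneous
# gradient descent-ascent diverges.

def ogda(eta=0.1, steps=2000):
    x, y = 1.0, 1.0          # illustrative starting point
    gx_prev, gy_prev = y, x  # previous gradients (init with current ones)
    for _ in range(steps):
        gx, gy = y, x        # grad_x f = y, grad_y f = x
        # Simultaneous optimistic update: descent in x, ascent in y.
        x -= eta * (2 * gx - gx_prev)
        y += eta * (2 * gy - gy_prev)
        gx_prev, gy_prev = gx, gy
    return x, y

x, y = ogda()
print(x, y)  # both iterates approach the saddle point (0, 0)
```

The decentralized version studied in the paper additionally averages iterates over a communication graph, which is where the spectral-gap and degree requirements discussed above enter.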
Jan-26-2025, 03:16:22 GMT