Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees
Neural Information Processing Systems
Variational inequalities in general, and saddle point problems in particular, are increasingly relevant in machine learning applications, including adversarial learning, GANs, transport, and robust optimization. With the increasing data and problem sizes necessary to train high-performing models across various applications, we need to rely on parallel and distributed computing. However, in distributed training, communication among the compute nodes is a key bottleneck, and this problem is exacerbated for high-dimensional and over-parameterized models. For these reasons, it is important to equip existing methods with strategies that reduce the volume of information transmitted during training while obtaining a model of comparable quality.
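A core building block in compressed-communication methods is a (typically randomized, unbiased) compression operator applied to the vectors that compute nodes exchange. As a minimal sketch only, and not the specific scheme of this paper, the standard rand-k sparsifier keeps k randomly chosen coordinates and rescales them by d/k so the compressed message is unbiased:

```python
import numpy as np

def rand_k_compress(x, k, rng):
    """Unbiased rand-k sparsification: keep k random coordinates of x,
    rescale by d/k so that E[C(x)] = x."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)  # k coordinates chosen uniformly
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)                 # rescaling restores unbiasedness
    return out

# Illustration: averaging many independent compressions recovers x.
x = np.arange(1.0, 9.0)  # a message of dimension d = 8
est = np.mean(
    [rand_k_compress(x, 2, np.random.default_rng(s)) for s in range(20000)],
    axis=0,
)
```

Each transmitted message here contains only k nonzero entries instead of d, which is the source of the communication savings; the price is extra variance, which the convergence analysis of such methods must control.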