Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
Neural Information Processing Systems
Stochastic Gradient Descent (SGD) is a popular algorithm that can achieve state-of-the-art performance on a variety of machine learning tasks. Several researchers have recently proposed schemes to parallelize SGD, but all require performance-destroying memory locking and synchronization. This work aims to show, using novel theoretical analysis, algorithms, and implementation, that SGD can be implemented without any locking.
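A minimal sketch of the lock-free idea: multiple threads read and write a shared weight vector with no locks or synchronization, each applying plain SGD updates on its slice of the data. The toy least-squares problem, learning rate, and thread count below are illustrative assumptions, not details from the paper.

```python
import random
import threading

# Illustrative assumption: synthetic 1-D regression data, y ≈ 3*x + noise.
random.seed(0)
data = [(x, 3.0 * x + random.gauss(0, 0.1))
        for x in (random.uniform(-1, 1) for _ in range(2000))]

w = [0.0]   # shared parameter vector; all threads update it with NO lock
lr = 0.01   # illustrative step size

def worker(samples):
    # Each thread runs vanilla SGD on its own samples, writing directly
    # to the shared weights without any synchronization.
    for x, y in samples:
        grad = 2.0 * (w[0] * x - y) * x   # gradient of (w*x - y)^2
        w[0] -= lr * grad                  # unsynchronized, racy update

# Four threads, each taking an interleaved quarter of the data.
threads = [threading.Thread(target=worker, args=(data[i::4],))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(w[0])  # approaches the true slope 3.0 despite the races
```

Occasional overwritten updates are exactly the races the paper's analysis addresses: when updates are sparse, the lock-free iterates still converge at nearly the serial rate.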