HOGWILD!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
Neural Information Processing Systems
Stochastic Gradient Descent (SGD) is a popular algorithm that can achieve state-of-the-art performance on a variety of machine learning tasks. Several researchers have recently proposed schemes to parallelize SGD, but all require performance-destroying memory locking and synchronization. This work aims to show, using novel theoretical analysis, algorithms, and implementation, that SGD can be implemented without any locking.
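As a rough illustration of the lock-free scheme the abstract describes, the sketch below has several threads run SGD steps against a shared weight vector with no locks, so concurrent updates may overwrite one another, which HOGWILD!-style SGD tolerates. This is not the paper's code: the toy least-squares objective, step size, thread count, and coordinate-sampling scheme are all assumptions for illustration.

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    const int dim = 4;
    const int num_threads = 4;
    const int steps = 200000;
    const double lr = 1e-3;

    // Shared weight vector. Relaxed atomics permit lost updates instead of
    // locking, mimicking the unsynchronized writes of lock-free SGD.
    std::vector<std::atomic<double>> w(dim);
    const double target[] = {1.0, -2.0, 0.5, 3.0};  // assumed toy optimum

    auto worker = [&](unsigned seed) {
        for (int t = 0; t < steps; ++t) {
            // Cheap LCG picks a random coordinate (stands in for a sparse sample).
            seed = seed * 1664525u + 1013904223u;
            int j = seed % dim;
            // Gradient of the toy objective 0.5 * (w[j] - target[j])^2.
            double cur = w[j].load(std::memory_order_relaxed);
            double next = cur - lr * (cur - target[j]);
            // Unsynchronized read-modify-write: a concurrent thread may
            // overwrite this step, and no lock is ever taken.
            w[j].store(next, std::memory_order_relaxed);
        }
    };

    std::vector<std::thread> pool;
    for (int i = 0; i < num_threads; ++i) pool.emplace_back(worker, i + 1);
    for (auto& th : pool) th.join();

    // Despite the races, each coordinate lands near its target.
    for (int j = 0; j < dim; ++j)
        std::printf("w[%d] = %.3f\n", j, w[j].load(std::memory_order_relaxed));
    return 0;
}
```

The paper's analysis hinges on sparsity: when each update touches only a few coordinates, collisions between threads are rare, so the occasional lost update barely affects convergence.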