Export Reviews, Discussions, Author Feedback and Meta-Reviews
–Neural Information Processing Systems
NIPS Neural Information Processing Systems, 8-11 December 2014, Montreal, Canada

Paper ID: 24
Title: Communication Efficient Distributed Machine Learning with the Parameter Server

Current Reviews

First provide a summary of the paper, and then address the following criteria: quality, clarity, originality, and significance.

This paper presents improvements to a system for large-scale learning known as the parameter server. The parameter server is designed to perform reliable distributed machine learning in large-scale industrial systems (thousands of nodes). The architecture is based on a bipartite graph composed of servers and workers. Workers compute gradients on subsets of the training instances, while servers aggregate the workers' gradients, update the shared parameter vector, and redistribute it to the workers for the next iteration.
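The worker/server iteration the summary describes can be sketched in a minimal single-process simulation. This is not the paper's implementation; the least-squares model, the function names, and the synchronous averaging step are illustrative assumptions, chosen only to show the data flow: workers compute gradients on their shards, the server aggregates them and updates the shared parameter vector, and the updated vector is redistributed for the next iteration.

```python
import numpy as np

def worker_gradient(w, X_shard, y_shard):
    """Worker step: least-squares gradient on this worker's data shard
    (illustrative model; the real system is model-agnostic)."""
    return X_shard.T @ (X_shard @ w - y_shard) / len(y_shard)

def server_update(w, grads, lr=0.1):
    """Server step: aggregate worker gradients and update the shared
    parameter vector, which is then redistributed to all workers."""
    return w - lr * np.mean(grads, axis=0)

# Synthetic problem: recover w_true from noiseless linear measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true

w = np.zeros(5)
shards = np.array_split(np.arange(100), 4)  # four simulated workers
for _ in range(300):
    # Each worker computes a gradient on its shard of the training data.
    grads = [worker_gradient(w, X[idx], y[idx]) for idx in shards]
    # The server aggregates and updates; w is "redistributed" implicitly
    # here because all workers share the process's memory.
    w = server_update(w, grads)

print(np.allclose(w, w_true, atol=1e-2))
```

In the actual distributed setting the server side is itself sharded across many server nodes, and communication rather than shared memory carries the gradients and parameters; this sketch collapses all of that into one process to show the iteration structure only.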