Lower Bounds and Nearly Optimal Algorithms in Distributed Learning with Communication Compression

Xinmeng Huang, Yiming Chen, Wotao Yin

Neural Information Processing Systems

Recent advances in distributed optimization and learning have shown that communication compression is one of the most effective means of reducing communication. While there have been many results on convergence rates with compressed communication, a lower bound is still missing. Analyses of algorithms with communication compression have identified two abstract properties that guarantee convergence: the unbiased property or the contractive property.


In Northern Scotland, the Neolithic Age Never Ended

The New Yorker

Megalithic monuments in the otherworldly Orkney Islands remain a fundamental part of the landscape. Sheep linger at the Stones of Stenness, the remnants of a ceremonial circle. The Stones of Stenness, a brood of lichen-encrusted megaliths in the far north of the British Isles, could be mistaken for a latter-day work of land art, one with ominous overtones. The stones stand between two lochs on the largest of the Orkney Islands, off the northeastern tip of mainland Scotland. Three colossal planks of sandstone, ranging in height from fifteen feet nine inches to eighteen feet eight inches, rise from the grass, along with a smaller stone that has the bent shape of a boomerang. In contrast to the rectilinear blocks at Stonehenge, the Stenness megaliths are thin slabs with angled upper edges, like upside-down guillotine blades. Remnants of a ceremonial circle, they are placed twenty or more feet apart, creating a chasm of negative space. The monoliths in "2001: A Space Odyssey" inevitably come to mind. Given that the stones were erected five thousand years ago by a culture that left no trace of its belief system, it is unwise to project modern aesthetics onto them. Still, they can be seen only with living eyes. During a recent visit to Orkney, I kept returning to Stenness, at all hours and in all weather. On drizzly days, with skies hanging low, the stones resemble ladders to nowhere. In bright sun, hidden colors emerge: streaks of blue against gray; white and green spatters of lichen; yellowish stains indicating the presence of limonite, an iron ore. Pockmarks and brittle edges show the abrading action of millennia of wind and rain. I watched as tourists approached the stones and hesitantly touched them, as if afraid. When I put my own hands on the rock, I felt no obvious emanations, though I did not feel nothing. One evening, I leaned on a fence as the sun went down, the horizon glowing orange against a cobalt sky.





Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression

He, Yutong, Huang, Xinmeng, Chen, Yiming, Yin, Wotao, Yuan, Kun

arXiv.org Artificial Intelligence

Communication compression is an essential strategy for alleviating communication overhead by reducing the volume of information exchanged between computing nodes in large-scale distributed stochastic optimization. Although numerous algorithms with convergence guarantees have been obtained, the optimal performance limit under communication compression remains unclear. In this paper, we investigate the performance limit of distributed stochastic optimization algorithms employing communication compression. We focus on two main types of compressors, unbiased and contractive, and address the best-possible convergence rates one can obtain with these compressors. We establish the lower bounds for the convergence rates of distributed stochastic optimization in six different settings, combining strongly-convex, generally-convex, or non-convex functions with unbiased or contractive compressor types. To bridge the gap between lower bounds and existing algorithms' rates, we propose NEOLITHIC, a nearly optimal algorithm with compression that achieves the established lower bounds up to logarithmic factors under mild conditions. Extensive experimental results support our theoretical findings. This work provides insights into the theoretical limitations of existing compressors and motivates further research into fundamentally new compressor properties.
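The abstract above contrasts two compressor classes: unbiased compressors, which satisfy E[C(x)] = x, and contractive compressors, which satisfy ||C(x) − x||² ≤ (1 − δ)||x||² for some δ ∈ (0, 1]. As a minimal illustration (these implementations are generic sketches, not code from the paper), rand-k sparsification with rescaling is unbiased, while top-k sparsification is contractive with δ = k/d:

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased random-k sparsifier: keeps k random entries, rescaled by d/k
    so that E[rand_k(x)] = x."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)  # rescaling makes the estimate unbiased
    return out

def top_k(x, k):
    """Contractive top-k compressor: keeps the k largest-magnitude entries.
    Satisfies ||top_k(x) - x||^2 <= (1 - k/d) ||x||^2 deterministically."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out
```

Note the trade-off the paper's settings reflect: rand-k transmits less useful information per round but its error vanishes in expectation, whereas top-k transmits the most informative entries but introduces a systematic (biased) error.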


Lower Bounds and Nearly Optimal Algorithms in Distributed Learning with Communication Compression

Huang, Xinmeng, Chen, Yiming, Yin, Wotao, Yuan, Kun

arXiv.org Artificial Intelligence

Recent advances in distributed optimization and learning have shown that communication compression is one of the most effective means of reducing communication. While there have been many results on convergence rates under communication compression, a theoretical lower bound is still missing. Analyses of algorithms with communication compression have attributed convergence to two abstract properties: the unbiased property or the contractive property. These can be applied with either unidirectional compression (only messages from workers to server are compressed) or bidirectional compression. In this paper, we consider distributed stochastic algorithms for minimizing smooth and non-convex objective functions under communication compression. We establish a convergence lower bound for algorithms that use unbiased or contractive compressors, either unidirectionally or bidirectionally. To close the gap between the lower bound and the existing upper bounds, we further propose an algorithm, NEOLITHIC, which nearly attains our lower bound (up to logarithmic factors) under mild conditions. Our results also show that using contractive bidirectional compression can yield iterative methods that converge as fast as those using unbiased unidirectional compression. The experimental results validate our findings.
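To make the setting concrete: in a typical compressed distributed SGD round, each worker compresses its gradient before sending it to the server, and contractive compressors are usually paired with an error-feedback residual to prevent the compression bias from accumulating. The sketch below illustrates that mechanism in a generic form; the function names and structure are illustrative assumptions, not the NEOLITHIC algorithm itself:

```python
import numpy as np

def compressed_sgd_step(params, grads, compress, lr, errors):
    """One round of worker-to-server compressed SGD with error feedback.

    Each worker adds its residual from the previous round to the fresh
    gradient, compresses the corrected vector, and stores what was not
    transmitted as the new residual. The server averages the compressed
    messages and takes a gradient step.
    """
    compressed = []
    for i, g in enumerate(grads):
        corrected = g + errors[i]      # re-inject last round's untransmitted part
        c = compress(corrected)
        errors[i] = corrected - c      # residual kept locally for next round
        compressed.append(c)
    avg = np.mean(compressed, axis=0)  # server-side aggregation
    return params - lr * avg, errors
```

With the identity map as `compress`, this reduces exactly to plain distributed SGD and all residuals stay zero, which is a quick sanity check on the error-feedback bookkeeping.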


No Bad Apples: Artificial Intelligence Checks Fruit Inside And Out

#artificialintelligence

You're looking for bruises on an apple or squeezing an avocado in your local supermarket, but the chances are it's already been checked – inside and out – by artificial intelligence. New software can analyze every aspect of fruit and vegetables before they reach supermarket shelves. It can determine a product's shelf life, and check for internal rot and pesticide residues. It integrates sensors and advanced optics into 360-degree cameras that see far more than the human eye, and that means a drastic reduction in food loss. As much as a fifth of all fresh produce is lost before it ever reaches the grocery store.