AI and Compute

#artificialintelligence

We're releasing an analysis showing that since 2012, the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.5-month doubling time (by comparison, Moore's Law had an 18-month doubling period). Since 2012, this metric has grown by more than 300,000x (an 18-month doubling period would yield only a 12x increase). Improvements in compute have been a key component of AI progress, so as long as this trend continues, it's worth preparing for the implications of systems far outside today's capabilities. The chart shows the total amount of compute, in petaflop/s-days, that was used to train selected results that are relatively well known, used a lot of compute for their time, and gave enough information to estimate the compute used. A petaflop/s-day (pfs-day) consists of performing $10^{15}$ neural net operations per second for one day, or a total of about $10^{20}$ operations.
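
A quick sanity check on the unit and the headline number (my arithmetic, not part of the original analysis): $1~\text{pfs-day} = 10^{15}~\text{ops/s} \times 86{,}400~\text{s} \approx 8.6 \times 10^{19} \approx 10^{20}$ operations, and $\log_2(300{,}000) \approx 18.2$ doublings at 3.5 months each comes to roughly 64 months, or about 5.3 years, consistent with the 2012-to-present window the analysis covers.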


How the end of Moore's Law will usher in a new era in computing

#artificialintelligence

In 1965 Gordon Moore, the co-founder of Intel, predicted that the number of components that could fit on a microchip would double every year for the next decade. Moore revised his prediction in 1975 to a doubling of components every two years – a prophecy that remained true for another four decades. The ramifications of what is now known as "Moore's Law" for the world of technology and, by extension, society itself have proven immeasurable. The doubling of transistors – the semiconductor devices that switch and amplify electronic signals – meant that technology would become exponentially more powerful, smaller and cheaper. The fact that the smartphone in your pocket is now many thousands of times more...
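
To put that compounding in perspective (my arithmetic, not from the article): a two-year doubling sustained over four decades is $2^{20} \approx 10^6$, roughly a million-fold increase in component counts.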


What Doubling Tricks Can and Can't Do for Multi-Armed Bandits

arXiv.org Machine Learning

An online reinforcement learning algorithm is anytime if it does not need to know in advance the horizon $T$ of the experiment. A well-known technique for obtaining an anytime algorithm from any non-anytime algorithm is the "Doubling Trick". In the context of adversarial or stochastic multi-armed bandits, the performance of an algorithm is measured by its regret, and we study two families of sequences of growing horizons (geometric and exponential) to generalize previously known results that certain doubling tricks can be used to conserve certain regret bounds. In a broad setting, we prove that a geometric doubling trick can be used to conserve (minimax) bounds in $R_T = O(\sqrt{T})$ but cannot conserve (distribution-dependent) bounds in $R_T = O(\log T)$. We give insights as to why exponential doubling tricks may be better, as they conserve bounds in $R_T = O(\log T)$ and are close to conserving bounds in $R_T = O(\sqrt{T})$.
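
To make the construction concrete, here is a minimal sketch of the geometric doubling trick, assuming a hypothetical `make_algorithm(horizon)` factory and a `pull_arm(arm)` environment callback (neither interface comes from the paper):

```python
def doubling_trick(make_algorithm, pull_arm, total_rounds, T0=2):
    """Make a horizon-dependent bandit algorithm anytime by restarting it
    on a geometric sequence of horizons T_i = T0 * 2**i.

    make_algorithm(horizon): returns an object with select_arm() and
                             update(arm, reward) methods (assumed interface).
    pull_arm(arm):           the environment; returns an observed reward.
    """
    t, i = 0, 0
    while t < total_rounds:
        horizon = T0 * 2 ** i            # geometric schedule of horizons
        algo = make_algorithm(horizon)   # fresh restart, tuned for `horizon`
        for _ in range(min(horizon, total_rounds - t)):
            arm = algo.select_arm()
            reward = pull_arm(arm)
            algo.update(arm, reward)
            t += 1
        i += 1                           # restart with the next, larger horizon
```

Each restart discards what was learned so far, which is why the horizon schedule matters: per the abstract, the geometric schedule above grows fast enough to preserve $O(\sqrt{T})$ minimax bounds, but its repeated exploration cost spoils $O(\log T)$ distribution-dependent bounds, whereas an exponential schedule (horizons growing doubly exponentially in $i$) restarts rarely enough to preserve them.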


Is Blockchain the Answer to Speeding AI Innovation? – Finance Magnates

#artificialintelligence

As one of the latest trends in the tech world, artificial intelligence (AI) is expanding into a variety of industries in new ways all the time. Companies are rushing to implement AI for many uses, including high-frequency trading, social media monitoring, security monitoring, and autonomous vehicles. Private companies aren't alone in their pursuit of AI either; in recent news, the U.S. Pentagon announced its intention to invest $2 billion in AI. But with all this interest in the emerging world of AI, one of the bottlenecks holding back innovation in the industry is the lack of affordable access to computing power. Computing power plays a major role in AI and is crucial to the industry's continued development.


The post-exponential era of AI and Moore's Law – TechCrunch

#artificialintelligence

My MacBook Pro is three years old, and for the first time in my life, a three-year-old primary computer doesn't feel like a crisis that must be resolved immediately. True, this is partly because I'm waiting for Apple to fix their keyboard debacle, and partly because I still cannot stomach the Touch Bar. But it is also because three years of performance growth ain't what it used to be. It is no exaggeration to say that Moore's Law, the mind-bogglingly relentless exponential growth in our world's computing power, has been the most significant force in the world for the last fifty years. So its slow deceleration and/or demise is a big deal, and not just because the repercussions are now making their way into every home and every pocket.