Performance Trends in AI

#artificialintelligence

Edit To Add: It's been brought to my attention that I was wrong to claim that progress in image recognition is "slowing down". As classification accuracy approaches 100%, improvements in raw scores necessarily get smaller, since accuracy can't exceed 100%. If you look at negative log error rates rather than raw accuracy scores, image-recognition performance (as measured on the ImageNet competition) improved roughly linearly over 2010-2016, with a discontinuity in 2012 at the introduction of deep learning algorithms. Deep learning has revolutionized the world of artificial intelligence. But how much does it improve performance?
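To see why negative log error is the more informative scale, here is a minimal Python sketch. The top-5 error rates are approximate published ILSVRC figures; treat them as illustrative, since exact values vary slightly across sources.

```python
import math

# Approximate ImageNet (ILSVRC) top-5 error rates by year; illustrative
# figures only -- exact values differ slightly across published sources.
top5_error = {
    2010: 0.282,  # pre-deep-learning feature engineering
    2011: 0.258,
    2012: 0.164,  # AlexNet: the deep learning discontinuity
    2013: 0.117,
    2014: 0.067,
    2015: 0.036,
    2016: 0.030,
}

for year, err in top5_error.items():
    accuracy = 1.0 - err            # raw accuracy saturates near 1.0
    neg_log_error = -math.log(err)  # grows roughly linearly per year
    print(f"{year}: error={err:.3f}  accuracy={accuracy:.3f}  "
          f"-log(error)={neg_log_error:.2f}")
```

Raw accuracy gains shrink from roughly 2-9 points per year to under 1 point as the ceiling approaches, while -log(error) keeps climbing by a roughly constant amount each year.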


The Different Types Of Hardware AI Accelerators

#artificialintelligence

An AI accelerator is a class of specialised hardware or computer system designed to speed up artificial intelligence applications, particularly artificial neural networks, machine learning, robotics, and other data-intensive or sensor-driven tasks. These accelerators typically have novel designs and focus on low-precision arithmetic, novel dataflow architectures, or in-memory computing. As deep learning and artificial intelligence workloads grew in prominence over the last decade, specialised hardware units were designed, or adapted from existing products, to accelerate these tasks and to provide parallel, high-throughput systems for workstations targeting applications such as neural network simulation. As of 2018, a typical AI integrated circuit chip contains billions of MOSFET transistors. Hardware acceleration has many advantages, the main one being speed: accelerators can greatly reduce the time it takes to train and execute an AI model, and can also run specialised AI tasks that would be impractical on a CPU.
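One feature mentioned above, low-precision arithmetic, is easy to illustrate in software. The NumPy sketch below is a rough illustration, not a model of any particular accelerator: it shows the memory saving, and the small numerical error, that come from moving a matrix multiplication from float32 to float16.

```python
import numpy as np

# Illustrative only: halving precision halves memory traffic, which is one
# reason accelerators favour low-precision arithmetic. Real accelerators
# (TPUs, GPU tensor cores) do this in hardware, often with bfloat16 or int8.
rng = np.random.default_rng(0)
a32 = rng.standard_normal((512, 512), dtype=np.float32)
b32 = rng.standard_normal((512, 512), dtype=np.float32)

a16, b16 = a32.astype(np.float16), b32.astype(np.float16)

full = a32 @ b32
low = (a16 @ b16).astype(np.float32)

print("bytes per matrix:", a32.nbytes, "->", a16.nbytes)   # 2x smaller
print("max relative error:", np.max(np.abs(full - low) / np.abs(full).max()))
```

Neural network training and inference tolerate this kind of error well, which is why trading precision for throughput is such a common accelerator design choice.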


Neural Algorithms and Computing Beyond Moore's Law

Communications of the ACM

The impending demise of Moore's Law has begun to broadly impact the computing research community.[38] Moore's Law has driven the computing industry for many decades, with nearly every aspect of society benefiting from the advance of improved computing processors, sensors, and controllers. Behind these products has been a considerable research industry, with billions of dollars invested in fields ranging from computer science to electrical engineering. Fundamentally, however, the exponential growth in computing described by Moore's Law was driven by advances in materials science.[30,37] From the start, the power of the computer has been limited by the density of transistors. Progressive advances in how to manipulate silicon, through improved lithography methods and new design tools, have kept computing advancing in spite of the perceived limitations of the dominant fabrication processes of the time.[37] There is strong evidence that this time is indeed different, and that Moore's Law is soon to be over for good.[3,38] Already, Dennard scaling, Moore's Law's lesser-known but equally important parallel, appears to have ended.[11] Dennard scaling refers to the property that the reduction of transistor size came with an equivalent reduction in required power.[8]
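Dennard scaling can be stated compactly. The relations below are the textbook form of the argument, sketched here for reference rather than taken from the article itself: shrinking all linear dimensions and the supply voltage by a factor of k leaves power density unchanged.

```latex
% Classic Dennard scaling (sketch), scale factor k > 1:
% feature size L, voltage V, capacitance C, frequency f, per-transistor power P
\begin{aligned}
L \to L/k, \qquad V \to V/k, \qquad C \to C/k, \qquad f &\to k f \\
P = C V^2 f \;\to\; \frac{C}{k}\cdot\frac{V^2}{k^2}\cdot k f &= \frac{P}{k^2} \\
\text{transistor density} \to k^2 \times \text{density}
\;\Rightarrow\; \text{power density} = P \times \text{density} &\;\text{is constant.}
\end{aligned}
```

Once voltages could no longer be lowered with feature size (because of leakage and threshold limits), per-transistor power stopped falling as fast as density rose, and power density, not transistor count, became the binding constraint.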


The Singularity May Never Be Near

AI Magazine

There is much optimism, and much pessimism, around artificial intelligence (AI) today. The optimists are investing millions, and in some cases billions, of dollars in AI. The pessimists, on the other hand, predict that AI will end many things: jobs, warfare, and even the human race. Both the optimists and the pessimists often appeal to the idea of a technological singularity, a point in time where machine intelligence starts to run away and a new, more intelligent "species" starts to inhabit the earth. If the optimists are right, this will be a moment that fundamentally changes our economy and our society. If the pessimists are right, this will also be a moment that fundamentally changes our economy and our society. It is therefore well worth spending some time deciding whether either of them might be right.


Infusing Machines with Intelligence - Part 1

#artificialintelligence

"Learning", "thinking", "intelligence", even "cognition"… Such words were once reserved for humans (and to a lesser extent, other highly complex animals), but have now seemingly been extended to a "species" of machines, machines infused with artificial intelligence or "AI". In October 2015, a computer program developed by Google DeepMind, named AlphaGo, defeated the incumbent European champion at the complex ancient Chinese board game of Go. In March 2016, AlphaGo went on to defeat the world champion, Lee Sedol. This seminal moment caught the world's attention, the media have since been incessantly covering every AI-related story, and companies from all walks of life have since been on a mission to add "artificial intelligence" to their business description. At Platinum we have been closely following the major technological trends for many years.