Analogue computers could train AI 1000 times faster and cut energy use
Computers built with analogue circuits promise huge speed and efficiency gains over ordinary computers, but normally at the cost of accuracy.

Analogue computers that rapidly solve a key type of equation used in training artificial intelligence models could offer a solution to the growing energy consumption in data centres caused by the AI boom.

Laptops, smartphones and other familiar devices are known as digital computers because they store and process data as a series of binary digits, either 0 or 1, and can be programmed to solve a range of problems. In contrast, analogue computers are normally designed to solve just one specific problem. They store and process data using quantities that can vary continuously, such as electrical resistance, rather than discrete 0s and 1s.
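The trade-off described above, a whole computation in one physical step but with some loss of accuracy, can be sketched in a few lines. This is an illustrative simulation only, not the design of any actual chip: it assumes an analogue crossbar that encodes a weight matrix as conductances, so that applying input voltages produces output currents equal to the matrix-vector product (by Ohm's and Kirchhoff's laws), plus analogue noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def analogue_matvec(weights, inputs, noise_std=0.01):
    """Simulate a noisy analogue matrix-vector multiply.

    A crossbar of conductances computes weights @ inputs in one
    physical step; Gaussian noise stands in for device imperfections.
    """
    exact = weights @ inputs                        # what a digital chip computes
    noise = rng.normal(0.0, noise_std, exact.shape) # analogue imprecision
    return exact + noise

# Compare the exact digital result with the noisy analogue one.
W = rng.normal(size=(4, 3))   # hypothetical weight matrix
x = rng.normal(size=3)        # hypothetical input vector

digital = W @ x
analogue = analogue_matvec(W, x)
print(np.max(np.abs(digital - analogue)))  # small, noise-limited error
```

The point of the sketch is that the analogue result is close to, but never exactly equal to, the digital one; the engineering question is whether AI training can tolerate that error in exchange for speed and energy savings.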
Analogue chips can slash the energy used to run AI models
An analogue computer chip can run an artificial intelligence (AI) speech recognition model 14 times more efficiently than traditional chips, potentially offering a solution both to the vast and growing energy use of AI research and to the worldwide shortage of the digital chips usually used.

The device was developed by IBM Research, which declined New Scientist's request for an interview and didn't provide any comment. But in a paper outlining the work, researchers claim that the analogue chip can reduce bottlenecks in AI development.

There is a global rush for GPUs, the graphics processors that were originally designed to run video games but have traditionally been used to train and run AI models, with demand outstripping supply. Studies have also shown that the energy use of AI is growing rapidly, rising 100-fold from 2012 to 2021, with most of that energy derived from fossil fuels. These issues have led to suggestions that the constantly increasing scale of AI models will soon hit an impasse.