Collaborating Authors: Expedera


AI Power Consumption Exploding

#artificialintelligence

Machine learning is on track to consume all the energy being supplied, a model that is costly, inefficient, and unsustainable. To a large extent, this is because the field is new, exciting, and rapidly growing. Models are being designed to break new ground in terms of accuracy or capability. Today, that means bigger models and larger training sets, which require exponential increases in processing capability and the consumption of vast amounts of power in data centers for both training and inference. In addition, smart devices are beginning to show up everywhere, and the collective power numbers are beginning to scare people.


Expedera raises funds to advance deep learning accelerator IP

#artificialintelligence

Expedera has completed an $18m Series A funding round led by Dr. Sehat Sutardja and Weili Dai (founders of Marvell Technology Group) and other leading semiconductor industry investors. So far, the company has raised $27m, and it will use this new funding to speed product development and expand sales and marketing to meet demand for its high-performance, energy-efficient deep learning accelerator (DLA) IP. A growing number of semiconductor chip makers are adding AI (Artificial Intelligence) inference capabilities to applications, including smartphones, smart speakers, security cameras, PCs/tablets, wearables, automotive, and edge servers. "We expect shipments of AI-enabled edge devices to grow from about 600m units in 2020 to 2bn units in 2025, representing 26% annual growth," said Linley Gwennap, Principal Analyst at The Linley Group. "Smartphones, a market where Expedera already has traction, represent about half of these units."