Why Micron is Getting into the AI Accelerator Business

Micron has a habit of building interesting research prototypes with only a vague hope of commercialization, for the sheer purpose of learning how to tune its own memory and storage subsystem approaches to next-generation applications. We saw this a few years ago with the Automata processor, a neuromorphic-inspired piece of hardware focused on large-scale pattern recognition. That project has since been folded internally and handed off to a privately funded startup aiming to make it market ready, which is to say it has all but disappeared from view in the couple of years since.

There is more here for anyone interested in the Automata architecture. But for those curious why Micron wants to get into the accelerator business with one-off silicon projects like that, or its newly announced deep learning accelerator (DLA) for inference, the answer is far less about commercial success than about learning how to tune memory and storage systems for AI on custom accelerators. In fact, the market viability of such a chip would be a delightful bonus; the real value is gaining a firsthand understanding of what deep learning applications need from memory and storage subsystems.
