AI Data Processing: Near-Memory Compute for Energy-Efficient Systems

#artificialintelligence 

Almost universally, today's systems must operate within limited system-level power budgets. For these power-bound systems, energy saved anywhere in the system frees energy for compute and hence raises system performance. A tantalizing opportunity for system-level energy savings is to keep the data's commute between memory and processing as short as possible. Energy savings should be the primary goal, our North Star, for computing near memory. At the recent International Solid-State Circuits Conference (ISSCC), I gave a presentation titled "We have rethought our commute; can we rethink our data's commute?"
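To make the motivation concrete, here is a back-of-envelope sketch of why the data's commute dominates the energy bill. All per-operation energy numbers below are my own illustrative assumptions (roughly the orders of magnitude often quoted for off-chip DRAM versus near-memory accesses), not figures from the talk.

```python
# Back-of-envelope: energy of a data "commute" vs. the compute it feeds.
# ALL numbers are illustrative assumptions, not measurements.

PJ_PER_FP32_ADD  = 1.0     # assumed: ~1 pJ per 32-bit floating-point add
PJ_PER_DRAM_WORD = 2000.0  # assumed: ~2 nJ to fetch a 32-bit word off-chip
PJ_PER_NEAR_WORD = 100.0   # assumed: a near-memory access, ~20x cheaper

def kernel_energy_pj(n_ops: int, n_words: int, pj_per_word: float) -> float:
    """Total energy (pJ) for n_ops adds plus n_words memory accesses."""
    return n_ops * PJ_PER_FP32_ADD + n_words * pj_per_word

# A memory-bound kernel: one add per word fetched (e.g., a streaming sum).
N = 1_000_000
far  = kernel_energy_pj(n_ops=N, n_words=N, pj_per_word=PJ_PER_DRAM_WORD)
near = kernel_energy_pj(n_ops=N, n_words=N, pj_per_word=PJ_PER_NEAR_WORD)

print(f"off-chip commute: {far / 1e6:.0f} uJ")
print(f"near-memory:      {near / 1e6:.0f} uJ")
print(f"energy freed for more compute: {far / near:.1f}x")
```

Under these assumed numbers, nearly all of the off-chip kernel's energy goes to moving data, so shortening the commute returns almost the entire budget to compute; the exact ratio depends entirely on the technology-specific costs plugged in.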