New AI systems on a chip will spark an explosion of even smarter devices - SiliconANGLE
Artificial intelligence is permeating everyone's lives through the face recognition, voice recognition, image analysis and natural language processing capabilities built into smartphones and consumer appliances. Over the next several years, most new consumer devices will run AI natively, locally and, to an increasing extent, autonomously. But there's a problem: The traditional processors in most mobile devices aren't optimized for AI, which tends to consume a lot of processing, memory, data and battery power on these resource-constrained devices. As a result, AI has tended to execute slowly on mobile and "internet of things" endpoints while draining their batteries rapidly, consuming inordinate wireless bandwidth and exposing sensitive local information as data makes round trips to the cloud. That's why mass-market mobile and IoT edge devices are increasingly coming equipped with systems-on-a-chip optimized for local AI processing.
Apr-15-2018, 07:16:25 GMT