Here's how Google is preparing Android for the AI-laden future

PCWorld

The future of Android will be a lot smarter, thanks to new programming tools that Google unveiled on Wednesday. The company announced TensorFlow Lite, a version of its machine learning framework designed to run on smartphones and other mobile devices, during the keynote address at its Google I/O developer conference. "TensorFlow Lite will leverage a new neural network API to tap into silicon-specific accelerators, and over time we expect to see [digital signal processing chips] specifically designed for neural network inference and training," said Dave Burke, Google's vice president of engineering for Android. "We think these new capabilities will help power a next generation of on-device speech processing, visual search, augmented reality, and more." The Lite framework will be added to the open-source TensorFlow project soon, and the neural network API will arrive in the next major release of Android later this year.
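A TensorFlow Lite model is produced by converting a trained TensorFlow model into a compact flat-buffer file that the on-device interpreter can execute. The sketch below uses the current Python converter API, which postdates this announcement, and a toy Keras model as a placeholder; it illustrates the workflow rather than Google's exact tooling at the time.

    import tensorflow as tf

    # Toy Keras model standing in for a real network (placeholder only).
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Convert the trained model to the TensorFlow Lite flat-buffer format,
    # which the mobile interpreter executes on-device.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()

    # Save the converted model so it can be bundled with an Android app.
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)

On the device, the TensorFlow Lite interpreter loads the saved file and can hand execution to hardware accelerators through the neural network API Burke describes.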


Facebook's Caffe2 AI tools come to iPhone, Android, and Raspberry Pi

PCWorld

Facebook's new open-source Caffe2 deep-learning framework brings new intelligence to mobile devices such as the iPhone and Android phones, as well as low-power computers like the Raspberry Pi. Caffe2 can be used to program artificial intelligence features into smartphones and tablets, allowing them to recognize images, video, text, and speech and be more situationally aware. It's important to note that Caffe2 is not an AI program itself, but a tool for building AI into smartphone apps. It takes just a few lines of code to write learning models, which can then be bundled into apps. The release of Caffe2 is significant because it puts these capabilities directly in the hands of mobile developers.
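As a rough illustration of what "a few lines of code" can look like, here is a minimal sketch using Caffe2's Python API (model_helper, brew, and workspace) to define and run a tiny feed-forward network; the layer sizes and blob names are arbitrary placeholders, not part of Facebook's announcement.

    import numpy as np
    from caffe2.python import workspace, model_helper, brew

    # Feed a batch of random features into the workspace (placeholder data).
    data = np.random.rand(16, 100).astype(np.float32)
    workspace.FeedBlob("data", data)

    # Define a tiny network: fully connected layer -> ReLU -> softmax.
    m = model_helper.ModelHelper(name="toy_net")
    fc = brew.fc(m, "data", "fc1", dim_in=100, dim_out=10)
    relu = m.net.Relu(fc, "relu1")
    m.net.Softmax(relu, "softmax")

    # Initialize parameters, build the net, and run one forward pass.
    workspace.RunNetOnce(m.param_init_net)
    workspace.CreateNet(m.net)
    workspace.RunNet(m.name)
    print(workspace.FetchBlob("softmax").shape)  # (16, 10)

A model defined this way is trained and exported on a workstation; the mobile runtime then loads the exported networks inside the app.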


What being an "AI first" company means for Google

@machinelearnbot

At Google I/O, CEO Sundar Pichai outlined his vision of Google as an "AI first" company, with a new focus on contextual information, machine learning, and using intelligent technology to improve the customer experience. The launch of the Pixel 2 and 2 XL, the latest batch of Google Home products, and the Google Clips camera offer a glimpse into what this long-term strategic shift could mean. We'll get to Google's latest smartphones in a minute, but there's much more to explore about the company's latest strategy.


Google's Edge TPU Machine Learning Chip Debuts in Raspberry Pi-Like Dev Board

#artificialintelligence

Google has officially released its Edge TPU (TPU stands for tensor processing unit) processors in its new Coral development board and USB accelerator. The Edge TPU is Google's inference-focused application-specific integrated circuit (ASIC) that targets low-power "edge" devices and complements the company's "Cloud TPU," which targets data centers. Last July, Google announced that it was working on a low-power version of its Cloud TPU to cater to Internet of Things (IoT) devices. The Edge TPU's main promise is to free IoT devices from cloud dependence when it comes to intelligent analysis of data. For instance, a surveillance camera would no longer need to identify the objects it sees through cloud analysis and could instead do so on its own, locally and in real time, thanks to the Edge TPU.
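In practice, a model compiled for the Edge TPU runs through the TensorFlow Lite runtime with the Edge TPU delegate attached. The sketch below follows Coral's documented delegate-loading pattern; the model file name is a hypothetical placeholder, and the model is assumed to be 8-bit quantized and already compiled for the Edge TPU.

    import numpy as np
    from tflite_runtime.interpreter import Interpreter, load_delegate

    # Load an Edge TPU-compiled model and attach the Edge TPU delegate so
    # inference runs on the local accelerator rather than in the cloud.
    interpreter = Interpreter(
        model_path="model_edgetpu.tflite",  # hypothetical placeholder
        experimental_delegates=[load_delegate("libedgetpu.so.1")])
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Run a single local inference on placeholder image data.
    frame = np.random.randint(0, 256, size=inp["shape"], dtype=np.uint8)
    interpreter.set_tensor(inp["index"], frame)
    interpreter.invoke()
    print(interpreter.get_tensor(out["index"]).shape)

This is the pattern the surveillance-camera example implies: the capture-and-classify loop never has to leave the device.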


AI Benchmark: Running Deep Neural Networks on Android Smartphones

arXiv.org Artificial Intelligence

Over the last few years, the computational power of mobile devices such as smartphones and tablets has grown dramatically, reaching the level of desktop computers available not long ago. While standard smartphone apps are no longer a problem for them, there is still a group of tasks that can easily challenge even high-end devices, namely running artificial intelligence algorithms. In this paper, we present a study of the current state of deep learning in the Android ecosystem and describe the available frameworks, programming models, and the limitations of running AI on smartphones. We give an overview of the hardware acceleration resources available on four main mobile chipset platforms: Qualcomm, HiSilicon, MediaTek, and Samsung. Additionally, we present real-world performance results for different mobile SoCs collected with AI Benchmark, covering all main existing hardware configurations.
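A simple way to reproduce the kind of measurement such a benchmark performs is to time repeated forward passes of a converted model. The sketch below uses the TensorFlow Lite Python interpreter on a hypothetical float model ("model.tflite") and runs on the CPU only; the on-device benchmark additionally exercises the vendor-specific accelerators surveyed in the paper.

    import time
    import numpy as np
    import tensorflow as tf

    # Load a converted model; "model.tflite" is a placeholder path.
    interpreter = tf.lite.Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    dummy = np.random.rand(*inp["shape"]).astype(np.float32)

    # One warm-up run, then time repeated inferences.
    interpreter.set_tensor(inp["index"], dummy)
    interpreter.invoke()

    runs = 50
    start = time.perf_counter()
    for _ in range(runs):
        interpreter.set_tensor(inp["index"], dummy)
        interpreter.invoke()
    avg_ms = (time.perf_counter() - start) / runs * 1000
    print("average latency: %.1f ms" % avg_ms)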