Google has officially released its Edge TPU (tensor processing unit) processors in its new Coral development board and USB accelerator. The Edge TPU is Google's inference-focused application-specific integrated circuit (ASIC) for low-power "edge" devices, and it complements the company's Cloud TPU, which targets data centers. Last July, Google announced that it was working on a low-power version of its Cloud TPU to cater to Internet of Things (IoT) devices. The Edge TPU's main promise is to free IoT devices from cloud dependence for intelligent analysis of data. A surveillance camera, for instance, would no longer need the cloud to identify the objects it sees in real time; thanks to the Edge TPU, it could do so on its own, locally.
TensorFlow Lite is TensorFlow's lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size, and it also supports hardware acceleration through the Android Neural Networks API. TensorFlow Lite achieves low latency through several techniques, such as kernels optimized for mobile apps, pre-fused activations, and quantized kernels that allow smaller and faster (fixed-point math) models. Most of the TensorFlow Lite documentation is on GitHub for the time being.
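The quantized kernels mentioned above rest on a simple idea: map 32-bit float values onto 8-bit integers with an affine scale and zero point, so the heavy arithmetic can run in fixed-point. A minimal sketch of that mapping in plain Python (illustrative only; this is not TensorFlow Lite's actual kernel code, and the scale/zero-point choices here are assumptions for a [-1.0, 1.0] weight range):

```python
def quantize(values, scale, zero_point):
    """Affine quantization: q = round(v / scale) + zero_point, clamped to uint8."""
    return [max(0, min(255, round(v / scale) + zero_point)) for v in values]

def dequantize(qvalues, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(q - zero_point) * scale for q in qvalues]

# Quantize example weights in [-1.0, 1.0] into uint8.
weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
scale = 2.0 / 255   # range width divided by the number of uint8 steps
zero_point = 128    # uint8 value that represents (approximately) 0.0

q = quantize(weights, scale, zero_point)
approx = dequantize(q, scale, zero_point)
```

The round trip is lossy, but the error stays within one quantization step, which is why quantized models trade a small accuracy drop for a roughly 4x size reduction and faster integer arithmetic.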
Google continues to expand its range of AI products and services with a trio of new hardware devices aimed at the development community, launched under the new Google Coral brand. The lineup includes a $150 development board featuring a removable system-on-module with one of its custom tensor processing unit (TPU) AI chips, a $74.99 USB accelerator dongle designed to speed up machine learning inference on existing Raspberry Pi and Linux systems, and a 5-megapixel camera available for $24.99. The Coral Development Board, which runs a derivative of Linux dubbed Mendel, spins up compiled and quantized TensorFlow Lite models with the aid of a quad-core NXP i.MX 8M system-on-chip paired with integrated GC7000 Lite Graphics, 1GB of LPDDR4 RAM, and 8GB of eMMC storage (expandable via a microSD slot). It boasts a wireless chip that supports Wi-Fi 802.11b/g/n/ac at 2.4/5GHz and Bluetooth 4.1, a 3.5mm audio jack, a full-size HDMI 2.0a port, USB 2.0 and 3.0 ports, a 40-pin GPIO expansion header, and a Gigabit Ethernet port. The Coral USB Accelerator is essentially a plug-in USB 3.0 stick that adds machine learning inference capabilities to existing Linux machines.
The TensorFlow Dev Summit 2019 continued to roll out the goodies, with updates to software alongside the hardware announcements. When it comes to AI and machine learning, Google is no stranger to innovation. Currently in beta, Coral consists of a development board and a USB accelerator stick. The hardware has low power demands suited to embedded applications and can be deployed offline or in areas with limited Internet connectivity, bringing machine learning inference directly to the device.
TensorFlow is the world's most popular open source machine learning library. Since its initial release in 2015, the Google Brain project has been downloaded over 41 million times. At this week's 2019 TensorFlow Dev Summit, Google announced a major upgrade to the framework: the TensorFlow 2.0 Alpha. TensorFlow 2.0 focuses on simplicity and ease of use, with updates like eager execution, intuitive higher-level APIs, and flexible model building on any platform. Last August, Google Brain software engineer Martin Wicke posted in Google Groups that TensorFlow 2.0 would be a major milestone, which set expectations in the machine learning community for these upgrades. According to the official TensorFlow 2.0 guide, Google has delivered on them.
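Eager execution is the headline change: in TensorFlow 1.x, operations were recorded into a graph and only evaluated later inside a session, whereas in 2.0 they execute immediately and return concrete values. A toy sketch of that difference in plain Python (illustrative only; the class and function names are invented for this example and are not TensorFlow APIs):

```python
# TF 1.x style: build a deferred computation, evaluate it later on demand.
class DeferredAdd:
    """Records an addition instead of performing it, like a graph op."""
    def __init__(self, a, b):
        self.a, self.b = a, b

    def run(self):
        # Evaluation happens only when explicitly requested,
        # analogous to running an op inside a tf.Session in TF 1.x.
        return self.a + self.b

# TF 2.0 style: eager execution performs the operation immediately.
def eager_add(a, b):
    return a + b

node = DeferredAdd(2, 3)          # nothing is computed yet
deferred_result = node.run()      # computed only now, on demand
eager_result = eager_add(2, 3)    # computed immediately
```

The eager style is what makes 2.0 code feel like ordinary Python: results are available for inspection and debugging right away, without a separate build-then-run step.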