Introduction to TensorFlow Lite

#artificialintelligence

TensorFlow Lite is TensorFlow's lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports hardware acceleration with the Android Neural Networks API. TensorFlow Lite uses many techniques to achieve low latency, such as optimizing the kernels for mobile apps, pre-fusing activations, and using quantized kernels that allow smaller and faster (fixed-point math) models. Most of our TensorFlow Lite documentation is on GitHub for the time being.
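The quantized-kernel idea mentioned above, replacing 32-bit floats with small fixed-point integers, can be sketched in a few lines. This is a toy illustration of the common affine (scale and zero-point) quantization scheme, not TensorFlow Lite's actual kernels; the `quantize` and `dequantize` helpers are hypothetical names for this sketch:

```python
# Toy affine quantization: map float32 values to 8-bit integers and back.
# Illustrates the fixed-point idea behind quantized kernels; TensorFlow
# Lite's real kernels are considerably more sophisticated.

def quantize(values, num_bits=8):
    """Map floats onto integers in [0, 2**num_bits - 1] via scale/zero-point."""
    lo, hi = min(values), max(values)
    levels = 2 ** num_bits - 1
    scale = (hi - lo) / levels or 1.0  # guard against constant inputs
    zero_point = round(-lo / scale)    # integer that represents 0.0
    q = [max(0, min(levels, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
q, scale, zp = quantize(weights)
approx = dequantize(q, scale, zp)  # close to the original weights
```

Storing `q` takes a quarter of the memory of 32-bit floats, and integer arithmetic on it is typically faster on mobile CPUs, which is the trade-off the blurb alludes to: a small accuracy loss for a smaller, faster model.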


Google launches TensorFlow Lite for machine learning on mobile devices

@machinelearnbot

TensorFlow Lite for machine learning on mobile devices was first announced by Dave Burke, VP of engineering for Android, at Google I/O 2017. TensorFlow Lite is a lightweight version of Google's TensorFlow open source library, which is mainly used for machine learning applications by researchers and developers.


What is the TensorFlow machine intelligence platform?

@machinelearnbot

TensorFlow is an open source software library for numerical computation using data-flow graphs. It was originally developed by the Google Brain Team within Google's Machine Intelligence research organization for machine learning and deep neural networks research, but the system is general enough to be applicable in a wide variety of other domains as well. It reached version 1.0 in February 2017 and has continued rapid development, with 21,000 commits thus far, many from outside contributors. This article introduces TensorFlow, its open source community and ecosystem, and highlights some interesting open sourced TensorFlow models. It runs on nearly everything: GPUs and CPUs (including mobile and embedded platforms) and even tensor processing units (TPUs), which are specialized hardware for tensor math.
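The data-flow-graph model described above can be sketched with a tiny evaluator. This is a toy illustration of the idea only, not TensorFlow's API; the `Node` class and `run` function are hypothetical names for this sketch. Each node holds an operation and edges to the nodes that feed it, and evaluation walks the graph:

```python
# Minimal data-flow graph: nodes are operations, edges carry values.
# A toy sketch of the computation model behind TensorFlow; the names
# here (Node, constant, add, mul, run) are illustrative only.

class Node:
    def __init__(self, op, *inputs):
        self.op = op          # callable that computes this node's value
        self.inputs = inputs  # upstream nodes feeding this one

def constant(value):
    return Node(lambda: value)

def add(a, b):
    return Node(lambda x, y: x + y, a, b)

def mul(a, b):
    return Node(lambda x, y: x * y, a, b)

def run(node, cache=None):
    """Evaluate a node by recursively evaluating its input nodes,
    caching results so shared subgraphs are computed only once."""
    cache = {} if cache is None else cache
    if node not in cache:
        args = [run(n, cache) for n in node.inputs]
        cache[node] = node.op(*args)
    return cache[node]

# (2 + 3) * 4 expressed as a graph, then evaluated.
graph = mul(add(constant(2.0), constant(3.0)), constant(4.0))
result = run(graph)  # 20.0
```

Separating graph construction from evaluation is what lets a system like TensorFlow optimize the graph and dispatch the same computation to CPUs, GPUs, or TPUs.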


Build AI that works offline with Coral Dev Board, Edge TPU, and TensorFlow Lite

#artificialintelligence

These new devices are made by Coral, Google's new platform for enabling embedded developers to build amazing experiences with local AI. Coral's first products are powered by Google's Edge TPU chip, and are purpose-built to run TensorFlow Lite, TensorFlow's lightweight solution for mobile and embedded devices. As a developer, you can use Coral devices to explore and prototype new applications for on-device machine learning inference. Coral's Dev Board is a single-board Linux computer with a removable System-On-Module (SOM) hosting the Edge TPU. It allows you to prototype applications and then scale to production by including the SOM in your own devices.


Jeff Dean on machine learning, part 2: TensorFlow (Google Cloud Big Data and Machine Learning Blog)

#artificialintelligence

TensorFlow is the machine-learning library open sourced by Google in November 2015. It gained over 11,000 stars on GitHub in its first week after launch, and has built up quite a community since then: at the time of this writing, TensorFlow has over 45,000 stars, 13,000 commits and 21,000 forks. This is the second installment in our interview series with Jeff Dean, Google Senior Fellow and lead of the Google Brain research team. In our first installment, we talked about the landscape of machine learning: its past, present and future. In this installment, we'll cover TensorFlow: why we built it originally, how to use it, and what its future may hold.