Review: TensorFlow shines a light on deep learning

#artificialintelligence

Arguably, what gives Google its edge is machine intelligence, along with a vast sea of data to apply it to. While you may never have as much data to process as Google does, you can use the very same machine learning and neural network library as Google. That library, TensorFlow, was developed by the Google Brain team over the past several years and released as open source in November 2015. TensorFlow performs its computations as data flow graphs, in which nodes are operations and edges are the tensors that flow between them. Google uses TensorFlow internally for many of its products, both in its data centers and on mobile devices.
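
For readers new to that model, here is a minimal sketch of what "computation as a data flow graph" looks like in practice, written in the TensorFlow 1.x graph-and-session style contemporary with this review; the node names and values are illustrative only, not taken from the article.

```python
# Minimal sketch of TensorFlow's data-flow-graph model (TF 1.x style).
# Nodes are operations, edges are tensors; nothing runs until a session
# executes the graph.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Build the graph: two placeholder inputs feeding a multiply and an add node.
a = tf.placeholder(tf.float32, name="a")
b = tf.placeholder(tf.float32, name="b")
product = tf.multiply(a, b, name="product")
total = tf.add(product, b, name="total")

# Execute the graph: the session evaluates only the nodes needed for `total`.
with tf.Session() as sess:
    print(sess.run(total, feed_dict={a: 3.0, b: 4.0}))  # 3*4 + 4 = 16.0
```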


Update: Google TensorFlow Deep Learning Is Improving

#artificialintelligence

The recent open-sourcing of Google's TensorFlow was a significant event for machine learning. While the original release was lacking in some ways, development continues and improvements are already being made.


Deep Learning: TensorFlow Programming via XML and PMML

@machinelearnbot

This article demonstrates separating the neural network problem specification from its solution code. In this approach, the problem dataset and its neural network are specified in a PMML-like XML file, which is then used to populate the TensorFlow graph; the graph is run in turn to obtain the results. The Iris dataset serves as the data source. With suitable enhancements, other data sources, other neural network types, and even other libraries could be incorporated as well.
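
As a rough illustration of the idea only (not the article's actual schema or code), the sketch below invents a small PMML-like XML layer description and uses it to populate an equivalent TensorFlow/Keras model; the element names, attributes, and helper function are assumptions made for this example.

```python
# Hedged sketch: a PMML-like XML file declares the network, and Python code
# populates a TensorFlow model from it. The XML schema here is invented for
# illustration and is not the article's format.
import xml.etree.ElementTree as ET
import tensorflow as tf

SPEC = """
<NeuralNetwork inputs="4" outputs="3">
    <Layer units="10" activation="relu"/>
    <Layer units="3" activation="softmax"/>
</NeuralNetwork>
"""

def build_model_from_xml(xml_text):
    """Translate the XML layer descriptions into a Keras Sequential model."""
    root = ET.fromstring(xml_text)
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(int(root.get("inputs")),)))
    for layer in root.findall("Layer"):
        model.add(tf.keras.layers.Dense(int(layer.get("units")),
                                        activation=layer.get("activation")))
    return model

model = build_model_from_xml(SPEC)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(...) would then train on the Iris features/labels (4 inputs, 3 classes).
```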


Introduction to TensorFlow Lite

#artificialintelligence

TensorFlow Lite is TensorFlow's lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports hardware acceleration via the Android Neural Networks API. TensorFlow Lite uses several techniques to achieve low latency, such as kernels optimized for mobile apps, pre-fused activations, and quantized kernels that allow smaller and faster (fixed-point math) models. Most of our TensorFlow Lite documentation is on GitHub for the time being.
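
As a hedged illustration of that workflow using the present-day TensorFlow Python API (the original TensorFlow Lite release shipped its own standalone converter tooling), the sketch below converts a toy Keras model to a .tflite flatbuffer with default quantization and runs it through the Interpreter; the model itself is a placeholder, not one from the documentation.

```python
# Convert a trained model to TensorFlow Lite, then run on-device-style
# inference with the lightweight Interpreter.
import numpy as np
import tensorflow as tf

# A tiny stand-in model; in practice this would be a trained network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Convert to a .tflite flatbuffer. Default optimizations enable weight
# quantization, one of the size/latency techniques mentioned above.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()

# Run inference with the Interpreter, as an app would on-device.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```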


Google Announces TensorFlow Lite: A Neural Network Library for Mobile Phones

#artificialintelligence

Dave Burke, VP of engineering at Google, announced a new version of TensorFlow optimized for mobile phones. This new library, called TensorFlow Lite, would enable developers to run their artificial intelligence applications in real time on users' phones. According to Burke, the library is designed to be fast and small while still enabling state-of-the-art techniques. It will be released later this year as part of the open source TensorFlow project. At the moment, most artificial intelligence processing happens on the servers of software-as-a-service providers.