Applying Machine Learning on Mobile Devices

#artificialintelligence

In the modern world, machine learning is used in many fields: image classification, consumer demand forecasting, film and music recommendations for particular users, and clustering. At the same time, for fairly large models, computing a result (and, to a much greater degree, training the model) can be a resource-intensive operation. To make trained models usable on devices other than the most powerful ones, Google introduced its TensorFlow Lite framework. To work with it, you train a model built with the full TensorFlow framework (not Lite!) and then convert it to the TensorFlow Lite format. After that, the model can easily be used on embedded or mobile devices.
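The train-then-convert workflow described above can be sketched in a few lines; this is a minimal illustration assuming TensorFlow 2.x, with a tiny Keras model standing in for whatever model you have actually trained:

```python
# Minimal sketch of the TensorFlow -> TensorFlow Lite conversion step.
# The tiny Keras model below is only a stand-in for a real trained model.
import tensorflow as tf

# 1. Build/train a model with the full TensorFlow framework (not Lite!).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# 2. Convert it to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# 3. Write the .tflite file that ships with the mobile/embedded app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is a single flatbuffer that the on-device interpreter loads directly.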


EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi

#artificialintelligence

A guide showing how to train TensorFlow Lite object detection models and run them on Android, the Raspberry Pi, and more! TensorFlow Lite is an optimized framework for deploying lightweight deep learning models on resource-constrained edge devices. TensorFlow Lite models have faster inference times and require less processing power, so they can deliver faster performance in real-time applications. This guide provides step-by-step instructions for how to train a custom TensorFlow object detection model, convert it into an optimized format that can be used by TensorFlow Lite, and run it on Android phones or the Raspberry Pi. The guide is broken into three major portions, each with its own dedicated README file in this repository. The repository also contains Python code for running the newly converted TensorFlow Lite model to perform detection on images, videos, or webcam feeds.
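Running a converted model from Python follows the same pattern regardless of the task; a minimal sketch using `tf.lite.Interpreter` (the tiny model and random input here are placeholders for the repository's detection model and a real camera frame):

```python
# Sketch of running a converted .tflite model from Python.
# A tiny converted model stands in for the custom detection model.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the flatbuffer into the TFLite interpreter and run one inference.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed one frame shaped like the model's input tensor.
frame = np.random.rand(*inp["shape"]).astype(np.float32)
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])
```

For an on-disk model, `tf.lite.Interpreter(model_path="model.tflite")` works the same way; for webcam feeds, the loop simply repeats `set_tensor`/`invoke`/`get_tensor` per frame.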


Introduction to TensorFlow Lite

#artificialintelligence

TensorFlow Lite is TensorFlow's lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports hardware acceleration with the Android Neural Networks API. TensorFlow Lite uses many techniques to achieve low latency, such as optimizing the kernels for mobile apps, pre-fused activations, and quantized kernels that allow smaller and faster (fixed-point math) models. Most of our TensorFlow Lite documentation is on GitHub for the time being.
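The fixed-point math behind those quantized kernels can be illustrated in plain Python: each float value is mapped to a small integer via a scale and a zero point (the scale and zero point below are made-up illustrative values, not taken from any particular model):

```python
# Illustrative sketch of affine quantization, the fixed-point scheme
# used by quantized kernels: x is stored as an int8 value q such that
# x ~= (q - zero_point) * scale.

def quantize(x, scale, zero_point):
    """Map a float to an int8 value: q = round(x / scale) + zero_point."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))  # clamp to the int8 range

def dequantize(q, scale, zero_point):
    """Recover an approximate float from the stored integer."""
    return (q - zero_point) * scale

scale, zero_point = 0.05, 3          # assumed example parameters
x = 1.27
q = quantize(x, scale, zero_point)   # a small integer instead of a float
x_approx = dequantize(q, scale, zero_point)
```

Storing int8 values instead of float32 makes the model roughly 4x smaller, and the arithmetic runs on fast integer units, which is where the latency win comes from.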


Machine Learning -- Building an AI app -- the Easy Way (using Tensorflow and Android)

#artificialintelligence

This article describes a case study on building a mobile app that recognizes objects using machine learning. We have used TensorFlow Lite, an open-source machine learning (ML) library provided by Google. The article gives a brief overview of TensorFlow Lite. ML adds great power to our mobile application.


TensorFlow Mobile: Training and Deploying a Neural Network - inovex-Blog

#artificialintelligence

Smart assistants, fancy image filters in Snapchat, and apps like Prisma all have one thing in common: they are powered by Machine Learning. The use of Machine Learning in mobile apps is growing, and new mobile apps are being built with Machine Learning based services as their business models. In this blog series we want to give you hands-on advice on how to train a convolutional neural network for image classification and deploy it to a mobile app using the popular machine learning framework TensorFlow Mobile. Our task will be to classify images of houseplants which we have collected ourselves. You don't have to go and snap pictures of plants, however, because our approach is generic and can be used for training and deploying a convolutional neural network for image classification, independent of the subject.
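The kind of convolutional network such a series trains can be sketched as follows; the layer sizes, input resolution, and number of plant classes here are illustrative assumptions, not the blog's exact architecture:

```python
# Sketch of a small convolutional network for image classification
# (shapes and class count are assumed for illustration).
import tensorflow as tf

num_classes = 5  # e.g. five houseplant species

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),         # RGB input images
    tf.keras.layers.Conv2D(16, 3, activation="relu"), # learn local features
    tf.keras.layers.MaxPooling2D(),                   # downsample
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

After training on the collected images with `model.fit`, the graph is frozen/exported and bundled into the mobile app for on-device inference.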