
How to Run Customized Tensorflow Training in the Cloud

#artificialintelligence

You have your TensorFlow code running locally. Now you want to set it up in a production environment for the extra GPU power. There are a couple of alternatives out there: the two most popular managed ML cloud platforms are Google Cloud ML Engine and AWS SageMaker. Both let you quickly deploy your models and train them.
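On the Cloud ML Engine side, a training run boils down to packaging your code and submitting a job with the gcloud CLI. A minimal sketch, assuming a hypothetical staging bucket and a trainer package whose entry point is trainer/task.py (all names here are illustrative placeholders):

```shell
# Submit a custom TensorFlow training job to Google Cloud ML Engine.
# Job name, bucket, and trainer package layout are illustrative placeholders.
JOB_NAME="tf_train_$(date +%Y%m%d_%H%M%S)"
BUCKET="gs://my-ml-bucket"   # hypothetical Cloud Storage staging bucket

# trainer/task.py is assumed to be the entry point of your training package
gcloud ml-engine jobs submit training "$JOB_NAME" \
    --module-name trainer.task \
    --package-path trainer/ \
    --staging-bucket "$BUCKET" \
    --region us-central1 \
    --scale-tier BASIC_GPU
```

The --scale-tier flag is what buys you the GPU: BASIC_GPU requests a single-GPU worker instead of the CPU-only default.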


saiprashanths/dl-docker

#artificialintelligence

Here are Dockerfiles to get you up and running with a fully functional deep learning machine. The image contains all the popular deep learning frameworks with CPU and GPU support (CUDA and cuDNN included). The CPU version should work on Linux, Windows and OS X. If you are not familiar with Docker but would still like an all-in-one solution, start here: What is Docker? GPU version only: install NVIDIA drivers on your machine, either from NVIDIA directly or by following the instructions here. Note that you don't have to install CUDA or cuDNN yourself.
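The basic workflow with such a repository is to build the image from the provided Dockerfile and then start an interactive container from it. A minimal sketch; the image tag here is illustrative, so check the repository's README for the exact names it uses:

```shell
# Build the CPU image from the repository's CPU Dockerfile
# (the tag dl-docker:cpu is an illustrative placeholder).
docker build -t dl-docker:cpu -f Dockerfile.cpu .

# Start an interactive shell in the container, exposing Jupyter's default port
docker run -it -p 8888:8888 dl-docker:cpu bash
```

For the GPU variant you build from the GPU Dockerfile instead and launch the container through NVIDIA's container runtime so it can reach the host's drivers.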


Set up TensorFlow with Docker GPU in Minutes – Sicara Agile Big Data Development

@machinelearnbot

Docker is one of the easiest ways to install TensorFlow with GPU support. This tutorial aims to demonstrate that and to test it on a real-time object recognition application.
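Once the NVIDIA drivers and the NVIDIA container runtime are installed on the host, a quick sanity check is to run TensorFlow's official GPU image and ask it which GPUs it can see. A sketch, assuming Docker 19.03+ (which introduced the --gpus flag; older setups used the nvidia-docker wrapper instead):

```shell
# Run TensorFlow's official GPU image and list the GPUs visible to it.
# Requires host NVIDIA drivers plus the NVIDIA container runtime.
docker run --rm --gpus all tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```

If the setup is working, the printed list contains one PhysicalDevice entry per GPU; an empty list means the container cannot reach the host's GPUs.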


mmrl/dl

#artificialintelligence

This directory contains files to build Docker images: encapsulated computational containers that enhance reproducibility in scientific research. They are similar in design philosophy to the excellent Jupyter Docker Stacks, but with a focus on making it easy to get up and running with GPU-accelerated deep learning. The base image provides a Jupyter Lab (notebook) environment in a Docker container with direct access to the host system's GPU(s). Additionally, there is a custom directory with instructions and examples for building your own image. These are considered stable but may be moved to their own repositories in the future.
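Launching such a GPU-enabled Jupyter Lab container amounts to forwarding the notebook port and passing the host's GPUs through. A minimal sketch; the image name and Jupyter flags are assumptions, so consult the repository's instructions for the exact invocation:

```shell
# Start a Jupyter Lab container with access to the host's GPUs.
# The image name mmrl/dl is assumed here; substitute your own built tag.
docker run --rm --gpus all -p 8888:8888 mmrl/dl \
    jupyter lab --ip=0.0.0.0 --no-browser --allow-root
```

Jupyter then prints a tokenized URL to the container log; opening http://localhost:8888 with that token gives you a notebook session whose kernels see the GPUs directly.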


Docker and Deep Learning, a bad match - somatic blog

#artificialintelligence

If you don't know, Docker has been around for a few years now to help with deploying applications using operating-system-level virtualization on Linux. It has a bunch of great features, but I would say the main selling point is portability: you can run any Docker container on any Docker host and it will just work. "Docker containers run on any computer, on any infrastructure and in any cloud." Unfortunately, that claim breaks down for deep learning applications. For any serious deep learning application you need NVIDIA graphics cards; otherwise training your models could take months.
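The hardware dependency is easy to see in practice: a container only gets GPU access if the host has NVIDIA drivers and a GPU-aware container runtime, which breaks the "runs anywhere" promise. A one-line check, assuming a modern setup with the --gpus flag (the CUDA image tag is illustrative):

```shell
# Verify that a container can actually see the host's NVIDIA GPU.
# Fails on any host without NVIDIA drivers and the NVIDIA container runtime.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

On a GPU host this prints the familiar nvidia-smi table; on any other machine the same command errors out, which is exactly the portability gap the article is describing.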