
Setting Up a Google Cloud GPU Instance for Free


EDIT: This guide was written for fastai version 1, which at the time of writing (Jan 2018) is in the midst of transitioning to a newer version, dubbed fastai v2. An updated guide will be coming soon. As a deep learning enthusiast in Malaysia, one of the biggest issues I face is securing a cheap GPU option to run my models on. If you're like me and come from a country where paying $80-$100 every month for AWS GPUs is too expensive, I'll show you how I set up my GPU on GCP without incurring any cost at all. All of this can be done for free at the start.

Deep Learning Images For Google Cloud Engine, The Definitive Guide


Google Cloud Platform now provides machine learning images designed for deep learning practitioners. This article will cover the fundamentals of the Google Deep Learning Images, how they benefit the developer, creating a deep learning instance, and common workflows. The Google Deep Learning Images are a set of prepackaged VM images with a deep learning framework ready to run out of the box. Currently, there are images supporting TensorFlow, PyTorch, and generic high-performance computing, with versions for both CPU-only and GPU-enabled workflows. To understand which set of images is right for your use case, consult the graph below.
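As a rough sketch of what creating an instance from one of these images looks like, the command below uses the `gcloud` CLI. The zone, instance name, image family, and accelerator type are illustrative placeholders; check which image families and GPU types are available in your project and region before running anything like this.

```shell
# Sketch only: create a GPU-enabled VM from a Deep Learning image family.
# The zone, machine type, GPU type, and image family below are assumptions;
# substitute values valid for your own project and region.
gcloud compute instances create my-dl-instance \
  --zone=us-west1-b \
  --machine-type=n1-standard-8 \
  --accelerator="type=nvidia-tesla-k80,count=1" \
  --image-family=tf-latest-gpu \
  --image-project=deeplearning-platform-release \
  --maintenance-policy=TERMINATE \
  --metadata="install-nvidia-driver=True"
```

The `--maintenance-policy=TERMINATE` flag is needed because GPU instances cannot be live-migrated during host maintenance, and the `install-nvidia-driver` metadata key asks the image to install the NVIDIA driver on first boot.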

Setting up Tensorflow and GPUs on Google Cloud Platform to run your neural network implementations


After my teammates and I had completed our implementation of CycleGANs for our Computer Vision class project, we needed GPUs to run the python script containing the tensorflow code. Since we had multiple datasets, we could not simply run the code on one dataset at a time within the Blue Waters quota allotted to us and wait for each run to finish. We needed more GPUs! So, while my teammates were busy running it on Blue Waters, I decided to give Google Cloud Platform a try. After going through multiple blogs and tutorials on setting up GPUs and tensorflow on Google Cloud, I realized that none of them gave me all the details in one place, so I was compelled to write this blog to provide a step-by-step procedure for setting up GPUs and tensorflow on Google Cloud Platform from start to finish. So let's get right to it.
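Once an instance with a GPU is up, a quick sanity check like the one sketched below confirms that the driver and tensorflow can actually see the device. This assumes TensorFlow 1.x (where `tf.test.is_gpu_available()` exists) and an NVIDIA driver already installed; adjust for your own setup.

```shell
# Sketch: verify the GPU is visible to the driver and to TensorFlow.
# Assumes an NVIDIA driver is installed and TensorFlow 1.x is in use.

# 1. The driver-level check: lists the GPU, its memory, and utilization.
nvidia-smi

# 2. The framework-level check: prints True if TensorFlow can use the GPU.
python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"
```

If `nvidia-smi` works but the TensorFlow check prints False, the usual culprit is a CPU-only TensorFlow build or a CUDA/cuDNN version mismatch.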

4 Reasons Why You Should Use Google Colab for Your Next Project


Colaboratory, or Colab for short, is a Google Research product that allows developers to write and execute Python code through their browser. Google Colab is an excellent tool for deep learning tasks. It is a hosted Jupyter notebook that requires no setup and has an excellent free version, which gives free access to Google computing resources such as GPUs and TPUs. Since Google Colab is built on top of vanilla Jupyter Notebook, which in turn is built on top of the IPython kernel, let's look at these technologies before diving into why we should and how we can use Google Colab. There are several tools used in Python interactive programming environments.



You can find additional options for docker run here and workspace configuration options in the section below. The workspace provides a variety of configuration options that can be set via environment variables (docker run option: --env). To persist your data, you need to mount a volume into /workspace (docker run option: -v). The default working directory within the container is /workspace, which is also the root directory of the Jupyter instance. The /workspace directory is intended to hold all important work artifacts; data within other directories of the server (e.g., /root) may be lost across container restarts. We strongly recommend enabling authentication via one of the following two options. With either option, the user will be required to authenticate to access any of the pre-installed tools.
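Putting the options above together, a typical invocation might look like the sketch below. The image tag, published port, and the specific authentication environment variable are assumptions and may differ for your workspace image; the point is how `-v` and `--env` combine.

```shell
# Sketch: run the workspace with a persistent volume and authentication.
# The image name, port, and AUTHENTICATE_VIA_JUPYTER variable are
# assumptions; consult your image's documentation for the exact names.
docker run -d \
  --name ml-workspace \
  -p 8080:8080 \
  -v "${PWD}:/workspace" \
  --env AUTHENTICATE_VIA_JUPYTER="my-secret-token" \
  mltooling/ml-workspace:latest
```

Mounting the current directory onto /workspace means notebooks and data written there survive container restarts, while anything written elsewhere in the container (e.g., /root) does not.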