3 + 1 ways of running R on Amazon SageMaker

#artificialintelligence

The R programming language is one of the most widely used languages in the scientific space: it is among the most commonly used languages for machine learning (probably second only to Python) and arguably the most popular language amongst mathematicians and statisticians. It is easy to get started with, free to use, and has support for many scientific and visualisation libraries. While R can help you analyse your data, the more data you have, the more compute power you require, and the more impactful your analysis is, the more repeatability and reproducibility it demands. Analysts and Data Scientists need to find ways to fulfil such requirements. In this post we briefly describe the main ways of running your R workloads on the cloud using Amazon SageMaker, the end-to-end Machine Learning cloud offering of AWS.
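
As a rough illustration of one such pattern (a minimal sketch, not code from the post itself), an R script can be run as a SageMaker Processing job from the Python SDK, assuming you have built and pushed a container image with R installed. The image URI, IAM role, and script name below are placeholders.

```python
# Minimal sketch: run an R script as a SageMaker Processing job.
# Assumes a custom container image with R installed has been pushed to ECR;
# the image URI, IAM role, and script name are hypothetical placeholders.
from sagemaker.processing import ScriptProcessor

r_processor = ScriptProcessor(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/r-processing:latest",  # placeholder
    command=["Rscript"],  # run the submitted script with Rscript
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

# Submits analysis.R to run on the managed instance; inputs and outputs would
# normally be wired to S3 via ProcessingInput/ProcessingOutput objects.
r_processor.run(code="analysis.R")
```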


Amazon SageMaker Now Supports TensorFlow 2.0

#artificialintelligence

You can now use this container on SageMaker and take advantage of many advanced capabilities, such as building models using the SageMaker SDK in managed notebooks, hyperparameter tuning, and distributed training. You can also bring your own container for custom models by building off our base containers. Click here for more information on using TensorFlow with SageMaker, and here to see an example notebook that starts a job with TensorFlow 2.0.
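
For instance, a training job on the TensorFlow 2.0 container can be launched through the SageMaker Python SDK's TensorFlow estimator. This is a minimal sketch using current (v2) SDK parameter names; the entry point script, IAM role, and S3 data path are placeholders.

```python
# Minimal sketch: launch a SageMaker training job on the TensorFlow 2.0 container.
# The entry point script, IAM role, and S3 path are hypothetical placeholders.
from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point="train.py",  # your TensorFlow 2.0 training script (placeholder)
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder
    instance_count=2,        # more than one instance enables distributed training
    instance_type="ml.p3.2xlarge",
    framework_version="2.0.0",  # selects the TensorFlow 2.0 container
    py_version="py3",
)

# Start training against data staged in S3.
estimator.fit({"training": "s3://my-bucket/tf2-training-data"})
```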


Learn how to select ML instances on the fly in Amazon SageMaker Studio | Amazon Web Services

#artificialintelligence

Amazon Web Services (AWS) is happy to announce the general availability of Notebooks within Amazon SageMaker Studio. Amazon SageMaker Studio supports on-the-fly selection of machine learning (ML) instance types, optimized and pre-packaged Amazon SageMaker Images, and sharing of Jupyter notebooks. You can switch a notebook from using a kernel on one instance type to another, for example from ml.t3.medium to ml.p3.2xlarge, without interrupting your work or managing infrastructure. Moving from one instance to another is seamless, and you can continue working while the new instance launches. Your notebooks and data are available instantly on the new instance thanks to the Amazon Elastic File System (Amazon EFS) volume created for your Amazon SageMaker Studio domain.


Deploying your ML models to AWS SageMaker

#artificialintelligence

We faced some difficulties with Streamlit.io; you can see our SageMaker implementation here. The purpose of this article is to provide a tutorial with examples showing how to deploy ML models to AWS SageMaker. This tutorial covers only deploying ML models that are not trained in SageMaker. Deploying ML models that are trained outside of AWS SageMaker is more complicated than training and deploying them end-to-end within SageMaker.
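
As a rough illustration of the externally-trained case (a sketch under stated assumptions, not the article's exact code), a model artifact already packaged as a model.tar.gz in S3 can be wrapped in the SageMaker Python SDK's generic Model class together with a serving container image and deployed to a real-time endpoint. The image URI, role, and S3 path are placeholders.

```python
# Minimal sketch: deploy a model trained outside SageMaker to a real-time endpoint.
# Assumes the trained artifacts are already packaged as model.tar.gz in S3 and that
# the serving container (placeholder image URI) knows how to load and serve them.
import sagemaker
from sagemaker.model import Model

session = sagemaker.Session()

model = Model(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-inference-image:latest",  # placeholder
    model_data="s3://my-bucket/models/model.tar.gz",  # placeholder artifact location
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder
    sagemaker_session=session,
)

# Creates the SageMaker model, endpoint configuration, and HTTPS endpoint.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
)
```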


Container Images

#artificialintelligence

Kubeflow Notebooks natively supports three types of notebooks, namely JupyterLab, RStudio, and Visual Studio Code (code-server), but any web-based IDE should work. Notebook servers run as containers inside a Kubernetes Pod, which means the type of IDE (and which packages are installed) is determined by the Docker image you pick for your server. We provide a number of example container images to get you started. These images provide a common starting point for Kubeflow Notebook containers. See custom images to learn how to extend them with your own packages.