Train and host Scikit-Learn models in Amazon SageMaker by building a Scikit Docker container (Amazon Web Services)

#artificialintelligence

Introduced at re:Invent 2017, Amazon SageMaker provides a serverless data science environment to build, train, and deploy machine learning models at scale. Customers can also work with the frameworks they find most familiar, such as scikit-learn. In this blog post, we'll accomplish two goals: First, we'll give you a high-level overview of how Amazon SageMaker uses containers for training and hosting models. Second, we'll guide you through building a Docker container for training and hosting scikit-learn models in Amazon SageMaker. In the overview, we'll discuss how Amazon SageMaker runs Docker images that have been loaded from Amazon Elastic Container Registry (ECR) for training and hosting models. We will also discuss the anatomy of a SageMaker Docker image, including the training code and inference code. If you are only interested in building, training, and deploying scikit-learn models in Amazon SageMaker, you can skip the overview and go right to the hands-on demonstration of how to containerize scikit-learn models in SageMaker with minimal effort. SageMaker makes extensive use of Docker containers to allow users to train and deploy algorithms. Containers allow developers and data scientists to package software into standardized units that run consistently on any platform that supports Docker. Containerization packages code, runtime, system tools, system libraries, and settings in the same place, isolating the software from its surroundings and ensuring a consistent runtime regardless of where it is run.
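To make the container anatomy concrete, here is a minimal sketch (not the post's actual code) of a training entrypoint following SageMaker's container conventions: hyperparameters arrive as strings in /opt/ml/input/config/hyperparameters.json, each input channel is mounted under /opt/ml/input/data/&lt;channel&gt;, and anything written to /opt/ml/model is packaged and uploaded to S3 when the job finishes. The channel name "training", the CSV layout, and the choice of model are assumptions for illustration:

```python
import json
import os
import pickle

from sklearn.tree import DecisionTreeClassifier


def train(base="/opt/ml"):
    # SageMaker mounts data and collects the model under /opt/ml inside
    # the container; the base path is parameterized here so the sketch
    # can also run locally.
    with open(os.path.join(base, "input/config/hyperparameters.json")) as f:
        hps = json.load(f)  # SageMaker passes hyperparameter values as strings
    max_depth = int(hps.get("max_depth", "5"))

    # Read CSVs from the "training" channel (assumed layout: label in
    # the first column, features in the remaining columns).
    data_dir = os.path.join(base, "input/data/training")
    X, y = [], []
    for name in os.listdir(data_dir):
        with open(os.path.join(data_dir, name)) as f:
            for line in f:
                row = [float(v) for v in line.strip().split(",")]
                y.append(row[0])
                X.append(row[1:])

    model = DecisionTreeClassifier(max_depth=max_depth).fit(X, y)

    # Anything written to /opt/ml/model is tarred and saved to S3.
    with open(os.path.join(base, "model/model.pkl"), "wb") as f:
        pickle.dump(model, f)
    return model
```

An `ENTRYPOINT` in the Dockerfile would point at a script calling this function; the inference side of the container follows analogous conventions for loading the pickled model.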


Train and deploy deep learning models using JAX with Amazon SageMaker

#artificialintelligence

Amazon SageMaker is a fully managed service that enables developers and data scientists to quickly and easily build, train, and deploy machine learning (ML) models at any scale. Typically, you can use the pre-built training and inference containers, which have been optimized for AWS hardware. Although those containers cover many deep learning workloads, you may have use cases where you want to use a different framework or otherwise customize the contents of the OS libraries within the container. To accommodate this, SageMaker provides the flexibility to train models using any framework that can run in a Docker container. This flexibility enables you to use existing SageMaker training capabilities such as training jobs, hyperparameter tuning, and Managed Spot Training.
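As a sketch of what a custom-framework entrypoint (for JAX or anything else) receives, the snippet below reads job settings the way containers built with the sagemaker-training toolkit get them, as `SM_*` environment variables, with local defaults so it also runs outside SageMaker. The hyperparameter name `epochs` and the default values are assumptions:

```python
import argparse
import os


def parse_job_config(argv=None):
    # Containers using the sagemaker-training toolkit receive job
    # settings as SM_* environment variables (SM_MODEL_DIR,
    # SM_CHANNEL_<NAME>, SM_HP_<NAME>); a plain custom container reads
    # the same values from files under /opt/ml instead.
    parser = argparse.ArgumentParser()
    parser.add_argument("--epochs", type=int,
                        default=int(os.environ.get("SM_HP_EPOCHS", "1")))
    parser.add_argument("--model-dir", type=str,
                        default=os.environ.get("SM_MODEL_DIR", "/opt/ml/model"))
    parser.add_argument("--train", type=str,
                        default=os.environ.get("SM_CHANNEL_TRAINING",
                                               "/opt/ml/input/data/training"))
    return parser.parse_args(argv)
```

Because the defaults are resolved at call time, the same entrypoint works unchanged inside a SageMaker training job and on a laptop.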


Building fully custom machine learning models on AWS SageMaker: a practical guide

#artificialintelligence

AWS SageMaker is a cloud machine learning SDK designed for speed of iteration, and it's one of the fastest-growing toys in the Amazon AWS ecosystem. Since launching in late 2017, SageMaker's growth has been remarkable -- at last year's AWS re:Invent, Amazon stated that over 10,000 companies now use SageMaker to standardize their machine learning processes. SageMaker lets you use a Jupyter notebook interface to launch and tear down machine learning processes in a handful of lines of Python code, something that makes data scientists happy because it abstracts away many of the messy infrastructural details of training. The thesis: standing up your own machine learning algorithm should always be this easy! SageMaker has two APIs: a high-level API for working with a variety of pre-optimized machine learning libraries (like MXNet, TensorFlow, and scikit-learn), and a low-level API that allows running completely custom jobs where anything goes.
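To give a feel for the low-level API, the sketch below assembles the request dictionary that boto3's `sagemaker` client accepts via `client.create_training_job(**request)`. All names, ARNs, image URIs, and S3 paths are placeholders, and the instance and runtime settings are illustrative defaults, not recommendations:

```python
def build_training_job_request(job_name, image_uri, role_arn,
                               train_s3_uri, output_s3_uri,
                               instance_type="ml.m5.xlarge"):
    # Shape of the request for boto3's SageMaker client:
    #   boto3.client("sagemaker").create_training_job(**request)
    return {
        "TrainingJobName": job_name,
        "AlgorithmSpecification": {
            "TrainingImage": image_uri,   # your custom ECR image
            "TrainingInputMode": "File",
        },
        "RoleArn": role_arn,
        "InputDataConfig": [{
            "ChannelName": "training",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": train_s3_uri,
                "S3DataDistributionType": "FullyReplicated",
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3_uri},
        "ResourceConfig": {
            "InstanceType": instance_type,
            "InstanceCount": 1,
            "VolumeSizeInGB": 30,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }
```

The high-level API builds essentially this same request for you behind an `Estimator.fit()` call; the low-level route simply exposes every field.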


Host multiple TensorFlow computer vision models using Amazon SageMaker multi-model endpoints

#artificialintelligence

Amazon SageMaker helps data scientists and developers prepare, build, train, and deploy high-quality machine learning (ML) models quickly by bringing together a broad set of capabilities purpose-built for ML. SageMaker accelerates innovation within your organization by providing purpose-built tools for every step of ML development, including labeling, data preparation, feature engineering, statistical bias detection, AutoML, training, tuning, hosting, explainability, monitoring, and workflow automation. Companies are increasingly training ML models on individual user data. For example, an image sharing service designed to help users discover information on the internet might train custom models on each user's uploaded images and browsing history to personalize recommendations for that user. The company can also train custom models on search topics to recommend images per topic.
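To illustrate how such per-user models are served from one multi-model endpoint, the sketch below builds the keyword arguments for boto3's `sagemaker-runtime` client call `client.invoke_endpoint(**request)`, where `TargetModel` names which artifact under the endpoint's S3 prefix to load and invoke. The endpoint name, model path, and JSON payload format are assumptions:

```python
import json


def build_mme_invoke_request(endpoint_name, target_model, instances):
    # Keyword arguments for:
    #   boto3.client("sagemaker-runtime").invoke_endpoint(**request)
    # TargetModel is the model archive's path relative to the
    # multi-model endpoint's S3 model prefix; SageMaker loads it on
    # first use and caches it on the instance.
    return {
        "EndpointName": endpoint_name,
        "ContentType": "application/json",
        "TargetModel": target_model,  # e.g. one archive per user
        "Body": json.dumps({"instances": instances}),
    }
```

Switching users is then just a matter of passing a different `TargetModel`, while the endpoint, instances, and container stay the same.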