But there are other tools that also claim to make machine learning easier and to speed model development, and I wondered how they compare. So this week I am taking a look at Amazon SageMaker and how it compares to Studio. What I found is a significantly different approach to model building, even though the vendors of both tools claim to offer a fully managed service that covers the entire machine learning workflow to build, train, and deploy models quickly.
Cloud computing has fueled the rise of machine learning and artificial intelligence. Affordable storage, the availability of GPUs and FPGAs, and advances in deep learning have made machine learning accessible and affordable for businesses. Mainstream cloud providers have shifted their focus from pushing traditional IaaS to selling machine-learning-based PaaS. Cognitive APIs, automated ML, model management, and preconfigured, GPU-backed data science VMs will drive consumption of the public cloud, and AI and ML platforms are becoming key differentiators when choosing a public cloud provider.
Image classification and object detection in images are hot topics these days, thanks to a combination of improvements in algorithms, datasets, frameworks, and hardware. These improvements have democratized the technology and given us the ingredients for building our own image classification solutions. The state of the art in image classification and object detection is based on deep learning (DL). DL is a subfield of machine learning (ML) focused on algorithms for training neural networks (NNs) with many layers, known as deep neural networks. ML, in turn, is a subfield of artificial intelligence (AI), a computer-science discipline.
Introduced at re:Invent 2017, Amazon SageMaker provides a serverless data science environment to build, train, and deploy machine learning models at scale. Customers can also work with the frameworks they find most familiar, such as scikit-learn. In this blog post, we'll accomplish two goals: first, we'll give you a high-level overview of how Amazon SageMaker uses containers for training and hosting models; second, we'll guide you through building a Docker container for training and hosting scikit-learn models in Amazon SageMaker. In the overview, we'll discuss how Amazon SageMaker runs Docker images pulled from Amazon Elastic Container Registry (ECR) for training and hosting models. We will also discuss the anatomy of a SageMaker Docker image, including the training code and inference code. If you are only interested in building, training, and deploying scikit-learn models in Amazon SageMaker, you can skip the overview and go right to the hands-on demonstration of how to containerize scikit-learn models in SageMaker with minimal effort. SageMaker makes extensive use of Docker containers to let users train and deploy algorithms. Containers allow developers and data scientists to package software into standardized units that run consistently on any platform that supports Docker. Containerization bundles code, runtime, system tools, system libraries, and settings in one place, isolating them from their surroundings and ensuring a consistent runtime regardless of where the container runs.
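To make the container anatomy concrete, here is a minimal sketch of the `train` entry point SageMaker invokes inside a training container. The `/opt/ml` paths follow SageMaker's documented container contract (hyperparameters and input channels are mounted under `/opt/ml/input`, and the model artifact must be written to `/opt/ml/model`); the trivial per-column-mean "model" and the `training` channel name are illustrative stand-ins for a real scikit-learn estimator, kept stdlib-only so the sketch stays self-contained.

```python
import csv
import json
import pickle
from pathlib import Path

# SageMaker mounts hyperparameters, input data, and the model output
# directory under /opt/ml inside the training container.
PREFIX = Path("/opt/ml")

def train(prefix: Path = PREFIX):
    """Sketch of a SageMaker `train` entry point (illustrative only).

    Reads hyperparameters and the 'training' input channel, fits a
    trivial per-column-mean "model", and writes the artifact to model/
    so SageMaker can package it as model.tar.gz.
    """
    # Hyperparameters arrive as a JSON object of string values.
    hp_file = prefix / "input" / "config" / "hyperparameters.json"
    hyperparams = json.loads(hp_file.read_text()) if hp_file.exists() else {}

    # Each input channel is a directory of files under input/data/<channel>.
    rows = []
    for csv_path in sorted((prefix / "input" / "data" / "training").glob("*.csv")):
        with csv_path.open() as f:
            rows.extend([float(x) for x in row] for row in csv.reader(f))

    # "Model": the mean of each column (stand-in for a real estimator fit).
    n = len(rows)
    model = [sum(col) / n for col in zip(*rows)]

    # Anything written to model/ is uploaded by SageMaker after training.
    (prefix / "model").mkdir(parents=True, exist_ok=True)
    with (prefix / "model" / "model.pkl").open("wb") as f:
        pickle.dump({"means": model, "hyperparameters": hyperparams}, f)
    return model
```

In a real image, this script would be the executable named `train` on the container's PATH, with a companion `serve` entry point that loads the artifact from `/opt/ml/model` and answers inference requests.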