How to Decide Between Amazon SageMaker and Microsoft Azure Machine Learning Studio

#artificialintelligence

But there are other tools that also claim to make machine learning easier and to speed up model development, and I wondered how they compare. So, this week, I am taking a look at Amazon SageMaker (SageMaker) and how it compares to Studio. What I found when comparing SageMaker to Studio is a significantly different approach to model building. The vendors of each tool claim to offer a fully managed service that covers the entire machine learning workflow to build, train, and deploy machine learning models quickly.


Amazon Open Sources SageMaker Neo To Run Machine Learning Models At The Edge

#artificialintelligence

At re:Invent 2018, AWS added many capabilities to Amazon SageMaker, a machine learning platform as a service. SageMaker Neo was announced as an extension of SageMaker that optimizes fully trained ML models for various deployment targets. The Neo-AI project turns SageMaker Neo into an open source project, making it possible for hardware and software vendors to extend the platform. Machine learning models have two distinct phases – training and inference. Data scientists and developers select the algorithm that is most appropriate for the business problem.
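
As a rough sketch of how a Neo compilation is kicked off, the boto3 SageMaker client exposes a `create_compilation_job` call that takes a trained model artifact and a target device. The bucket paths, role ARN, framework, input shape, and target device below are placeholder assumptions, not values from the article.

```python
import boto3

sm = boto3.client("sagemaker", region_name="us-east-1")

# Sketch of a Neo compilation job: point at a trained model artifact in S3,
# choose a framework and a target device, and Neo writes the optimized model
# back to S3. All names, ARNs, and shapes here are placeholders.
sm.create_compilation_job(
    CompilationJobName="image-classifier-neo-1",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    InputConfig={
        "S3Uri": "s3://my-bucket/models/image-classifier/model.tar.gz",
        "DataInputConfig": '{"data": [1, 3, 224, 224]}',  # input tensor shape
        "Framework": "MXNET",
    },
    OutputConfig={
        "S3OutputLocation": "s3://my-bucket/models/image-classifier/compiled/",
        "TargetDevice": "jetson_tx2",  # an example edge deployment target
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```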


Manage Machine Learning Experiments with Amazon SageMaker Model Tracking Capability - Amazon SageMaker

#artificialintelligence

To organize, find, and evaluate machine learning model experiments, use Amazon SageMaker model tracking capabilities. Developing models typically requires extensive experimenting with different datasets, algorithms, and parameter values. Using the model tracking capability, you can search, filter, and sort through hundreds or even thousands of experiments using model attributes such as parameters, metrics, and tags. This helps you find the best model for your use case quickly. Find, organize, or evaluate training jobs using properties, hyperparameters, performance metrics, or any other metadata.
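
As a rough illustration of that kind of query, the boto3 SageMaker client's `search` API accepts filters over training-job attributes and a sort key. The tag name and the validation-accuracy metric below are illustrative assumptions.

```python
import boto3

sm = boto3.client("sagemaker")

# Find completed training jobs carrying a hypothetical "project" tag and
# rank them by a hypothetical validation accuracy metric, best first.
response = sm.search(
    Resource="TrainingJob",
    SearchExpression={
        "Filters": [
            {"Name": "Tags.project", "Operator": "Equals", "Value": "customer-churn"},
            {"Name": "TrainingJobStatus", "Operator": "Equals", "Value": "Completed"},
        ]
    },
    SortBy="Metrics.validation:accuracy",
    SortOrder="Descending",
    MaxResults=10,
)

for result in response["Results"]:
    job = result["TrainingJob"]
    print(job["TrainingJobName"], job.get("FinalMetricDataList"))
```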


Retraining SageMaker models with Chalice and Serverless

#artificialintelligence

Amazon SageMaker makes it easy to train (and deploy) machine learning models at scale. Thanks to its Python SDK, developers can first experiment with their dataset and model using a notebook instance. Once they're happy with a model, it's quite likely that they will need to train it again and again with new data or new parameters. The SageMaker SDK is great for experimenting, but it's too large to fit in a Lambda package. No worries though: the SageMaker client in boto3 includes a CreateTrainingJob API that will serve our purpose just fine.
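
A minimal sketch of that approach: a Chalice route that calls boto3's `create_training_job` directly, keeping the Lambda package small. The container image, role ARN, bucket paths, instance type, and hyperparameters are placeholder assumptions for illustration.

```python
import time

import boto3
from chalice import Chalice

app = Chalice(app_name="sagemaker-retrain")
sm = boto3.client("sagemaker")


@app.route("/retrain", methods=["POST"])
def retrain():
    # Placeholder image, role, buckets, and hyperparameters for illustration.
    job_name = "xgboost-retrain-{}".format(int(time.time()))
    sm.create_training_job(
        TrainingJobName=job_name,
        AlgorithmSpecification={
            "TrainingImage": "811284229777.dkr.ecr.us-east-1.amazonaws.com/xgboost:latest",
            "TrainingInputMode": "File",
        },
        RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
        InputDataConfig=[
            {
                "ChannelName": "train",
                "DataSource": {
                    "S3DataSource": {
                        "S3DataType": "S3Prefix",
                        "S3Uri": "s3://my-bucket/training-data/",
                        "S3DataDistributionType": "FullyReplicated",
                    }
                },
            }
        ],
        OutputDataConfig={"S3OutputPath": "s3://my-bucket/model-artifacts/"},
        ResourceConfig={
            "InstanceType": "ml.m5.xlarge",
            "InstanceCount": 1,
            "VolumeSizeInGB": 10,
        },
        StoppingCondition={"MaxRuntimeInSeconds": 3600},
        HyperParameters={"max_depth": "5", "num_round": "100"},
    )
    return {"training_job_name": job_name}
```

Because only boto3 and Chalice ship with the function, the deployment package stays far smaller than one that bundles the full SageMaker SDK.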


Securing all Amazon SageMaker API calls with AWS PrivateLink - Amazon Web Services

#artificialintelligence

All Amazon SageMaker API operations are now fully supported via AWS PrivateLink, which increases the security of data shared with cloud-based applications by reducing data exposure to the internet. In this post, I show you how to set up a VPC endpoint to secure your Amazon SageMaker API calls using AWS PrivateLink. AWS PrivateLink traffic doesn't traverse the internet, which reduces exposure to threats such as brute-force and distributed denial-of-service attacks. Because all communication between your application and Amazon SageMaker API operations stays inside your VPC, you don't need an internet gateway, a NAT device, a VPN connection, or AWS Direct Connect to communicate with Amazon SageMaker. Instead, AWS PrivateLink enables you to privately access all Amazon SageMaker API operations from your VPC in a scalable manner by using interface VPC endpoints.
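
For reference, here is a rough sketch of creating such an interface endpoint with boto3's EC2 `create_vpc_endpoint` call. The VPC, subnet, and security group IDs are placeholders, and the service name shown assumes the us-east-1 SageMaker API endpoint.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create an interface VPC endpoint for the SageMaker API so that calls such as
# CreateTrainingJob stay on the AWS network instead of traversing the internet.
# The VPC, subnet, and security group IDs below are placeholders.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.sagemaker.api",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,  # resolve default SageMaker API hostnames to the endpoint
)

print(response["VpcEndpoint"]["VpcEndpointId"])
```

With private DNS enabled, existing SDK and CLI calls to the regional SageMaker API hostname resolve to the endpoint's private IP addresses, so application code does not need to change.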