

AWS's Web-based IDE for ML Development: SageMaker Studio

#artificialintelligence

AWS, Azure, Google Cloud, IBM Cloud, Oracle – they're all vying to become the dominant force of gravity in the public cloud services market, and among the most fiercely fought-over areas of cloud leadership is AI/machine learning enablement. Given that AI's TAM is roughly * and that the FAANGs are out ahead of everyone on AI expertise, it makes sense that they would commercialize the technologies they use and have developed in order to attract enterprise AI customers to their platforms. A centerpiece of AWS's AI market strategy is SageMaker, a managed service that provides developers and data scientists who aren't necessarily ML experts with the tools to build, train, and deploy ML models. Launched two years ago, SageMaker was designed by AWS to lighten the heavy lifting in each step of the machine learning process. Since its inception, the product suite has been expanded into SageMaker Studio, which AWS CEO Andy Jassy, at the annual re:Invent conference in Las Vegas this week, described as an integrated, web-based IDE (integrated development environment) for machine learning that lets developers collect and store code, notebooks, data sets, settings, and project folders in a single place.


#Jupyter on Steroids: Create Packages, Tests, and Rich Documents

#artificialintelligence

"I really do think [nbdev] is a huge step forward for programming environments": Chris Lattner, inventor of Swift, LLVM, and Swift Playgrounds. It is a Python programming environment called nbdev, which allows you to create complete python packages, including tests and a rich documentation system, all in Jupyter Notebooks. We've already written a large programming library (fastai v2) using nbdev, as well as a range of smaller projects. Nbdev is a system for something that we call exploratory programming. Exploratory programming is based on the observation that most of us spend most of our time as coders exploring and experimenting.


Why Traders and Finance Professionals Need to Learn Python

#artificialintelligence

So what is the solution for those traders and financial professionals who find Excel limiting (if not outdated)? Python is reasonably easy to learn and very versatile, hence its increased uptake within the financial community. It is now a prerequisite for many quantitative roles, alongside Excel. It is less elaborate than C (or Java), meaning that: ❶ the learning curve is not as steep, and ❷ the amount of code required to complete a task is often smaller by a factor of 5x or 10x. Python's growing popularity is evident in the vast number of libraries that support pretty much anything you will need as a trader: find a concise summary of libraries here (not affiliated).
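As a taste of how little code a typical Excel-style task needs, here is a minimal, library-free sketch of two everyday trading computations: daily returns and a moving average. In practice one would reach for pandas or NumPy; the function names here are purely illustrative.

```python
def simple_returns(prices):
    """Percentage change between consecutive prices."""
    return [(p1 - p0) / p0 for p0, p1 in zip(prices, prices[1:])]

def moving_average(prices, window):
    """Rolling mean over a fixed-size window."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

prices = [100.0, 102.0, 101.0, 103.0, 105.0]
print(simple_returns(prices))       # one return per consecutive pair
print(moving_average(prices, 3))    # 3-day rolling mean
```

Each computation is a couple of lines, versions well, and can be unit-tested — three things that are awkward to do with spreadsheet formulas.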


rlabbe/Kalman-and-Bayesian-Filters-in-Python

#artificialintelligence

All code is written in Python, and the book itself is written using Jupyter Notebook so that you can run and modify the code in your browser. What better way to learn? "Kalman and Bayesian Filters in Python" looks amazing! We've been using it internally to teach some key state estimation concepts to folks and it's been a huge help. The world is full of data and events that we want to measure and track, but we cannot rely on sensors to give us perfect information. Each time I pass the same point in the road, my GPS reports a slightly different altitude. My kitchen scale gives me different readings if I weigh the same object twice.
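The filtering idea the book teaches can be sketched in a few lines: fuse a stream of noisy readings into one estimate whose uncertainty shrinks as evidence accumulates. Below is a toy one-dimensional Kalman filter for a (nearly) constant quantity such as the altitude example; the variable names and noise values are illustrative, not taken from the book.

```python
def kalman_1d(measurements, meas_var, process_var, x0=0.0, p0=1000.0):
    """Estimate a nearly constant scalar from noisy measurements."""
    x, p = x0, p0                  # state estimate and its variance
    for z in measurements:
        p = p + process_var        # predict: uncertainty grows slightly
        k = p / (p + meas_var)     # Kalman gain: how much to trust z
        x = x + k * (z - x)        # update estimate toward measurement
        p = (1 - k) * p            # update: uncertainty shrinks
    return x, p

# Noisy readings of a true altitude of about 120 m:
readings = [118.9, 121.2, 119.7, 120.4, 120.1, 119.8]
estimate, variance = kalman_1d(readings, meas_var=4.0, process_var=0.01)
print(estimate)  # close to 120, despite no single reading being exact
```

No individual sensor reading is trusted outright; each one only nudges the estimate in proportion to the gain, which is exactly the behavior the GPS and kitchen-scale anecdotes motivate.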


AWS re:Invent 2019 - Predictions And A Wishlist

#artificialintelligence

With less than a week to go, the excitement and anticipation are building for the industry's largest cloud computing conference - AWS re:Invent. As an analyst, I have been attempting to predict the announcements from re:Invent (2018, 2017) with decent accuracy. But with each passing year, it's becoming increasingly tough to predict the year-end news from Vegas. Amazon keeps venturing into areas that analysts, customers, and competitors least expect. AWS Ground Station is an example of how creative the teams at Amazon can get in conceiving new products and services.


How to create a machine learning dataset from scratch?

#artificialintelligence

My grandmother was an outstanding cook. So when I recently came across her old cookbook, I tried to read through some of the recipes, hoping I could recreate some of the dishes I enjoyed as a kid. However, this turned out to be harder than expected, since the book was printed around 1911 in a typeface called Fraktur. For example, the letter "A" looks like a "U" in Fraktur, and every time I see a "Z" in Fraktur I read a "3" (see Figure 2). So the idea emerged to develop a pipeline that creates a live translation of the Fraktur letters into a modern typeface.


ML.NET in Jupyter Notebooks

#artificialintelligence

Alright, enough talking, let's see some code! Install Jupyter: there are many ways to install Jupyter, but the easiest is to download and install Anaconda. Install dotnet try: the C# kernel is based on the dotnet try tool. Install the .NET Jupyter kernel: to connect Jupyter with the dotnet try tool, execute the following command in a command prompt or PowerShell: dotnet try jupyter install. To start Jupyter Notebooks, open Anaconda and click on Jupyter Notebook. It's always easier to learn something new using examples.


Machine Learning From Scratch [Part 1] – Bruno Campos

#artificialintelligence

In this lesson, I'll show you the logic behind each technique, so that you'll be able to apply machine learning in different situations. No more talking, let's get straight to it. Assuming you have Anaconda and Jupyter Notebooks installed, create a new notebook. Let's import the pyplot module from the matplotlib library. Pyplot is useful for generating simple charts from data.
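For readers following along, a minimal pyplot chart of the kind the lesson describes looks like this. The data is an arbitrary example; the `Agg` backend line is only needed when running outside a notebook (in a notebook you would call `plt.show()` instead of `plt.savefig()`).

```python
import matplotlib
matplotlib.use("Agg")               # headless backend; omit in a notebook
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
y = [1, 4, 9, 16, 25]               # example data: y = x**2

plt.plot(x, y, marker="o")          # line chart with point markers
plt.xlabel("x")
plt.ylabel("y")
plt.title("A simple chart with pyplot")
plt.savefig("chart.png")            # in a notebook: plt.show()
```

That is the whole workflow: hand pyplot a pair of lists, label the axes, and render.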


Deployment of Machine Learning Models

#artificialintelligence

Learn how to put your machine learning models into production. Deployment of machine learning models, or simply, putting models into production, means making your models available to your other business systems. By deploying models, other systems can send data to them and get their predictions, which are in turn populated back into the company systems. Through machine learning model deployment, you and your business can begin to take full advantage of the model you built. When we think about data science, we think about how to build machine learning models: which algorithm will be most predictive, how to engineer our features, and which variables will make the models more accurate.
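The "other systems send data, get predictions back" loop can be sketched at its bare minimum: serialize a trained model as an artifact, load it in a separate serving step, and answer prediction requests. All names here are illustrative, and the stand-in model replaces real training; production deployments typically sit behind an HTTP API rather than a local function call.

```python
import pickle

class TinyModel:
    """Stand-in for a trained model: predicts y = 2*x + 1."""
    def predict(self, x):
        return 2 * x + 1

# "Training" side: persist the fitted model as an artifact.
with open("model.pkl", "wb") as f:
    pickle.dump(TinyModel(), f)

# "Serving" side: load the artifact (normally in another process).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

def handle_request(payload):
    """What an endpoint does: data in, prediction out."""
    return {"prediction": model.predict(payload["x"])}

print(handle_request({"x": 3}))  # {'prediction': 7}
```

Everything else in a real deployment — request validation, monitoring, versioning of the artifact — wraps around this same data-in, prediction-out core.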


Create a predictive system for image classification using deep learning as a service

#artificialintelligence

In this pattern, learn how to create and deploy deep learning models by using a Jupyter Notebook in an IBM Watson Studio environment. You also create deep learning experiments with hyperparameter optimization by using the Watson Studio GUI to monitor different runs, then select the best model for deployment. Computer vision is on the rise, and there are scenarios where a machine must assign images to classes to aid in the decision-making process. In this code pattern, we demonstrate multiclass classification (with three classes) by using IBM Watson Studio and IBM Deep Learning as a Service. We use yoga posture data to identify the class of a given image.