Notebook meta-analysis: Jupyter as a zero-infrastructure alternative to experiment trackers

#artificialintelligence

Existing experiment trackers come with a high setup cost: to get one working, you usually have to spin up a database and run a web application. After trying multiple options, I concluded that Jupyter notebooks are an excellent place to store experiment results and retrieve them for comparison. This post explains how I use .ipynb files for that purpose. Machine Learning is a highly iterative process: you don't know in advance which combination of model, features, and hyperparameters will work best, so you need to make slight tweaks and evaluate performance.
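A minimal sketch of the retrieval side: .ipynb files are plain JSON, so results stored in a cell's outputs can be read back with the standard library alone. The file name and the stored metric below are hypothetical examples, not the author's actual notebooks:

```python
import json

# A minimal .ipynb document. Notebooks are plain JSON, so experiment
# results captured in a cell's outputs need no extra tracking service.
notebook = {
    "cells": [
        {
            "cell_type": "code",
            "metadata": {},
            "source": ["evaluate(model)"],
            "outputs": [
                {
                    "output_type": "execute_result",
                    "data": {"text/plain": ["{'accuracy': 0.93}"]},
                }
            ],
        }
    ],
    "metadata": {},
    "nbformat": 4,
    "nbformat_minor": 5,
}

# One notebook per experiment run.
with open("experiment_01.ipynb", "w") as f:
    json.dump(notebook, f)

# Later: load the notebook and collect stored results for comparison.
with open("experiment_01.ipynb") as f:
    nb = json.load(f)

results = [
    "".join(out["data"]["text/plain"])
    for cell in nb["cells"]
    if cell["cell_type"] == "code"
    for out in cell.get("outputs", [])
    if out.get("output_type") == "execute_result"
]
print(results)
```

Comparing runs then reduces to looping over a directory of .ipynb files and extracting each one's stored outputs.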


Jupyter Notebook as a Function -- Create Reusable Notebooks with Papermill

#artificialintelligence

In programming, functions are a way to modularize code into self-contained, organized, and reusable blocks that perform a specific task. Functions usually accept input data, process it, and output a result. In this article, we will examine how to parameterize a Jupyter Notebook so that it works like a function. Let's imagine we are in the real estate business and our users have questions about real estate sales over the past few years. We will be using the Singapore Housing Resale Price dataset[1], which contains transactional records of homes resold through the Housing Development Board (HDB).
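The papermill pattern can be sketched as follows: papermill looks for the cell tagged `parameters` and injects caller-supplied values after it at execution time, so the notebook behaves like a function call. The notebook file name and parameter names (`town`, `flat_type`) below are hypothetical illustrations, not taken from the article:

```python
import json

# Sketch of a parameterized notebook. Papermill identifies the cell
# carrying the "parameters" tag and injects new values below it.
nb = {
    "cells": [
        {
            "cell_type": "code",
            "metadata": {"tags": ["parameters"]},  # papermill's marker tag
            "source": ["town = 'ANG MO KIO'\n", "flat_type = '4 ROOM'"],
            "outputs": [],
        },
        {
            "cell_type": "code",
            "metadata": {},
            "source": ["# ...filter the HDB resale data by town and flat_type..."],
            "outputs": [],
        },
    ],
    "metadata": {},
    "nbformat": 4,
    "nbformat_minor": 5,
}

with open("resale_analysis.ipynb", "w") as f:
    json.dump(nb, f)

# Calling the notebook like a function (requires `pip install papermill`):
#
#   import papermill as pm
#   pm.execute_notebook(
#       "resale_analysis.ipynb",
#       "resale_analysis_bedok.ipynb",
#       parameters={"town": "BEDOK", "flat_type": "5 ROOM"},
#   )
#
# Each call produces a fully executed output notebook for that parameter set.

# The parameters cell is located by its tag:
param_cells = [
    c for c in nb["cells"]
    if "parameters" in c["metadata"].get("tags", [])
]
print(len(param_cells))
```

The default values in the tagged cell double as documentation: running the notebook interactively still works, while papermill overrides them for batch runs.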


Starting with Julia (using JupyterLab)

#artificialintelligence

Okay, let's start with JupyterLab. I no longer use classic Jupyter notebooks because I find the lab interface better. In the launcher, select Julia 1.5.2 in the Notebook section. When you select a kernel, a file called untitled.ipynb appears in the left pane. Now print the legendary "Hello World" in the next cell, then print 1 2 3 by typing it into the following cell, and then check whether 1 equals 2 in the cell after that. Once you have done this, your notebook should look like the screenshot. A JupyterLab instance is a collection of many notebooks.



Getting Started with Jupyter+IntelligentGraph - DataScienceCentral.com

#artificialintelligence

Since IntelligentGraph combines Knowledge Graphs with embedded data analytics, Jupyter is an obvious choice as a data analyst's IntelligentGraph workbench. Using Jupyter ISparql, we can easily perform SPARQL queries over the same IntelligentGraph created above. We do not have to use Java to script our interaction with the repository.