

Getting a Peek at the Big Data/Cloud Computing Workflow Using AWS

#artificialintelligence

Originally published on Towards AI. Although I've now had the chance to play with these different technologies, I'm still amazed by the convenience, portability, and computing power that Big Data and Cloud Computing technologies offer, both to consumers and to businesses.


A Run-down on Top 10 Open-Source Tools for Machine Learning

#artificialintelligence

Machine learning is working wonders across every industry. This disruptive technology is reshaping the way companies make decisions and deal with ever-growing data. From chatbots that answer customer queries to systems that detect transaction fraud in banks, machine learning applications are streamlining many routine processes. In the past few years, machine learning has stepped beyond company floors and into our everyday lives.


Building an open ML platform with Red Hat OpenShift and Open Data Hub Project – Red Hat OpenShift Blog

#artificialintelligence

The Open Data Hub project integrates a number of open source tools to enable an end-to-end AI/ML workflow. The Jupyter notebook ecosystem is provided as the primary interface and experience for the data scientist, mainly because Jupyter is so widely popular among data scientists today. Data scientists can create and manage their Jupyter notebook workspaces with an embedded JupyterHub. While they can create or import new notebooks, the Open Data Hub project also includes a number of pre-existing notebooks called the "AI Library".
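As a rough illustration of how JupyterHub spawns per-user notebook workspaces on a Kubernetes cluster (the mechanism Open Data Hub builds on), here is a minimal `jupyterhub_config.py` sketch. It assumes the kubespawner package is installed; the notebook image name is a placeholder, not an Open Data Hub default:

```python
# jupyterhub_config.py -- minimal sketch, assuming kubespawner is
# installed. The image name below is an illustrative placeholder.
c.JupyterHub.spawner_class = "kubespawner.KubeSpawner"

# Each user gets their own notebook pod built from this image.
c.KubeSpawner.image = "example.com/datascience-notebook:latest"

# Resource guarantees and limits for each user's workspace pod.
c.KubeSpawner.cpu_guarantee = 0.5
c.KubeSpawner.cpu_limit = 2
c.KubeSpawner.mem_guarantee = "1G"
c.KubeSpawner.mem_limit = "4G"
```

With a configuration like this, JupyterHub launches an isolated, resource-bounded pod per user, which is what makes the "create and manage workspaces" experience possible on shared cluster hardware.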


Introducing Kubeflow - A Composable, Portable, Scalable ML Stack Built for Kubernetes

#artificialintelligence

Kubernetes and Machine Learning

Kubernetes has quickly become the hybrid solution for deploying complicated workloads anywhere. While it started with just stateless services, customers have begun to move complex workloads to the platform, taking advantage of the rich APIs, reliability, and performance provided by Kubernetes. One of the fastest-growing use cases is Kubernetes as the deployment platform of choice for machine learning. Building any production-ready machine learning system involves various components, often mixing vendors and hand-rolled solutions. Connecting and managing these services for even moderately sophisticated setups introduces huge barriers of complexity when adopting machine learning.


google/kubeflow

#artificialintelligence

The Kubeflow project is dedicated to making Machine Learning on Kubernetes easy, portable, and scalable. Our goal is not to recreate other services, but to provide a straightforward way to spin up best-of-breed OSS solutions. This document details the steps needed to run the Kubeflow project in any environment in which Kubernetes runs. Our goal is to help folks use ML more easily by letting Kubernetes do what it's great at. Because ML practitioners use so many different types of tools, a key goal is that you can customize the stack to your requirements (within reason) and let the system take care of the "boring stuff." While we have started with a narrow set of technologies, we are working with many different projects to include additional tooling.
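To make the "spinning up" concrete: Kubeflow describes ML workloads as Kubernetes custom resources. The sketch below builds a minimal TFJob-style manifest as a plain Python dictionary and serializes it to JSON (which `kubectl` also accepts). The container image, job name, and replica counts are illustrative placeholders, not Kubeflow defaults:

```python
import json

def make_tfjob(name, image, workers=2):
    """Build a minimal TFJob-style custom resource as a Python dict.

    Sketch only: the image and replica counts are placeholders.
    """
    def replica(count):
        # Each replica spec embeds an ordinary pod template.
        return {
            "replicas": count,
            "template": {
                "spec": {
                    "containers": [
                        {"name": "tensorflow", "image": image}
                    ]
                }
            },
        }

    return {
        "apiVersion": "kubeflow.org/v1",
        "kind": "TFJob",
        "metadata": {"name": name},
        "spec": {
            "tfReplicaSpecs": {
                "Chief": replica(1),
                "Worker": replica(workers),
            }
        },
    }

manifest = make_tfjob("mnist-train", "example.com/mnist:latest")
print(json.dumps(manifest, indent=2))
```

The point of the sketch is the shape of the resource: once a training job is expressed this way, Kubernetes handles scheduling, restarts, and scaling, which is the "boring stuff" the README refers to.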