Jupyter Notebooks: Fundamentals of Machine Learning and Deep Learning

#artificialintelligence

Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2. You can access this material here. For other free tutorials (including from Berkeley, Harvard, Columbia, Google, Microsoft and so on), follow this link.


The Guerrilla Guide to Machine Learning with Python - Deep_In_Depth: Data Science and Deep Learning

@machinelearnbot

This repo contains an incremental sequence of notebooks designed to teach deep learning, Apache MXNet (incubating), and the gluon interface. Our goal is to leverage the strengths of Jupyter notebooks to present prose, graphics, equations, and code together in one place. If we're successful, the result will be a resource that could serve simultaneously as a book, course material, a prop for live tutorials, and a source of useful code to plagiarise (with our blessing). To our knowledge, there is no source out there that either (1) teaches the full breadth of concepts in modern deep learning or (2) interleaves an engaging textbook with runnable code. We'll find out by the end of this venture whether or not that void exists for a good reason.
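As a small taste of the gluon interface these notebooks build on, here is a minimal sketch (not taken from the repo) of defining and training a tiny regression model; the synthetic data and hyperparameters are invented for illustration.

```python
# Minimal gluon sketch: linear regression on synthetic data.
# Illustrative only; the data and hyperparameters are made up for this example.
from mxnet import nd, autograd, gluon

# Synthetic data: y = 2x + 1 plus noise
X = nd.random.normal(shape=(1000, 1))
y = 2 * X + 1 + 0.1 * nd.random.normal(shape=(1000, 1))
loader = gluon.data.DataLoader(gluon.data.ArrayDataset(X, y),
                               batch_size=64, shuffle=True)

net = gluon.nn.Dense(1)          # single linear layer
net.initialize()
loss_fn = gluon.loss.L2Loss()
trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.05})

for epoch in range(5):
    total = 0.0
    for xb, yb in loader:
        with autograd.record():
            loss = loss_fn(net(xb), yb)
        loss.backward()
        trainer.step(xb.shape[0])
        total += loss.mean().asscalar()
    print(f"epoch {epoch}: loss {total / len(loader):.4f}")
```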


JavierAntoran/Bayesian-Neural-Networks

#artificialintelligence

The project is written in Python 2.7 and PyTorch 1.0.1. If CUDA is available, it will be used automatically; the models can also run on the CPU, as they are not excessively large. We carried out homoscedastic and heteroscedastic regression experiments on toy datasets, generated with a Gaussian Process ground truth, as well as on real data (six UCI datasets). The heteroscedastic notebooks contain both the toy and UCI dataset experiments for a given model (ModelName).
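For context, heteroscedastic regression means the network predicts an input-dependent noise level alongside the mean, rather than assuming a single fixed noise variance. Below is a minimal PyTorch sketch of that idea; it is not the repo's code, and the architecture, data, and names are made up for illustration.

```python
# Minimal heteroscedastic regression sketch in PyTorch (illustrative, not the repo's code).
# The network outputs a mean and a log-variance per input and is trained with the
# Gaussian negative log-likelihood, so the predicted noise can vary with x.
import torch
import torch.nn as nn

class HeteroscedasticMLP(nn.Module):
    def __init__(self, hidden=50):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(1, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, 1)
        self.logvar_head = nn.Linear(hidden, 1)

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mean, logvar, y):
    # 0.5 * [log sigma^2 + (y - mu)^2 / sigma^2], averaged over the batch
    return 0.5 * (logvar + (y - mean) ** 2 / logvar.exp()).mean()

# Toy data: noise level grows with |x|
x = torch.linspace(-3, 3, 500).unsqueeze(1)
y = torch.sin(x) + torch.randn_like(x) * 0.1 * x.abs()

model = HeteroscedasticMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(2000):
    mean, logvar = model(x)
    loss = gaussian_nll(mean, logvar, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```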


tb0yd/rootfinder

#artificialintelligence

A fun little project to play with Jupyter Notebooks, Scikit-learn, and neural nets with Keras: the goal is to train a neural network to learn Arabic morphology. It's not very accurate (about 50%), so it's pretty addictive to work on. Surely someone, somewhere has done this better, but we aren't solving world hunger here, just having some nerdy fun. The output below is from roots-latin.py.ipynb, which uses Buckwalter Latinization at the display stage.
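To make the task concrete, one plausible shape for such a model is a character-level classifier over Buckwalter-transliterated words that predicts the three root consonants. The Keras sketch below is a guess at that kind of setup, not the repo's actual architecture; the vocabulary size, sequence length, and output encoding are placeholders.

```python
# Hypothetical sketch of a character-level Keras model for Arabic root prediction.
# Not the rootfinder repo's actual code: sizes and the label encoding are placeholders.
from tensorflow import keras
from tensorflow.keras import layers

MAX_LEN = 12     # padded word length in characters (assumed)
NUM_CHARS = 40   # size of the Buckwalter character inventory (assumed)

inputs = keras.Input(shape=(MAX_LEN,), dtype="int32")
x = layers.Embedding(NUM_CHARS, 32, mask_zero=True)(inputs)
x = layers.Bidirectional(layers.LSTM(64))(x)
# Three output heads, one softmax over characters per root consonant position
outputs = [layers.Dense(NUM_CHARS, activation="softmax", name=f"radical_{i}")(x)
           for i in range(3)]

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```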


The NLP Cypher

#artificialintelligence

The release includes the full modeling code, written in Mesh TensorFlow and designed to be run on TPUs; optimizer states, which allow you to continue training the model from where EleutherAI left off; and a Google Colab notebook that shows you how to use the code base to train, fine-tune, and sample from a model. Their notebook requires a Google Cloud Storage bucket to access the data, since TPUs can't read from local file systems. You can set up a free trial fairly easily; they provide a link in the notebook.
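In practice, the bucket requirement just means the training data and checkpoints must live at a gs:// path that the TPU workers can reach. Here is a minimal sketch of verifying that access from Python with TensorFlow's gfile API; the bucket name is a placeholder.

```python
# Quick sanity check that a GCS bucket is reachable before pointing a TPU job at it.
# "my-gptneo-bucket" is a placeholder; substitute your own bucket name.
import tensorflow as tf

BUCKET = "gs://my-gptneo-bucket"

# List the top level of the bucket; raises if the path is missing or unreadable.
print(tf.io.gfile.listdir(BUCKET))

# Write and read back a small test file to confirm write access for checkpoints.
test_path = BUCKET + "/access_check.txt"
with tf.io.gfile.GFile(test_path, "w") as f:
    f.write("ok")
with tf.io.gfile.GFile(test_path, "r") as f:
    print(f.read())
tf.io.gfile.remove(test_path)
```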