If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Amazon SageMaker makes it easy to train and deploy Machine Learning models hosted on HTTP endpoints. However, in most cases you won't expose these endpoints directly. Pre-processing and post-processing steps are likely to be required: authentication, throttling, data transformation and enrichment, logging, etc. In this post, we will use AWS Chalice to build a web service acting as a front-end for a SageMaker endpoint. I've already written a couple of posts (here and here) on training and deploying SageMaker models, so I won't go into these details again.
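To make the pre- and post-processing steps concrete, here is a minimal sketch of the kind of handler such a front-end might run. Everything here is illustrative: the `invoke_endpoint` stub stands in for the real boto3 sagemaker-runtime `invoke_endpoint` call, and the API key and CSV payload format are hypothetical placeholders.

```python
import json

def invoke_endpoint(payload: str) -> str:
    """Stand-in for boto3's sagemaker-runtime InvokeEndpoint call,
    stubbed with a fixed response so the sketch is self-contained."""
    return json.dumps({"prediction": 0.87})

def handler(event: dict) -> dict:
    """Pre- and post-processing a web-service front-end might run."""
    # pre-processing: authentication (hypothetical API key)
    if event.get("api_key") != "secret-key":
        return {"status": 403, "body": "forbidden"}
    # pre-processing: transform the request into the model's input format (CSV here)
    payload = ",".join(str(f) for f in event["features"])
    raw = invoke_endpoint(payload)
    # post-processing: parse and enrich the model's raw prediction
    score = json.loads(raw)["prediction"]
    return {"status": 200, "body": {"score": score, "model_version": "demo"}}
```

In a real Chalice app, `handler` would be decorated as a route and `invoke_endpoint` would call the deployed SageMaker endpoint over HTTPS.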
TL;DR - You can take the ML course on Coursera and you're magically a data scientist, because three really intelligent people did it. I'm not claiming the people referenced in this article are not data scientists who score high in Kaggle competitions. They're probably really intelligent people who picked up a new skill and excelled at it (although one was already an actuary, so he is basically doing machine learning in some form already). Here is my problem with it - being a data scientist usually requires a much larger skill set than a basic understanding of a few learning algorithms. I'm taking the Coursera ML course right now, and I think it is great!
For the past month, we ranked nearly 250 Machine Learning Open Source Projects to pick the Top 10. We compared projects that had a new or major release during this period. Mybridge AI ranks projects based on a variety of factors to measure their quality for professionals. Open source projects can be useful for programmers. Hope you find an interesting project that inspires you.
You can find links to all of the posts in the introduction, and a book based on the R series on Amazon. This blog post is a brief introduction to using the Keras deep learning framework to solve classic (shallow) machine learning problems. It presents a case study from my experience at Windfall Data, where I worked on a model to predict housing prices for hundreds of millions of properties in the US. I recently started reading "Deep Learning with R", and I've been really impressed with the support that R has for digging into deep learning. However, now that I'm porting my blog series to Python, I'll be using the Keras library directly, rather than the R wrapper.
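As a taste of how little code a shallow Keras model takes, here is a minimal sketch fitting a single-layer regression on synthetic data. The data and architecture are illustrative only, not the Windfall Data housing model.

```python
import numpy as np
from tensorflow import keras

# synthetic "shallow" regression problem: y is a linear function of X plus noise
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(200)

# a single Dense unit with no hidden layers is just linear regression
model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)

pred = model.predict(X, verbose=0)  # shape (200, 1)
```

Swapping in deeper architectures is a matter of adding layers to the `Sequential` list, which is what makes Keras convenient even for classic problems.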
In today's world, every customer is faced with multiple choices. For example, if I'm looking for a book to read without any specific idea of what I want, there's a wide range of possibilities for how my search might pan out. I might waste a lot of time browsing the internet and trawling through various sites hoping to strike gold. I might ask other people for recommendations. But if there were a site or app that could recommend books based on what I have read previously, that would be a massive help. Instead of wasting time on various sites, I could just log in and voila! Ten recommended books tailored to my taste. This is what recommendation engines do, and their power is being harnessed by most businesses these days. From Amazon to Netflix, Google to Goodreads, recommendation engines are among the most widely used applications of machine learning techniques. In this article, we will cover various types of recommendation engine algorithms and the fundamentals of creating them in Python. We will also see the mathematics behind the workings of these algorithms. Finally, we will create our own recommendation engine using matrix factorization.
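The matrix-factorization idea can be sketched in a few lines of numpy: approximate the sparse rating matrix R as a product of low-rank user and item factor matrices, fit by gradient descent on the observed entries only. The toy ratings, rank, learning rate, and regularization below are all illustrative choices.

```python
import numpy as np

# toy user-item rating matrix; 0 marks an unobserved rating
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 0, 5, 4]], dtype=float)

n_users, n_items, k = R.shape[0], R.shape[1], 2
rng = np.random.default_rng(0)
P = rng.random((n_users, k))          # latent user factors
Q = rng.random((n_items, k))          # latent item factors

lr, reg = 0.01, 0.02
observed = R > 0
for _ in range(5000):
    E = observed * (R - P @ Q.T)      # error only on observed entries
    P += lr * (E @ Q - reg * P)       # gradient step with L2 regularization
    Q += lr * (E.T @ P - reg * Q)

pred = P @ Q.T                        # fills in the missing ratings
```

The zeros in `R` come back as predicted ratings in `pred`, which is exactly the "10 recommended books" step: recommend the unrated items with the highest predicted scores.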
Deep Learning A-Z: Hands-On Artificial Neural Networks by Kirill Eremenko and Hadelin de Ponteves will teach you Deep Learning with Artificial Neural Networks. You will work with TensorFlow and PyTorch to build several different types of Neural Networks. Data Science: Deep Learning in Python by Lazy Programmer Inc. will teach you to build Neural Networks from scratch in Python, NumPy & TensorFlow. You will learn about the various types of neural networks and the terms associated with them. Natural Language Processing with Deep Learning in Python by Lazy Programmer Inc. will teach you everything about deriving and implementing word2vec, GloVe, word embeddings, and sentiment analysis with recursive nets.
One of the important fields of Artificial Intelligence is Computer Vision. Computer Vision is the science of computers and software systems that can recognize and understand images and scenes. Computer Vision comprises various tasks such as image recognition, object detection, image generation, image super-resolution, and more. Object detection is probably the most profound aspect of computer vision due to the number of practical use cases. In this tutorial, I will briefly introduce modern object detection, the challenges software developers face, the solution my team has provided, as well as code tutorials for performing high-performance object detection.
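One concept that recurs throughout object detection, whatever framework you use, is intersection over union (IoU): the overlap metric used to match predicted bounding boxes against ground truth. A minimal implementation might look like this (the `(x1, y1, x2, y2)` corner format is an assumption; some libraries use center/width/height instead):

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    # corners of the intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    # clamp to zero when the boxes don't overlap at all
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

Detections are typically counted as correct when their IoU with a ground-truth box exceeds a threshold such as 0.5, and the same metric drives non-maximum suppression.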
Hear the story of how we used Python to build an AI that plays Super StreetFighter II on the Super NES. We'll cover how Python provided the key glue between the SNES emulator and the AI, and how the AI was built with gym, keras-rl and tensorflow. We'll show examples of game play and training, and talk about which bot beat which bot in the bot-v-bot tournament we ran. After this talk you'll know how easy it is to use Python and Python's machine learning libraries to teach a computer to play games. You'll see a practical example of the same type of machine learning used by AlphaGo, and also get to find out which character in StreetFighter II is best to pick when playing your friends.
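The gym/keras-rl/tensorflow stack is heavyweight, but the core reinforcement-learning loop it wraps can be sketched with tabular Q-learning in pure Python. The five-state "walk right to win" environment below is a made-up stand-in for illustration, not the StreetFighter setup from the talk:

```python
import random

# toy deterministic "game": states 0..4 on a line; reaching state 4 wins
N_STATES, ACTIONS = 5, (0, 1)   # action 0 = move left, 1 = move right

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(N_STATES - 1, state + 1)
    done = nxt == N_STATES - 1
    return nxt, (1.0 if done else 0.0), done

Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, eps = 0.5, 0.9, 0.2   # learning rate, discount, exploration
random.seed(0)

for _ in range(500):                # training episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < eps \
            else max(ACTIONS, key=lambda act: Q[s][act])
        s2, r, done = step(s, a)
        # Q-learning update toward reward plus discounted best next value
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# the learned greedy policy should always move right
policy = [max(ACTIONS, key=lambda act: Q[s][act]) for s in range(N_STATES - 1)]
```

keras-rl replaces the Q table with a neural network and gym replaces `step` with the emulator, but the update rule is the same family of technique.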
Developed by a team at Google, TensorFlow is one of the most advanced Python frameworks for machine learning, implementing deep learning algorithms. It is a second-generation, open-source system whose predecessor was the less flexible DistBelief. Despite its steep learning curve, TensorFlow provides developers with a broad set of capabilities (alternatively, you can choose from other popular machine learning frameworks with similarly steep learning curves, like Theano). In particular, TensorFlow features tools for analyzing input data both against reference datasets and against data previously gathered during interaction with particular users. Although TensorFlow's results are characterized by a high level of precision, developers usually prefer not to use it in scientific software development.