
"Not hotdog" vs. mission-critical AI applications for the enterprise

#artificialintelligence

Artificial intelligence has come a long way since the concept was introduced in the 1950s. Until recently, the technology had an aura of intrigue, and many believed its place was strictly inside research labs and science fiction novels. Today, however, the technology has become very approachable. The popular TV show Silicon Valley recently featured an app called "Not Hotdog," based on cutting-edge machine learning frameworks, showcasing how easy it is to create a deep learning application.
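
The show's joke app doubles as a concrete illustration of how approachable deep learning has become. The sketch below fine-tunes a pretrained network into a binary hotdog classifier; it is only an illustration, not the approach the show's app actually used, and the data/ folder layout, MobileNetV2 backbone, and three-epoch training run are all assumptions made for the example.

```python
# Minimal "not hotdog" classifier via transfer learning (TensorFlow/Keras).
# Assumed folder layout: data/hotdog/*.jpg and data/not_hotdog/*.jpg
import tensorflow as tf

# Load labeled images from disk; the two class labels come from folder names.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data", label_mode="binary", image_size=(224, 224), batch_size=32)

# Reuse an ImageNet-pretrained MobileNetV2 as a frozen feature extractor.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False,
    weights="imagenet", pooling="avg")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    base,
    tf.keras.layers.Dense(1, activation="sigmoid"),      # hotdog vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=3)  # a few epochs is often enough for a toy demo
```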


A beginner's guide to AI: Computer vision and image recognition

#artificialintelligence

This is the second story in our continuing series covering the basics of artificial intelligence. While it isn't necessary to read the first article, which covers neural networks, doing so may add to your understanding of the topics covered here. Teaching a computer how to "see" is no small feat. You can slap a camera on a PC, but that won't give it sight. For a machine to actually view the world the way people or animals do, it relies on computer vision and image recognition.
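
To make "image recognition" concrete, here is a minimal sketch that asks an ImageNet-pretrained network to label a single photo. TensorFlow/Keras and MobileNetV2 are illustrative choices rather than anything prescribed by the article, and the file name dog.jpg is a placeholder.

```python
# Classify one image with an ImageNet-pretrained network.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, preprocess_input, decode_predictions)

model = MobileNetV2(weights="imagenet")  # downloads pretrained weights

# Load and preprocess the image the way this network expects.
img = tf.keras.utils.load_img("dog.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(tf.keras.utils.img_to_array(img), axis=0))

# decode_predictions turns raw scores into (class_id, label, probability).
for _, label, prob in decode_predictions(model.predict(x), top=3)[0]:
    print(f"{label}: {prob:.2%}")
```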


The ridiculous Not Hotdog app from 'Silicon Valley' is real

Engadget

Our long national nightmare is over: thanks to HBO and Silicon Valley, there's finally an app that will tell you whether the object you point your phone's camera at is a hot dog or not. For fans of the show, it's a cute joke, but everyone else might be a little puzzled. As a brief bit of background, T.J. Miller's character Erlich Bachman accidentally invested in an app he thought had something to do with Oculus when, in actuality, it was an application with recipes for preparing octopus, nothing to do with virtual reality. A common mistake, to be sure. That led to pivoting the app to become the "Shazam of food."


AI Is Not Magic. How Neural Networks Learn

#artificialintelligence

In my previous blog post, I claimed that "AI is not magic." In this post, my goal is to discuss how neural networks learn, and show that AI isn't a crystal ball or magic, just science and some very slick mathematics. I'll keep this very high level. Let's start with a hypothetical scenario. Suppose we are building an app to identify hot dogs.
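
To ground that scenario, here is learning in miniature: a single sigmoid neuron fit by gradient descent. The two features and four training examples are invented purely for illustration; the point is the loop of predict, measure error, and nudge the weights against the gradient.

```python
# One sigmoid neuron trained by gradient descent on made-up 2-feature
# examples (say, redness and elongation), labeled 1 = hot dog, 0 = not.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3]])  # features
y = np.array([1.0, 1.0, 0.0, 0.0])                              # labels

w, b = rng.normal(size=2), 0.0   # start from random weights
lr = 1.0                         # learning rate

for step in range(500):
    z = X @ w + b
    p = 1.0 / (1.0 + np.exp(-z))     # forward pass: predicted probability
    grad_z = (p - y) / len(y)        # gradient of cross-entropy loss wrt z
    w -= lr * (X.T @ grad_z)         # nudge weights against the gradient
    b -= lr * grad_z.sum()

print(np.round(p, 3))  # predictions drift toward the labels: ~[1, 1, 0, 0]
```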


Video: R for AI, and the Not Hotdog workshop

#artificialintelligence

Earlier this year at QCon.ai, I also presented an interactive workshop that used R and the Microsoft Cognitive Services APIs to automatically generate captions for images and to build a tool that recognizes images of hot dogs. Video of both the presentation and the workshop (which starts at the 10:00 mark) is now available on the QCon.ai site. You can find the slides for the presentation here. The R code for the "Not Hotdog" workshop is available as an Azure Notebook, which you can clone and use to follow along with the workshop.
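
The workshop code itself is in R, but the caption-generating call is easy to sketch in Python as well. This assumes the Computer Vision v3.2 "analyze" REST endpoint; the ENDPOINT, KEY, and image URL values are placeholders for your own Azure resource, and the tag-based hotdog check is a crude stand-in for the workshop's recognizer, not its actual code.

```python
# Ask the Computer Vision REST API to describe and tag an image.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR-SUBSCRIPTION-KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description,Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY,
             "Content-Type": "application/json"},
    json={"url": "https://example.com/maybe-a-hotdog.jpg"},     # placeholder
)
resp.raise_for_status()
analysis = resp.json()

# The service returns candidate captions plus tags; a crude "not hotdog"
# check just looks for a hotdog tag among them.
print(analysis["description"]["captions"][0]["text"])
tags = {t["name"] for t in analysis["tags"]}
print("Hotdog!" if tags & {"hot dog", "hotdog"} else "Not hotdog.")
```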