Joe, Florian and Sebastian on the Indy Autonomous Challenge

#artificialintelligence

My name is Roland Meertens, editor for AI and machine learning at InfoQ and product manager at Annotell. Today I will host the podcast, and I will be talking with Joe, Sebastian, and Florian about the Indy Autonomous Challenge and how they managed to win it. Could you maybe introduce yourselves to the listeners and tell them what the Indy Autonomous Challenge is?

Joe Speed: Sure, happy to. I'm a technical advisor for the Indy Autonomous Challenge, which is an amazing university challenge for autonomous racing.

Florian: I was part of the TUM Autonomous Motorsport team. We managed to win the challenge in the end, and my main responsibility was the mapping and localization part.

Sebastian: And my main responsibility in our team was the perception, mainly the object detection.

Roland Meertens: All right, maybe we can get started with you, Joe. Maybe you can say a bit more about what this Indy Autonomous Challenge is.

Joe Speed: It's an amazing program. A lot of this is anecdotal. Sebastian Thrun, who is very much like the godfather of modern autonomous driving, had won the DARPA Grand Challenge. He was out at Indy and had commented something like, "Some of the things happening in autonomy are not that exciting to me anymore, but if the Indy 500 was autonomous, that would be interesting."


microsoft/ML-For-Beginners

#artificialintelligence

Azure Cloud Advocates at Microsoft are pleased to offer a 12-week, 24-lesson curriculum all about Machine Learning. In this curriculum, you will learn about what is sometimes called classic machine learning, using primarily Scikit-learn as a library and avoiding deep learning, which is covered in our forthcoming 'AI for Beginners' curriculum. Travel with us around the world as we apply these classic techniques to data from many areas of the world. Each lesson includes pre- and post-lesson quizzes, written instructions to complete the lesson, a solution, an assignment, and more. Our project-based pedagogy allows you to learn while building, a proven way for new skills to 'stick'.
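To give a flavour of the "classic machine learning" workflow the lessons build on, here is a minimal Scikit-learn sketch; the dataset and model are our illustrative choices, not taken from any specific lesson.

```python
# A minimal sketch of a classic Scikit-learn workflow: load data,
# split it, train a simple model, and measure accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=200)  # classic ML: no deep learning needed
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```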


Neural Nets And Game Boy Cameras

#artificialintelligence

Released in 1998, the Game Boy Camera was perhaps the first digital camera many young hackers got their hands on. Around the time Sony Mavica cameras were shoving VGA-resolution pictures onto floppy disks, the Game Boy Camera was snapping 128×112-resolution pictures and displaying them on the Game Boy's 160×144 display. The picture quality was terrible, but [Roland Meertens] recently had an idea: why not use neural networks to turn these Game Boy Camera pictures into photorealistic images? Neural networks, deep learning, machine learning, or whatever other buzzwords we're using all require training data.
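One plausible way to manufacture that training data is to degrade ordinary photos into Game Boy Camera-style images, giving paired input/target examples for an image-to-image network. Below is a hedged Pillow sketch; the exact resolution, quantization, and file names are assumptions for illustration, not details from the original project.

```python
# Degrade a photo to Game Boy Camera style: grayscale, low resolution,
# and only four gray levels (the camera's sensor output is 2-bit).
from PIL import Image

def to_gameboy_style(path, size=(128, 112)):
    img = Image.open(path).convert("L").resize(size)  # grayscale, low-res
    # Quantize 0..255 down to four levels: 0, 85, 170, 255.
    return img.point(lambda p: (p // 64) * 85)

# Hypothetical filenames: the original photo and its degraded twin
# together form one training pair (input: degraded, target: original).
to_gameboy_style("photo.jpg").save("gameboy.png")
```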


Basics of Deep Learning: No Math Required

#artificialintelligence

Recently, deep learning has shattered all records when it comes to machine learning. In this short talk you will gain a basic understanding of the two simplest types of layers: the dense layer and the convolutional layer. Roland Meertens is a Machine Learning Engineer at Autonomous Intelligent Driving. This video was recorded at QCon.ai 2018: https://bit.ly/2piRtLl
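For readers who want to see the two layer types side by side, here is a minimal Keras sketch; the framework and the 28×28 input shape are our illustrative choices, as the talk itself is not tied to any particular library.

```python
# The two simplest layer types: convolutional and dense.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    # Convolutional layer: slides small learned filters over the image,
    # picking up local patterns such as edges and textures.
    layers.Conv2D(16, kernel_size=3, activation="relu"),
    layers.Flatten(),
    # Dense layer: every input is connected to every output neuron.
    layers.Dense(10, activation="softmax"),
])
model.summary()
```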


Papers in Production Lightning Talks

#artificialintelligence

Shoup: I'm going to share very little of my personal knowledge, in fact, none of it, but I'm going to talk about a cool paper that I really like. Then Gwen [Shapira] is going to talk about another cool paper, and Roland [Meertens] is going to talk about yet another cool paper. The one I want to talk about is a paper about using machine learning to do database indexing better.

This is a picture of my bookshelf at home. A while ago, I bought myself a box set of "The Art of Computer Programming," which has basically all of computer science's algorithms, written or assembled by Don Knuth. There's a volume 4A, so he's still working on completing the thing; hopefully that will happen.

When we're choosing a data structure, typically we're choosing it this way: we look for time complexity, how fast is it going to run, and space complexity, how big is it going to be? We typically evaluate those things asymptotically; we're not looking as much at real-world workloads, but at the complexity characteristics of the thing at the limit, when things get very large. We're also, and this is critical, looking at those things without having seen the data and typically without having seen the usage pattern. What we're doing is saying: what is the least-worst time and space complexity, given an arbitrary data distribution and an arbitrary usage pattern? It seems like we could do a little better than that, and that's what this paper is about. What we'd like to be able to ask, and to answer, is: how could we achieve the best time/space complexity given a specific real-world data distribution and a specific real-world usage pattern?
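The paper being described appears to be "The Case for Learned Index Structures" (Kraska et al.). Its core idea can be sketched in a few lines: treat an index as a model that maps a key to its position in a sorted array, and correct the model's prediction with a bounded local search. A real learned index uses a hierarchy of models; the single least-squares line below is purely illustrative.

```python
# Hedged sketch of a learned index: a model predicts where a key lives,
# and the worst-case training error bounds the follow-up search.
import bisect

keys = sorted([3, 7, 12, 45, 88, 90, 120, 300, 301, 555])
n = len(keys)

# "Train": fit position ~= a * key + b by least squares on (key, position).
mean_k = sum(keys) / n
mean_p = (n - 1) / 2
a = sum((k - mean_k) * (i - mean_p) for i, k in enumerate(keys)) \
    / sum((k - mean_k) ** 2 for k in keys)
b = mean_p - a * mean_k

# Worst-case prediction error over the training keys gives a safe window.
pad = int(max(abs(i - (a * k + b)) for i, k in enumerate(keys))) + 1

def lookup(key):
    guess = int(a * key + b)                       # model prediction
    lo, hi = max(0, guess - pad), min(n, guess + pad + 1)
    i = bisect.bisect_left(keys, key, lo, hi)      # bounded local search
    return i if i < n and keys[i] == key else None

print(lookup(88))   # -> 4, its position in the sorted array
print(lookup(100))  # -> None, the key is absent
```

The point of the sketch: the model is tuned to this specific data distribution, so the search window can be far smaller than the whole array, which is exactly the advantage over a one-size-fits-all structure chosen on asymptotic grounds.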