Anyone who's circled a busy parking lot or city block knows that finding an open spot can be tricky, and it can turn a quick trip to the store into a high-stress ordeal. To park in these environments, autonomous vehicles need a visual perception system that can detect an open spot under a variety of conditions. Such a system must perceive both indoor and outdoor spaces, separated by single, double or faded lane markings, and differentiate between occupied, unoccupied and partially obscured spots -- all under varying lighting conditions. Not every parking space is a perfect rectangle.
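To make the three-way classification concrete, here is a deliberately tiny toy sketch, not anything a production perception stack would use. Real systems run neural networks on camera images; this hypothetical example uses nearest-centroid classification on two made-up features (mean brightness of the spot, fraction of edge pixels), and every number in it is an illustrative assumption.

```python
# Toy nearest-centroid classifier for the three spot states the text
# describes: occupied, unoccupied, partially obscured. Feature values
# and class prototypes are invented for illustration only.
from math import dist

# Class prototypes in (mean_brightness, edge_fraction) feature space.
CENTROIDS = {
    "unoccupied":         (0.80, 0.05),  # bright pavement, few edges
    "occupied":           (0.35, 0.60),  # car body: darker, edge-rich
    "partially_obscured": (0.55, 0.30),  # somewhere in between
}

def classify_spot(features):
    """Return the class whose centroid is nearest to the feature pair."""
    return min(CENTROIDS, key=lambda c: dist(features, CENTROIDS[c]))

print(classify_spot((0.78, 0.08)))  # -> unoccupied
print(classify_spot((0.30, 0.65)))  # -> occupied
```

The point is only that the perception problem bottoms out in a per-spot decision; everything hard (lighting, faded markings, odd geometry) lives in producing features robust enough for that decision.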
I've given a few runs of my presentation on Driverless Cars in #Australia, #Malaysia and #NewZealand. One thing I've noticed is how engaged and excited the audience becomes when they see a #Tesla Model S driving itself for the first time. What I want to achieve in this blog is to share some of that magic with you and show you some of the graphics and features I present to the audience. The key to the driverless car is the Tesla Enhanced Autopilot system, which provides a number of automatic features. New features are added as software upgrades, with the Autopilot hardware already in place to support years of automation ahead.
It's rare that we get extended footage of autonomous cars driving in a series of situations. Nvidia, the technology company that makes graphics cards and computers for autonomous car development, published a paper detailing a new deep learning system that teaches itself how to drive with "minimum training data from humans." The paper also included video footage, from which we are pleased to present the highlights. The algorithm learned to detect roadways on its own, based on examples of driving provided by researchers. It classifies roads so well that it can keep driving even where there are no painted lines, which is how many current autonomous systems judge the edge of the road.
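The core technique behind this kind of system is behavior cloning: supervised learning that maps camera frames directly to steering commands, using recorded human driving as labels. The sketch below shrinks that idea to a linear model on synthetic "frames" so the whole training loop fits in a few lines; it is a hypothetical illustration of the concept, not Nvidia's network or code.

```python
# Toy behavior-cloning sketch: learn steering directly from pixels,
# using (frame, steering) pairs as a human driver would provide them.
import numpy as np

rng = np.random.default_rng(0)

# 200 synthetic flattened 8x8 "camera frames".
frames = rng.standard_normal((200, 64))

# Invented ground-truth steering rule: intensity on the right half of
# the frame steers right, on the left half steers left (a stand-in for
# whatever visual cues a real network would learn to pick up).
true_w = np.concatenate([-np.ones(32), np.ones(32)])
steering = frames @ true_w          # recorded "human" steering labels

# Behavior cloning = plain supervised regression on those pairs.
w = np.zeros(64)
lr = 0.1
for _ in range(2000):
    pred = frames @ w
    grad = 2 * frames.T @ (pred - steering) / len(frames)
    w -= lr * grad
# After training, w closely matches the demonstrated steering rule.
```

In the real system a deep convolutional network replaces the linear model, but the training signal is the same: no hand-coded lane-marking detector, just examples of how humans steered in each scene.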
Nvidia's automotive ambitions seemed focused solely on creating a platform to enable fully autonomous vehicles, notably the robotaxis that so many companies hope to deploy in the coming decade. It turns out that Nvidia has also been working on a more near-term product, one that opens it up to a different segment of the automotive industry. The company announced Monday at CES 2019 that it has launched Nvidia Drive AutoPilot, a reference platform that automakers can use to bring more sophisticated automated driving features to their production vehicles. This is not a self-driving car product, although it will likely be misinterpreted as such. The Drive AutoPilot system is meant to make the advanced driver assistance systems in today's cars even better.
The race between two key chip makers to put self-driving cars on the streets is heating up. And it's no less entertaining, involving the likes of BB8, a self-driving Nvidia car named after a droid in Star Wars, and the company's automotive supercomputer Xavier, named after an X-Men superhero. These two playfully named products are a big part of Nvidia's ambitious plan to put a fleet of self-driving cars on the streets by 2020. Nvidia is collaborating with Audi to develop these autonomous cars, which will be based on the Drive-PX computer. Nvidia's announcement comes just a day after Intel and BMW said they would put 40 self-driving cars on the street by the end of this year.