Uber and Volvo announced an agreement under which Uber will buy, over time, up to 24,000 specially built Volvo XC90s that will run Uber's self-driving software and, presumably, offer rides to Uber customers. While the rides are some time away, people have taken note of this for several reasons. I'm not clear who originally said it -- I first heard it from Marc Andreessen -- but "the truest form of a partnership is called a purchase order." In spite of the scores of partnerships and joint ventures announced for PR purposes in the robocar space, this is a big deal, but it's a sign of the sort of deal car makers have been afraid of. Volvo will be primarily a contract manufacturer here, while Uber will own the special sauce that makes the vehicle work, and it will own the customer.
Governor Andrew Cuomo of the State of New York declared last month that New York City will join 13 other states in testing self-driving cars: "Autonomous vehicles have the potential to save time and save lives, and we are proud to be working with GM and Cruise on the future of this exciting new technology." For General Motors, this represents a major milestone in the development of its Cruise software, since the knowledge gained on Manhattan's busy streets will be invaluable in accelerating its deep learning technology. In the spirit of one-upmanship, Waymo went one step further by declaring this week that it will be the first car company in the world to ferry passengers completely autonomously (without human engineers safeguarding the wheel). As unmanned systems speed toward consumer adoption, one challenge that Cruise, Waymo and others may encounter within the busy canyons of urban centers is the loss of Global Positioning System (GPS) satellite data. Robots require a complex suite of coordinating data systems that bounce between orbiting satellites to provide positioning and communication links to accurately navigate our world.
In this interview, Gerdes discusses developing a model for high-performance control of a vehicle; their autonomous race car, an Audi TTS named 'Shelley,' and how its autonomous performance compares to amateur and professional race car drivers; and an autonomous, drifting DeLorean named 'MARTY.' Chris Gerdes is a Professor of Mechanical Engineering at Stanford University, Director of the Center for Automotive Research at Stanford (CARS) and Director of the Revs Program at Stanford. His laboratory studies how cars move, how humans drive cars and how to design future cars that work cooperatively with the driver or drive themselves. When not teaching on campus, he can often be found at the racetrack with students, instrumenting historic race cars or trying out their latest prototypes for the future.
The proposed regulations preempt state regulation of vehicle design, and allow companies to apply for high volume exemptions from the standards that exist for human-driven cars. There is a new research area known as "explainable AI" which hopes to bridge this gap and make it possible to document and understand why machine learning systems operate as they do. The most interesting proposal in the prior document was a requirement for public sharing of incident and crash data so that all teams could learn from every problem any team encounters. The new document calls for a standard data format, and makes general motherhood calls for storing data in a crash, something everybody already does.
One immediately positive thing is that private robocars, once they have taken their owners to safety, could drive back into the evacuation zone as temporary fleet cars and fetch other people, starting with those selected by the car's owner, but also members of the public needing assistance. Cars might ferry people from homes to stations where robotic buses (including those from other cities, and human-driven buses) could carry lots of people. The good thing is, if you can imagine it, so can the teams building test systems for robocars. If the data networks are up, they could get information in real time on road problems and disaster situations.
For almost any proposal I have seen for how we might make infrastructure "robocar ready" there is a far cheaper and faster-to-develop solution that involves having the cars get smarter. Indeed, almost all the activity of infrastructure maintainers should focus on maintaining the virtual infrastructure instead. They should work to make sure roads are never changed without the change being logged in a database, that road signs are all logged in databases, and that new signs don't go into force until logged in the databases. The budget size of many of the EU and Japanese funded projects, for example, far exceeded the budget of Google's early efforts, yet Google produced an impressive car while the EU projects produced only minor results.
In the race to develop self-driving technology, Chinese Internet giant Baidu unveiled its 50 partners in an open source development program, revised its timeline for introducing autonomous driving capabilities on open city roads, described the Project Apollo consortium and its goals, and declared Apollo to be the 'Android of the autonomous driving industry'. It will start test-driving in restricted environments immediately, before gradually introducing fully autonomous driving capabilities on highways and open city roads by 2020. China has set a goal for 10 to 20 percent of vehicles to be highly autonomous by 2025, and for 10 percent of cars to be fully self-driving by 2030. Baidu wants to provide the technology to get those vehicles on the roads in China, the world's biggest auto market, with the hope that the same technology, embedded in exported Chinese vehicles, can then conquer the United States.
Hansman and Hoburg are co-instructors for MIT's Beaver Works project, a student research collaboration between MIT and the MIT Lincoln Laboratory. Hansman and Hoburg worked with MIT students to design a long-duration UAV as part of a Beaver Works capstone project -- typically a two- or three-semester course that allows MIT students to design a vehicle that meets certain mission specifications, and to build and test their design. The researchers came to their conclusions after modeling the problem using GPkit, a software tool developed by Hoburg that allows engineers to determine the optimal design decisions or dimensions for a vehicle, given certain constraints or mission requirements. In the fall of 2016, the team built a prototype UAV, following the dimensions determined by students using Hoburg's software tool.
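To give a feel for the kind of problem GPkit solves, here is a toy sketch of constrained design optimization: pick the battery mass and wing area that minimize a UAV's total mass while meeting an endurance requirement and a lift constraint. This is not GPkit's actual API (GPkit solves geometric programs analytically); the models and numbers below are illustrative assumptions only, solved here by brute-force grid search.

```python
# Toy sketch of constrained UAV design optimization in the spirit of GPkit.
# NOT GPkit's real API; the physical models and coefficients are invented
# for illustration only.

def endurance_hours(battery_kg, wing_area_m2):
    # Hypothetical model: endurance grows with battery mass,
    # shrinks with the drag of a larger wing.
    return 10.0 * battery_kg / (1.0 + 0.5 * wing_area_m2)

def total_mass(battery_kg, wing_area_m2):
    structure = 2.0 + 1.5 * wing_area_m2   # airframe mass grows with wing
    return structure + battery_kg

def lift_ok(battery_kg, wing_area_m2):
    # Toy rule of thumb: need enough wing area to lift the total mass.
    return wing_area_m2 >= 0.1 * total_mass(battery_kg, wing_area_m2)

best = None
for b in [x * 0.5 for x in range(1, 21)]:        # battery mass 0.5..10 kg
    for s in [x * 0.25 for x in range(1, 41)]:   # wing area 0.25..10 m^2
        if endurance_hours(b, s) >= 24.0 and lift_ok(b, s):
            m = total_mass(b, s)
            if best is None or m < best[0]:
                best = (m, b, s)

if best:
    print(f"mass={best[0]:.2f} kg, battery={best[1]} kg, wing={best[2]} m^2")
```

The appeal of a tool like GPkit over a sweep like this is that it finds the true optimum over continuous variables and scales to dozens of coupled constraints, which is what makes it useful for a full aircraft design.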
Nathan is a Reader in the Department of Computer Science at the University of Warwick, whose research into the application of machine learning for autonomous vehicles (or "driverless cars") has been supported by a Royal Society University Research Fellowship. My research uses machine learning to give insights into how objects or people interact and how patterns emerge and evolve. Machine learning algorithms will examine previous behaviours and learn from these behaviours, to then predict what will happen in the future. An accurate algorithm could then be used to inform the decisions vehicles make and predict vehicle journeys and routes.
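As a minimal illustration of the "learn from previous behaviours, then predict" idea (my own toy sketch, not Nathan's actual models), the snippet below fits a straight line to a vehicle's observed positions over time and extrapolates where it will be two time steps from now:

```python
# Toy behaviour prediction: learn from a vehicle's past positions and
# predict its future position with a per-coordinate least-squares line fit.
# Data and model are illustrative assumptions only.

def fit_line(ts, xs):
    """Least-squares fit x = a*t + b over observed times and positions."""
    n = len(ts)
    mean_t = sum(ts) / n
    mean_x = sum(xs) / n
    cov = sum((t - mean_t) * (x - mean_x) for t, x in zip(ts, xs))
    var = sum((t - mean_t) ** 2 for t in ts)
    a = cov / var
    b = mean_x - a * mean_t
    return a, b

# Observed track at t = 0..4: roughly 1.0 east and 0.5 north per step
ts = [0, 1, 2, 3, 4]
east = [0.0, 1.1, 1.9, 3.0, 4.0]
north = [0.0, 0.4, 1.1, 1.5, 2.0]

ae, be = fit_line(ts, east)
an, bn = fit_line(ts, north)

# Predict the position two steps into the future (t = 6)
print((round(ae * 6 + be, 2), round(an * 6 + bn, 2)))
```

Real systems replace the straight-line model with learned models rich enough to capture interactions between road users, but the pattern is the same: fit to observed behaviour, then query the fit about the future.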
We are used to thinking that a single UAV will transport only a single small box. In this scenario, cooperative UAV teams could play a key role in the industry. Cooperative transportation systems require specific control and path-planning strategies compared to single robots. Linking flying vehicles through a deformable solid is not a new phenomenon: aerial refuelling already exists, but with manned vehicles.
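One small piece of what a cooperative controller must reason about can be sketched with simple statics (an illustration of the load-sharing problem, not a real controller): when two UAVs carry a rigid beam payload, each one's share of the weight depends on where the payload's centre of gravity sits between the attachment points.

```python
# Toy load-sharing computation for two UAVs carrying a beam payload.
# Pure statics for illustration; a real cooperative transport system
# also needs coordinated path planning and dynamic control.

def load_share(total_weight_n, attach_left_m, attach_right_m, cg_m):
    """Lever rule: split the payload's weight between the two attachment
    points based on where its centre of gravity lies along the beam."""
    span = attach_right_m - attach_left_m
    right_share = (cg_m - attach_left_m) / span
    left_share = 1.0 - right_share
    return total_weight_n * left_share, total_weight_n * right_share

# 100 N payload, UAVs attached at 0 m and 2 m, CG at 0.5 m from the left
left, right = load_share(100.0, 0.0, 2.0, 0.5)
print(left, right)  # → 75.0 25.0
```

Even this toy shows why single-vehicle planners don't transfer directly: an off-centre payload forces asymmetric thrust, so each UAV's trajectory and control effort depend on its teammates.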