How soon will we have access to vehicles that don't require human control? Are driverless cars just around the corner? What will our travel be like if we're spending a lot less time behind the wheel? What technology actually makes autonomous driving possible? What is autonomous driving, anyway, and what do the different levels entail?
U.S. vehicle safety regulators have said the artificial intelligence system piloting a self-driving Google car could be considered the driver under federal law, a major step toward ultimately winning approval for autonomous vehicles on the roads. The National Highway Traffic Safety Administration told Google, a unit of Alphabet Inc, of its decision in a previously unreported Feb. 4 letter to the company posted on the agency's website this week. Google's self-driving car unit on Nov. 12 submitted a proposed design for a self-driving car that has "no need for a human driver," the letter to Google from National Highway Traffic Safety Administration Chief Counsel Paul Hemmersbaugh said. At a Senate hearing, representatives of General Motors and Delphi touted numerous safety and environmental benefits of autonomous vehicles. In January, the US National Highway Traffic Safety Administration (NHTSA) said it may waive some vehicle safety rules to allow more driverless cars to operate on US roads.
Once the update arrives, Tesla vehicles will be able to drive themselves in a city the way they can perform highway cruising now, the company said. That means interpreting stop signs and traffic lights, making sharp turns, and navigating stop-and-go urban traffic and other obstacles -- a far more difficult task than navigating long, relatively straight stretches of highway. Although Tesla's website has promised features as soon as this year, including the ability to recognize and react to traffic lights and stop signs and what it calls "Automatic driving on city streets," the suite would still require a human driver behind the wheel. As soon as next year, Tesla has said, the cars will be able to operate reliably on their own, even allowing the driver to fall asleep. This tiered approach differs from that of companies such as Waymo, whose sole aim is to launch autonomous vehicles that do not need a driver behind the wheel.
Last year, a Florida man became the first person to die in a crash involving autonomous driving technology. Forty-year-old Joshua Brown had his hands off the wheel when his car slammed into a semi-trailer making a left turn across his lane. The incident caused considerable consternation in the media, not least for underlining the glaring absence of autonomous vehicle (AV) regulations at the time. "The fatal crash," the Los Angeles Times said, "highlighted what some say is a gaping pothole on the road to self-driving vehicles: the lack of federal rules." The newspaper had a point.
ANN ARBOR, MICHIGAN – The Trump administration on Tuesday unveiled updated safety guidelines for self-driving cars aimed at clearing barriers for automakers and tech companies wanting to get test vehicles on the road. The new voluntary guidelines announced by U.S. Transportation Secretary Elaine Chao update policies issued last fall by the Obama administration, which were also largely voluntary. Chao emphasized that the guidelines aren't meant to force automakers to use certain technology or meet stringent requirements. Instead, they're designed to clarify what vehicle developers and states should consider as more test cars reach public roads. "We want to make sure those who are involved understand how important safety is," Chao said during a visit to an autonomous vehicle testing facility at the University of Michigan.