For California state officials, the new federal guidelines on testing and deployment of driverless cars come as a bit of a relief. Until this week, the absence of U.S. government guidance had left the state Department of Motor Vehicles -- generally in charge of registering vehicles and issuing drivers' licenses -- to take the lead role in drafting regulations to ensure the safety of self-driving vehicles. Though the federal guidelines issued Tuesday are short on specifics, the Department of Transportation will take responsibility for regulating the driving hardware and software, and it has devised a model state policy that probably will take the pressure off individual state agencies. That policy, issued jointly by the Department of Transportation and the National Highway Traffic Safety Administration, could result in changes to current California draft regulations on autonomous vehicles. "You can imagine how the California DMV would be struggling, with no technological background or engineers at their disposal, trying to figure out whether a particular autonomous vehicle is or is not safe enough to be deployed," said Robert Peterson, a law professor at Santa Clara University.
U.S. vehicle safety regulators have said the artificial intelligence system piloting a self-driving Google car could be considered the driver under federal law, a major step toward ultimately winning approval for autonomous vehicles on the roads. The National Highway Traffic Safety Administration told Google, a unit of Alphabet Inc, of its decision in a previously unreported Feb. 4 letter to the company posted on the agency's website this week. Google's self-driving car unit on Nov. 12 submitted a proposed design for a self-driving car that has "no need for a human driver," the letter to Google from National Highway Traffic Safety Administration Chief Counsel Paul Hemmersbaugh said. At a Senate hearing, representatives of General Motors and Delphi touted numerous safety and environmental benefits of autonomous vehicles. In January, NHTSA said it may waive some vehicle safety rules to allow more driverless cars to operate on U.S. roads.
Alex Khizhniak, director of Technical Evangelism at IT services provider Altoros, stated, "Being connected to other cars on the road will eventually make driving much safer. Combined with predictive analysis, smart systems could substitute for a driver in case of emergency. Although these technologies are still developing - and some legislation should also be introduced - the future looks promising for self-driving and intelligent driving assistants." While many have been vocal about their concerns regarding the regulation of autonomous or connected cars, there are advantages that must be considered before delving into the risks. One key benefit of connected cars is that they could contribute to safer traffic patterns in cities facing congestion as a consequence of rapid urbanization.
ANN ARBOR, MICHIGAN – The Trump administration on Tuesday unveiled updated safety guidelines for self-driving cars aimed at clearing barriers for automakers and tech companies wanting to get test vehicles on the road. The new voluntary guidelines announced by U.S. Transportation Secretary Elaine Chao update policies issued last fall by the Obama administration, which were also largely voluntary. Chao emphasized that the guidelines aren't meant to force automakers to use certain technology or meet stringent requirements. Instead, they're designed to clarify what vehicle developers and states should consider as more test cars reach public roads. "We want to make sure those who are involved understand how important safety is," Chao said during a visit to an autonomous vehicle testing facility at the University of Michigan.
BERKELEY, California – The fatal crash of a Tesla with no one apparently behind the wheel has cast a new light on the safety of semiautonomous vehicles and the nebulous U.S. regulatory terrain they navigate. Police in Harris County, Texas, said a Tesla Model S smashed into a tree on Saturday at high speed after failing to negotiate a bend and burst into flames, killing one occupant found in the front passenger seat and the owner in the back seat. Tesla Chief Executive Elon Musk tweeted on Monday that preliminary data downloaded by Tesla indicate the vehicle was not operating on Autopilot and was not part of the automaker's "Full Self-Driving" (FSD) system. Tesla's Autopilot and FSD, as well as the growing number of similar semi-autonomous driving functions in cars made by other automakers, present a challenge to officials responsible for motor vehicle and highway safety. The U.S. federal road safety authority, the National Highway Traffic Safety Administration (NHTSA), has yet to issue specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).