Tesla's Autopilot is under federal investigation following crashes

Engadget

The US National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla's Autopilot system. The probe follows 11 crashes into parked first responder vehicles since 2018, which resulted in 17 injuries and one death. "Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones," the NHTSA's Office of Defects Investigation (ODI) wrote in a document detailing the investigation. "The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes." The probe covers around 765,000 Tesla electric vehicles, as Bloomberg notes.


Tesla and Honda report over 350 crashes involving advanced driving assistance systems

FOX News

Tesla Inc reported 273 vehicle crashes involving advanced driver assistance systems like Autopilot since July, while Honda Motor identified 90, according to data from U.S. auto safety regulators released on Wednesday. The companies made the disclosures to the National Highway Traffic Safety Administration (NHTSA) after the regulator issued an order in June 2021 requiring automakers and tech companies to immediately report all crashes involving advanced driver assistance systems (ADAS) and vehicles equipped with automated driving systems being tested on public roads. Of the 392 total ADAS-related crashes reported by a dozen automakers, six resulted in deaths and five in serious injuries.


Feds probe Tesla Autopilot in Newport Beach crash that killed 3

Los Angeles Times

Federal authorities are investigating whether a Tesla involved in a crash that left three people dead and three others injured last week in Newport Beach had its Autopilot system activated at the time of the wreck. A special crash investigation team was sent for the May 12 incident on Pacific Coast Highway, the National Highway Traffic Safety Administration said Wednesday. In that crash, Newport Beach police were called around 12:45 a.m. to the 3000 block of Pacific Coast Highway, where they found a 2022 Tesla Model S sedan had crashed into a curb and hit construction equipment. Three people were found dead in the Tesla; they were identified last week as Crystal McCallum, 34, of Texas; Andrew James Chaves, 32, of Arizona; and Wayne Walter Swanson Jr., 40, of Newport Beach, according to the Orange County Sheriff's Department. Three construction workers suffered non-life-threatening injuries, police said, adding that the department's Major Accident Investigation Team had been brought in.


Tesla drives on Autopilot through a regulatory gray zone

The Japan Times

BERKELEY, California – The fatal crash of a Tesla with no one apparently behind the wheel has cast new light on the safety of semi-autonomous vehicles and the nebulous U.S. regulatory terrain they navigate. Police in Harris County, Texas, said a Tesla Model S smashed into a tree on Saturday at high speed after failing to negotiate a bend and burst into flames, killing one occupant found in the front passenger seat and the owner in the back seat. Tesla Chief Executive Elon Musk tweeted on Monday that preliminary data downloaded by Tesla indicate the vehicle was not operating on Autopilot and was not using the automaker's "Full Self-Driving" (FSD) system. Tesla's Autopilot and FSD, along with the growing number of similar semi-autonomous driving functions in cars made by other automakers, present a challenge to officials responsible for motor vehicle and highway safety. The U.S. federal road safety authority, the National Highway Traffic Safety Administration (NHTSA), has yet to issue specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).


NHTSA asks Tesla to turn over crash data on vehicles with Autopilot

#artificialintelligence

The agency's reason is straightforward: it wants to determine whether the Autopilot system has a safety defect that causes Tesla vehicles to hit emergency vehicles. In its letter to Tesla, the agency said it will "assess the technologies and methods used to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation. The investigation will additionally assess the [object and event detection and response feature] by vehicles when engaged in Autopilot mode, and [operational design domain] in which the Autopilot mode is functional. The investigation will also include examination of the contributing circumstances for the confirmed crashes ... and other similar crashes." While the NHTSA didn't threaten Tesla with the possibility of a recall, it does have that power.