Tesla reportedly on Autopilot slams into police car parked on side of highway

USATODAY - Tech Top Stories

A Tesla in Autopilot mode crashed Saturday into a Florida Highway Patrol cruiser parked on the side of the road in Orlando. The crash happened while a federal investigation into Tesla's partially automated driving system is underway after nearly a dozen crashes involving emergency responder vehicles. The official police report from the Florida Highway Patrol states an Orlando man stopped his disabled vehicle in the travel lane of the highway. A 28-year-old trooper parked his patrol vehicle, a 2018 Dodge Charger, directly behind the disabled vehicle and activated the Dodge's emergency lights. The trooper then exited the vehicle to assist the driver. The 2019 Tesla apparently failed to stop and struck the left side of the patrol cruiser, then proceeded to strike the disabled vehicle.


Tesla's Autopilot faces US investigation after crashes with emergency vehicles

The Guardian

The US government has opened a formal investigation into Tesla's Autopilot partially automated driving system after a series of collisions with parked emergency vehicles. The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the US since the start of the 2014 model year. Of the crashes identified by the National Highway Traffic Safety Administration (NHTSA) as part of the investigation, 17 people were injured and one was killed. NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action on Monday in a posting on its website.


Consumer groups ask FTC to probe 'deceptive' Tesla Autopilot ads

Daily Mail - Science & tech

Two U.S. consumer advocacy groups urged the Federal Trade Commission on Wednesday to investigate what they called Tesla Inc's 'deceptive and misleading' use of the name Autopilot for its assisted-driving technology. The Center for Auto Safety and Consumer Watchdog, both non-profit groups, sent a letter to the FTC saying that consumers could be misled into thinking, based on Tesla's marketing and advertising, that Autopilot makes a Tesla vehicle self-driving. Autopilot, released in 2015, is an enhanced cruise-control system that partially automates steering and braking. Tesla has said the use of Autopilot results in 40 percent fewer crashes, a claim the U.S. National Highway Traffic Safety Administration repeated in a 2017 report on the first fatality, which occurred in May 2016. Earlier this month, however, the agency said regulators had not assessed the effectiveness of the technology.


Feds probe Tesla Autopilot in Newport Beach crash that killed 3

Los Angeles Times

Federal authorities are investigating whether a Tesla involved in a crash that left three people dead and three others injured last week in Newport Beach had its Autopilot system activated at the time of the wreck. A special crash investigation team was sent for the May 12 incident on Pacific Coast Highway, the National Highway Traffic Safety Administration said Wednesday. In that crash, Newport Beach police were called around 12:45 a.m. to the 3000 block of Pacific Coast Highway, where they found a 2022 Tesla Model S sedan had crashed into a curb and hit construction equipment. Three people were found dead in the Tesla; they were identified last week as Crystal McCallum, 34, of Texas; Andrew James Chaves, 32, of Arizona; and Wayne Walter Swanson Jr., 40, of Newport Beach, according to the Orange County Sheriff's Department. Three construction workers suffered non-life-threatening injuries, police said, adding that the department's Major Accident Investigation Team had been brought in.


Tesla didn't add eye tracking to its cars because it was 'ineffective'

Daily Mail - Science & tech

Tesla's autonomous driving technology is being called into question yet again. In the wake of a handful of fatal crashes, the electric car company reportedly considered adding eye-tracking technology to its self-driving automobiles, as part of an effort to make sure drivers used Autopilot safely and to reduce accidents. But several executives, including CEO Elon Musk, ultimately decided against the technology, due to its costly nature, possible ineffectiveness and potential to annoy drivers, the Wall Street Journal reported, citing sources close to the situation. Tesla reportedly considered adding eye tracking sensors to its vehicles to prevent crashes during Autopilot mode. That's despite developers of Tesla's Autopilot system saying there weren't enough safeguards to make sure drivers remained alert while operating the technology.