

Federal investigators step up probe into Tesla Autopilot crashes

Washington Post - Technology News

The National Highway Traffic Safety Administration said this week it is upgrading a preliminary evaluation of the issue into an engineering analysis, a step that will let it better explore Autopilot's potential role in the crashes and that is a potential precursor to a recall. NHTSA is seeking to determine whether Autopilot undermines "the effectiveness of the driver's supervision," according to documents describing its analysis.


Man behind wheel in Tesla Autopilot crash that killed two charged with vehicular manslaughter in first case of its kind

The Independent - Tech

A California motorist has become the first person to be charged over a fatal crash involving Tesla's Autopilot system. Kevin George Aziz Riad, 27, faces two counts of vehicular manslaughter after being behind the wheel of a Tesla when it ran a red light, crashing into another car and killing two people. It is the first time a motorist has been charged with a felony for an incident involving the electric car maker's partially automated driving system, according to the Associated Press. Los Angeles County prosecutors filed the charges in October, but details of the case have only just emerged. Mr Riad, who works as a limousine service driver, is out on bail while the case is pending.


California driver charged with felony manslaughter in Tesla Autopilot crash

FOX News

California prosecutors have filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot who ran a red light, slammed into another car and killed two people in 2019. All Tesla models, including the Model S, now come standard with Autopilot. The defendant appears to be the first person to be charged with a felony in the United States for a fatal crash involving a motorist who was using a partially automated driving system.


Tesla Autopilot Crash: Why We Should Worry About a Single Death

IEEE Spectrum Robotics

This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE. Only recently, Tesla Motors revealed that one of its self-driving cars, operating in Autopilot mode, had crashed in May and killed its driver. How much responsibility Tesla has for the death is still under debate, but many experts are already reminding us of the huge number of lives that could be saved by autonomous cars. Does that mean we shouldn't worry much about the single death--that we should look away for the sake of the greater good? Is it unethical to focus on negative things that could slow down autonomous-driving technology, which could mean letting thousands of people die in traffic accidents?


The Tesla Autopilot crash is 'a blip on the radar' -- self-driving technology is here to stay

#artificialintelligence

Tesla is one of those companies that people love to love. The electric-car maker's story is one of innovation and genius, with a dose of erudite bravado coming from its intrepid CEO, Elon Musk. Musk has championed Tesla's technologies, including the driver-assist feature called Autopilot. Much has been said about Autopilot's virtues -- its ability to keep the car in one lane, avoid collisions and use cameras and radar to detect its surroundings -- but the technology is not perfect. Japanese automakers, by comparison, are unwilling to follow Tesla's aggressive strategy of getting such features into drivers' hands quickly.


Tesla autopilot crash: Fatal collision was tragic but self-driving technology should still continue, say experts

The Independent - Tech

Experts have come out in defence of automated driving technology after a driver was killed while using his Tesla's autopilot feature. Specialists in the fields of artificial intelligence, engineering, and transport have said that while the death was tragic, it should not prevent the software from being developed. Joshua Brown, 40, died when his Tesla Model S went underneath the trailer of a lorry that had turned left in front of him on a Florida road in May, prompting an urgent investigation by Tesla itself and the US authorities. In a statement on its blog, Tesla explained that the technology is still under development and that, as an assist feature, drivers "need to maintain control and responsibility for [their] vehicle" while using it. The company went on to say that the autopilot mode does still result "in a statistically significant improvement in safety."