The Securities and Exchange Commission is investigating whether Tesla Motors Inc. breached securities laws by failing to disclose a fatal crash in May involving an electric car that was driving itself, a person familiar with the matter said, heightening scrutiny of how the Silicon Valley company handled the information. The May 7 accident killed the driver, Joshua Brown, a 40-year-old Tesla owner who collided with an 18-wheel semi-truck that pulled in front of him on a Florida highway. Tesla alerted the National Highway Traffic Safety Administration, the U.S. car-safety regulator, to the crash and investigated to determine whether the car was using the company's Autopilot system, which lets cars drive themselves under certain circumstances. But Tesla didn't disclose the crash to investors in a securities filing. The car-safety agency opened an investigation into the Autopilot technology.
SAN FRANCISCO/WASHINGTON - Tesla Motors Inc. Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system Autopilot with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May. Musk said the update, which will be available within a week or two through an "over-the-air" software update, would rely foremost on radar to give Tesla's electric luxury cars a better sense of what is around them and when to brake. New restrictions in Autopilot 8.0 are a nod to widespread concerns that the system lulled users into a false sense of security through its "hands-off" driving capability. The updated system will now temporarily prevent drivers from using it if they do not respond to audible warnings to take back control of the car. "We're making much more effective use of radar," Musk told journalists on a phone call.
This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE. Only recently, Tesla Motors revealed that one of its self-driving cars, operating in Autopilot mode, had crashed in May and killed its driver. How much responsibility Tesla has for the death is still under debate, but many experts are already reminding us of the huge number of lives that could be saved by autonomous cars. Does that mean we shouldn't worry much about the single death, and that we should look away for the sake of the greater good?
The U.S. government announced Thursday the first fatality in a wreck involving a car in self-driving mode, and said it is investigating the design and performance of the system aboard the Tesla Model S sedan. The Ohio man who died while using the "Autopilot" feature on his Tesla electric car was watching a Harry Potter movie when he was fatally injured in a wreck while the car was in self-driving mode, according to a witness. Joshua Brown, 40, of Canton, Ohio, died from injuries he sustained when a tractor-trailer made a left turn in front of his 2015 Tesla on a highway near Williston, Fla., in May.
On 7 May, a Tesla Model S was involved in a fatal accident in Florida. At the time of the accident, the vehicle was driving itself, using its Autopilot system. The system didn't stop for a tractor-trailer attempting to turn across a divided highway, and the Tesla collided with the trailer. In a statement, Tesla Motors said this is the "first known fatality in just over 130 million miles [210 million km] where Autopilot was activated" and suggested that this ratio makes Autopilot safer than the average vehicle. Earlier this year, Tesla CEO Elon Musk told reporters that the Autopilot system in the Model S was "probably better than a person right now."