Results


NTSB Says Tesla Bears Some Blame for Deadly Autopilot Crash

WIRED

The National Highway Traffic Safety Administration, the government's vehicle safety watchdog, concluded in January that because Joshua Brown, the driver, was supposed to be monitoring the car's driving, human error--not Tesla tech--caused the crash. Tuesday morning, the National Transportation Safety Board, an independent federal body that investigates plane, train, and vehicle crashes, concluded its own investigation into the incident. Systems like Tesla's Autopilot, General Motors' Super Cruise, and Audi's Traffic Jam Pilot already make driving safer, according to preliminary research. NHTSA's investigation of the Brown crash found that Tesla cars with self-driving capabilities crashed 40 percent less frequently than those without.


What caused fatal Tesla crash?

FOX News

An investigation by the National Transportation Safety Board (NTSB) has determined that "operational limitations" of Tesla's Autopilot system played a "major role" in a fatal crash last May, but that the driver was also at fault for not paying adequate attention to the road. At the time, Autopilot was capable of steering the car within its lane and autonomously braking for vehicles in the road ahead. The driver's last action was setting the cruise control at 74 mph on a road with a 65 mph limit, two minutes before the collision. The NTSB report was issued on the same day that U.S. Transportation Secretary Elaine Chao revealed the federal government's latest voluntary guidelines for autonomous technology, which include a section on driver monitoring and on transferring control from vehicle to operator when a system determines that human interaction is required.


NTSB faults design limits of Tesla's Autopilot in fatal Florida crash

Los Angeles Times

"The Tesla's automation did not detect, nor was it required [to], nor was it designed to detect the crossing vehicle," Robert L. Sumwalt, chairman of the National Transportation Safety Board, said at the start of a hearing reviewing the Florida crash. Tests by the National Highway Traffic Safety Administration determined that Tesla and other vehicles with semiautonomous driving technology had great difficulty sensing cross traffic. The NTSB staff also said that Tesla's reliance on sensing a driver's hands on the wheel was not an effective way of monitoring whether the driver was paying attention. The NTSB staff recommended the use of a more effective technology to determine whether a driver is paying attention, such as a camera tracking the driver's eyes.


NTSB: Tesla Autopilot 'limitations played a major role' in deadly crash

USA Today

The National Transportation Safety Board says the car company's technology bears part of the blame. National Transportation Safety Board chair Robert Sumwalt said the Tesla vehicle's "operational limitations played a major role in this collision." His statement came at the beginning of a hearing where the NTSB is expected to rule on whether the Autopilot system on Ohio resident Joshua Brown's Tesla Model S should be blamed for the Florida crash that killed him. Brown didn't keep his hands on the wheel, despite repeated vehicle warnings, the board said.


'We've got to start calling Elon Musk on his s***': Uber CEO urged to tackle rival over self-driving car claims

The Independent

Anthony Levandowski, the former engineer at the centre of Uber's self-driving car legal troubles, urged then-Chief Executive Officer Travis Kalanick to criticise Tesla's Elon Musk over several of his claims about autonomous vehicles. "We've got to start calling Elon on his shit," Levandowski wrote in the texts, which were turned over by lawyers for Kalanick. Weeks before the text messages, Tesla unveiled an update to its Autopilot driver-assistance system that uses radar and a GPS database to guide its vehicles. In another text sent days earlier, Levandowski sent Kalanick a link to a video on Sina.com showing a fatal accident that the Chinese news outlet said involved Autopilot.


Teaching Drones How To Crash Safely

MIT Technology Review

The system stores a database of potential ditch sites for safe emergency landings, and is able to choose the ideal site based on range, size, type of terrain, reliability, and time-of-day constraints. It's a much more advanced system than what is currently used in most commercial UAVs, which require a designated "home" point, to which the vehicle will attempt to return in the case of a hardware malfunction or drained battery. Current models are unable to safely ditch if, for example, the remaining battery charge is unable to return the drone to its home point, or if that home point is out of date. Once these remaining technological challenges are solved, Roy believes that Safe2Ditch, or similar systems, could become an FAA-mandated safety standard in UAV manufacturing.
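
The article doesn't describe Safe2Ditch's internals, but the selection step it sketches — filter candidate sites by reachability, then rank the survivors on the listed criteria — can be illustrated in a few lines. A minimal sketch in Python, with hypothetical field names and weights that are not taken from the actual system:

```python
from dataclasses import dataclass

# Hypothetical ditch-site record; fields mirror the criteria the
# article lists (range, size, terrain, reliability, time of day).
@dataclass
class DitchSite:
    name: str
    distance_km: float   # how far the drone must fly to reach it
    size_m: float        # usable landing-area diameter
    terrain_risk: float  # 0 (open field) .. 1 (crowded or hazardous)
    reliability: float   # 0 .. 1, confidence the site is still usable
    open_now: bool       # time-of-day constraint (e.g. an empty schoolyard)

def reachable(site: DitchSite, max_range_km: float) -> bool:
    """A site is only a candidate if remaining battery can reach it."""
    return site.open_now and site.distance_km <= max_range_km

def score(site: DitchSite) -> float:
    """Toy weighted score: prefer close, large, safe, reliable sites."""
    return (site.reliability * site.size_m
            - 50.0 * site.terrain_risk
            - 10.0 * site.distance_km)

def choose_site(sites: list[DitchSite], max_range_km: float) -> DitchSite | None:
    candidates = [s for s in sites if reachable(s, max_range_km)]
    return max(candidates, key=score, default=None)

sites = [
    DitchSite("ballfield", 1.2, 60, 0.1, 0.9, True),
    DitchSite("parking lot", 0.4, 30, 0.5, 0.8, True),
    DitchSite("far meadow", 5.0, 80, 0.0, 0.7, True),
]
print(choose_site(sites, max_range_km=2.0))  # -> ballfield
```

Note how this differs from the return-to-home behavior the article criticizes: the far meadow is the nicest site but is filtered out as unreachable, so the drone ditches at the best site it can actually make.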


Wanna Help Self-Driving Cars? Turn on Your Phone's Camera

WIRED

The problem is, those maps need a level of detail that goes way beyond what you'll find on any phone screen or atlas: Not just street names and which roads are one-way, but the exact location of every stop sign, traffic light, lane line, and curb, down to the centimeter. The camera's feed gets uploaded to Lvl5's central hub (via Wi-Fi by default), where the startup's software picks out key features--traffic lights, lane lines, stop signs--and builds the map. But now the company is running pilot projects with a few automakers (whom Lvl5 CEO Andrew Kouri declined to name), and expects that once it has official partnerships in place, it can use those companies' vehicles' built-in cameras to collect way more data, without worrying about onboarding and paying individual drivers. New owners get an onscreen message: "We need to collect short video clips using the car's external cameras to learn how to recognize things like lane lines, street signs, and traffic light positions."
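
The excerpt outlines a crowdsourced pipeline: clips flow to a central hub, a detector extracts landmarks, and repeated sightings from many cars are merged into one map. A minimal sketch of that merging step, assuming a hypothetical detector output and naive grid-snap averaging rather than anything Lvl5 actually does:

```python
from collections import defaultdict
from statistics import mean

# One landmark observation extracted from a single video frame.
# In a real pipeline a vision model would emit these; the tuples
# here are hypothetical (kind, estimated latitude, estimated longitude).
observations = [
    ("stop_sign", 37.77492, -122.41942),
    ("stop_sign", 37.77493, -122.41941),  # same sign, seen by another car
    ("traffic_light", 37.77510, -122.41900),
]

def build_map(obs, cell=1e-4):
    """Merge observations of the same landmark by snapping positions
    to a coarse grid, then averaging within each grid cell."""
    cells = defaultdict(list)
    for kind, lat, lon in obs:
        key = (kind, round(lat / cell), round(lon / cell))
        cells[key].append((lat, lon))
    return [
        (kind, mean(p[0] for p in pts), mean(p[1] for p in pts))
        for (kind, _, _), pts in cells.items()
    ]

for landmark in build_map(observations):
    print(landmark)
```

The averaging is the point: a single phone camera's position estimate is noisy, but many overlapping drives shrink the error toward the centimeter-level precision the maps require (a production system would use proper clustering and bundle adjustment, not a fixed grid).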


Five hurt when Tesla on autopilot suddenly accelerated

Daily Mail

A driver and his four passengers were hurt when his Tesla suddenly accelerated while on autopilot, careened off the road and flipped into a marsh. He told police: 'When he engaged the auto pilot (sic) feature, that the vehicle suddenly accelerated causing the car to leave the roadway and overturn.' Tesla said in a statement: 'We have not yet established whether the vehicle's Autopilot feature was activated, and have no reason to believe that Autopilot, which has been found by NHTSA to reduce accident rates by 40 percent, worked other than as designed. 'Every time a driver engages Autopilot, they are reminded of their responsibility to remain engaged and to be prepared to take immediate action at all times, and drivers must acknowledge their responsibility to do so before Autopilot is enabled.'


Audi's A8 is a self-driving car you can buy in 2017

Mashable

Audi has an upcoming luxury sedan, the A8, which will come with Level 3 autonomy, meaning the car can drive itself... up to a certain speed. The new A8 -- which has the top spot in Audi's lineup of luxury sedans -- is so full of technology that merely listing it all is a daunting task. Tesla drivers should never fully remove their hands from the wheel, and Tesla's Autopilot system works only in certain situations, such as driving on a freeway. Audi says it will introduce the various traffic jam pilot features gradually, so even buyers in self-driving-friendly states like Nevada may not be able to fully enjoy all that self-driving goodness right away.


Tesla hires AI expert to help lead team in charge of self-driving software

#artificialintelligence

Tesla Inc. has hired Andrej Karpathy, a Stanford University computer scientist specializing in artificial intelligence and deep learning, to lead its efforts around driverless cars. Karpathy is "one of the world's leading experts in computer vision and deep learning," a Tesla spokesperson said. Apple CEO Tim Cook recently confirmed the company's efforts around what he called "autonomous systems," and called driverless cars "the mother of all AI projects." The hire comes as Tesla's lead of Autopilot software, Chris Lattner, earlier this week announced he was leaving the company after six months on the job.