SEC Investigating Tesla for Possible Securities-Law Breach

WSJ.com: WSJD - Technology

The Securities and Exchange Commission is investigating whether Tesla Motors Inc. breached securities laws by failing to disclose a fatal crash in May involving an electric car that was driving itself, a person familiar with the matter said, heightening scrutiny of how the Silicon Valley company handled the information.

The May 7 accident killed the driver, Joshua Brown, a 40-year-old Tesla owner whose car collided with an 18-wheel semi-truck that pulled in front of him on a Florida highway. Tesla alerted the National Highway Traffic Safety Administration, the U.S. car-safety regulator, to the crash and investigated to determine whether the car was using the company's Autopilot system, which lets cars drive themselves under certain circumstances. But Tesla didn't disclose the crash to investors in a securities filing. The car-safety agency opened an investigation into the Autopilot technology.


Elon Musk: Tesla Autopilot Update Could Have Prevented Fatal Crash

Huffington Post - Tech news and opinion

SAN FRANCISCO/WASHINGTON - Tesla Motors Inc. Chief Executive Elon Musk said on Sunday the automaker was updating its semi-autonomous driving system, Autopilot, with new limits on hands-off driving and other improvements that likely would have prevented a fatality in May.

Musk said the update, which will be available within a week or two through an "over-the-air" software update, would rely foremost on radar to give Tesla's electric luxury cars a better sense of what is around them and when to brake. The new restrictions in Autopilot 8.0 are a nod to widespread concerns that the system lulled users into a false sense of security through its "hands-off" driving capability. The updated system will now temporarily prevent drivers from using Autopilot if they do not respond to audible warnings to take back control of the car. "We're making much more effective use of radar," Musk told journalists on a phone call.
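The lockout behavior described here amounts to a small state machine: ignored alerts accumulate until the system disengages and refuses to re-engage. The sketch below illustrates that logic in Python; the three-strike threshold, the park-to-reset rule, and all names are illustrative assumptions, since the article gives no implementation details.

```python
# Hypothetical sketch of the lockout behavior described above: after a set
# number of ignored take-back-control alerts, the driver-assist system
# disengages and will not re-engage until the car is parked. All names and
# thresholds are illustrative assumptions, not Tesla's implementation.

MAX_IGNORED_WARNINGS = 3  # assumed "three strikes" threshold


class DriverAssist:
    def __init__(self):
        self.engaged = False
        self.locked_out = False
        self.ignored_warnings = 0

    def request_engage(self, vehicle_parked: bool) -> bool:
        """Driver asks to turn the system on; parking clears a lockout."""
        if self.locked_out and vehicle_parked:
            self.locked_out = False
            self.ignored_warnings = 0
        if not self.locked_out:
            self.engaged = True
        return self.engaged

    def on_warning_ignored(self):
        """An audible take-back-control alert went unanswered."""
        self.ignored_warnings += 1
        if self.ignored_warnings >= MAX_IGNORED_WARNINGS:
            self.engaged = False
            self.locked_out = True  # temporarily unavailable, per the article

    def on_driver_response(self):
        """Driver acknowledged an alert; reset the counter."""
        self.ignored_warnings = 0
```

A production system would of course drive these transitions from vehicle-bus signals rather than method calls, but the "warn, disengage, lock out until parked" flow is the behavior the update reportedly enforces.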


Google's driverless cars may use human flypaper in road accidents

ZDNet

Google has filed a patent for a "sticky" adhesive coating that would keep pedestrians attached to a car, rather than thrown from it, in the event of an accident. The tech giant's autonomous car project, in development for a number of years, has been touted as a means of reducing human error and fatal accidents on the road. Driverless cars and vehicles with smart technology use sensors, networking, and intelligent mapping to keep a vehicle in the right lane, avoid obstacles, and park correctly -- and it is hoped that one day little human input will be needed to get from A to B.

Despite these technological advancements, accidents still happen, as Google's autonomous car accident statistics show. Considering the millions of miles the prototypes have covered, rates are still low -- with only one collision being Google's fault -- but could a less technological solution reduce the rate of serious injuries further? According to a patent filing submitted by the Mountain View, CA-based firm and granted by the US Patent and Trademark Office (USPTO), a sticky adhesive layer coating the front of a car could protect pedestrians if they are hit by the moving vehicle.


The Tiny Startup Trying to Eliminate Self-Driving Car Crashes

#artificialintelligence

Everyone's been there: driving in the pitch dark, trying to decipher signs, handle sharp turns, and weave through multiple lanes of whizzing traffic. It's a difficult situation even for an experienced human driver--so how can a car that's driving itself pull it off? An autonomous vehicle must know its precise location, the location of other cars around it, the route to its destination, and any possible obstacles in its path. To deliver that information, automakers are turning to technology developed by San Francisco-based startup Civil Maps. The 30-employee company says it can give cars map data that's more accurate and more frequently updated than what competing self-driving systems rely on.
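To make the four pieces of state listed above concrete, here is a minimal Python sketch; the field names and types are hypothetical illustrations, not Civil Maps' actual data model.

```python
# Minimal sketch of the state an autonomous vehicle must track, per the
# paragraph above: its own position, nearby cars, the planned route, and
# obstacles. Field names and types are assumptions chosen only to make
# the list of requirements concrete.
from dataclasses import dataclass, field
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude) in degrees


@dataclass
class VehicleState:
    position: LatLon                 # precise location of the ego vehicle
    heading_deg: float               # direction of travel
    nearby_vehicles: List[LatLon] = field(default_factory=list)
    route: List[LatLon] = field(default_factory=list)      # waypoints to destination
    obstacles: List[LatLon] = field(default_factory=list)  # detected hazards


# Example: ego vehicle in San Francisco with one nearby car and no obstacles.
state = VehicleState(position=(37.7749, -122.4194), heading_deg=90.0,
                     nearby_vehicles=[(37.7750, -122.4190)])
```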


Tesla Autopilot Crash: Why We Should Worry About a Single Death

IEEE Spectrum Robotics

This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE. Only recently, Tesla Motors revealed that one of its self-driving cars, operating in Autopilot mode, had crashed in May and killed its driver. How much responsibility Tesla has for the death is still under debate, but many experts are already reminding us of the huge number of lives that could be saved by autonomous cars. Does that mean we shouldn't worry much about the single death--that we should look away for the sake of the greater good?