Tesla, under pressure to show it can generate profits on its main business of making electric cars, on Monday trumpeted a custom-designed computer chip to let its vehicles drive themselves. Even with the new chip -- which comes with all new vehicles and can be installed in older ones -- Teslas still aren't yet fully capable of driving without human intervention. They now have "all hardware necessary," said Elon Musk, Tesla's chief executive officer. "All you have to do is improve the software." The software will be updated over the air to allow full self-driving by the end of the year, he said.
Carmakers and tech companies are in a race to put autonomous vehicles on the road, and it's time for regulators to tap the brakes. This month the National Highway Traffic Safety Administration revealed that it is investigating two crashes involving Tesla vehicles allegedly operating on autopilot. Tesla's autopilot feature is a semi-autonomous system that uses cameras, radar and sensors to steer the car, change lanes, adjust speed and even find a parking space and parallel park. It's not supposed to turn a Tesla sedan into a self-driving car, but there's ample evidence on YouTube of people driving with their hands off the steering wheel, playing games and even climbing into the back seat while their car is hurtling down a freeway.
Elon Musk is no stranger to bold predictions, and on Tuesday, he lobbed another one at self-driving tech doubters: The Tesla CEO said the electric carmaker's full self-driving feature will be completed by the end of 2019. And by the end of 2020, he added, it will be so capable, you'll be able to snooze in the driver's seat while it takes you from your parking lot to wherever you're going. "I think we will be 'feature complete' on full self-driving this year, meaning the car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention this year," Musk said during a podcast interview with the money management firm ARK Invest, which is a Tesla investor. "I am certain of that. That is not a question mark."
But both incidents have a troubling link, autonomous-vehicle specialists say: a human was at the wheel and could have taken control. "This is what I've called the mushy middle of automation," said Bryant Walker Smith, a University of South Carolina assistant professor of law and specialist on autonomous cars, referring to vehicles with some automation but still a driver at the wheel. "There will certainly be more incidents," he said. "It's dangerous when people feel safer than they actually are." Auto makers are gradually rolling out partially automated systems that pass control back and forth between vehicle and driver.
Advocates of driverless cars worry that the fatal crash of a Tesla Motors Inc. vehicle in self-driving mode will provoke additional regulatory oversight and slow deployment on U.S. roads of the rapidly advancing technology. The National Highway Traffic Safety Administration aims this month to release a framework for regulating self-driving cars, which could include requiring auto makers to win approval for their technologies before releasing them. That sort of approval process wasn't applied to Tesla's Autopilot system to enable hands-free driving on highways, which the electric-car maker made available on Tesla vehicles via a software update in October. Regulators said Thursday that an Ohio man was using Autopilot when his Tesla Model S crashed into an 18-wheel truck in Florida on May 7, killing him. "There will be repercussions" in regulations, said Dean Pomerleau, a Carnegie Mellon University professor who has worked on driverless cars for 25 years and led several NHTSA research programs.