On 7 May, a Tesla Model S was involved in a fatal accident in Florida. At the time of the accident, the vehicle was driving itself, using its Autopilot system. The system failed to stop for a tractor-trailer turning across a divided highway, and the Tesla collided with the trailer. In a statement, Tesla Motors said this was the "first known fatality in just over 130 million miles [210 million km] where Autopilot was activated" and suggested that this ratio makes Autopilot safer than the average human driver. Earlier this year, Tesla CEO Elon Musk told reporters that the Autopilot system in the Model S was "probably better than a person right now."
It is the year 2023, and for the first time, a self-driving car navigating city streets strikes and kills a pedestrian. A lawsuit is sure to follow. But exactly what laws will apply? Today, the law is scrambling to keep up with the technology, which is moving forward at a breakneck pace, thanks to efforts by Apple, Audi, BMW, Ford, General Motors, Google, Honda, Mercedes, Nissan, Nvidia, Tesla, Toyota, and Volkswagen. Google's prototype self-driving cars, with test drivers always ready to take control, are already on city streets in Mountain View, Calif., and Austin, Texas. In the second half of 2015, Tesla Motors began allowing owners (not just test drivers) to switch on its Autopilot mode.