Advent of self-driving autos spurs debate on accident liability in Japan

The Japan Times

More and more self-driving vehicles are making their debut, raising the question of who should be held accountable if, or perhaps when, they cause accidents. Following American and German automakers Tesla Motors Inc. and Mercedes-Benz, Nissan Motor Co. released a version of its Serena minivan with self-driving functions in August, at a time when the government and automakers in Japan are aiming to have autonomous vehicles in regular use by 2020. In Japan, autonomous vehicles are currently sold with the understanding that drivers remain responsible for maintaining control of their vehicles. Drivers are required to stay behind the steering wheel even when self-driving functions are in operation, and they are held accountable for accidents. The autonomous Serena model is designed for single-lane expressway driving.


California axes self-driving car rule limiting liability for crashes

Engadget

California has been happy to tweak the rules to get more self-driving cars on the road, but it still has its limits. The state's DMV has eliminated a planned rule (suggested by GM) that would have let companies avoid liability for an autonomous vehicle crash if the machine hadn't been maintained to manufacturer specs. In other words, they could have been let off the hook if your car's sensors were muddy, even if an accident was really due to bad code.


California may limit liability of self-driving carmakers

Daily Mail - Science & tech

California regulators are embracing a General Motors recommendation that would help makers of self-driving cars avoid paying for accidents and other trouble, raising concerns that the proposal will put an unfair burden on vehicle owners.


Lawyers, Not Ethicists, Will Solve the Robocar 'Trolley Problem'

WIRED

People seem more than a bit freaked out by the trolley problem right now. The '60s-era thought experiment, occasionally pondered with a bong in hand, requires that you imagine a runaway trolley barreling down the tracks toward five people. You stand at a railway switch with the power to divert the trolley to another track, where just one person stands. This ethical exercise takes on new meaning at the dawn of the autonomous age. Given a similar conundrum, does a robocar risk the lives of five pedestrians, or its passengers?


Artificial Intelligence and Robotics: Who's Liable for the Decisions Made?

#artificialintelligence

Reuters news agency reported on 16 February 2017 that "European lawmakers called...for EU-wide legislation to regulate the rise of robots, including an ethical framework for their development and deployment and the establishment of liability for the actions of robots including self-driving cars." The question of determining 'liability' for decisions made by robots or artificial intelligence is an interesting and important subject as the implementation of this technology increases in industry and starts to more directly impact our day-to-day lives. Indeed, as the application of artificial intelligence and machine learning technology grows, we are likely to witness how it changes the nature of work, businesses, industries and society. And yet, although it has the power to disrupt and drive greater efficiencies, AI has its obstacles: the issue of 'who is liable when something goes awry' being one of them. Like many players in industry, Members of the European Parliament (MEPs) are trying to tackle this liability question.