Push for AI innovation can create dangerous products


This past June, the U.S. National Highway Traffic Safety Administration announced a probe into Tesla's Autopilot software. Data gathered from 16 crashes raised concerns that Tesla's AI may be programmed to disengage when a crash is imminent, so that the car's driver, not the manufacturer, would be legally liable at the moment of impact. It echoes the revelation that the Uber self-driving car that struck and killed a woman had detected her six seconds before impact, but its AI was not programmed to recognize pedestrians outside of designated crosswalks.
