Safety warning
Do AI systems need to come with safety warnings?
Considering how powerful AI systems are, and the roles they increasingly play in high-stakes decisions about our lives, homes, and societies, they receive surprisingly little formal scrutiny. That's starting to change, thanks to the blossoming field of AI audits. When they work well, these audits let us reliably check how well a system performs and figure out how to mitigate any bias or harm. Famously, a 2018 audit of commercial facial-recognition systems by AI researchers Joy Buolamwini and Timnit Gebru found that the systems recognized darker-skinned people far less accurately than white people; for darker-skinned women, the error rate reached 34%. As AI researcher Abeba Birhane points out in a new essay in Nature, the audit "instigated a body of critical work that has exposed the bias, discrimination, and oppressive nature of facial-analysis algorithms."
- Information Technology (0.33)
- Government (0.33)
Uber allegedly ignored safety warnings before self-driving fatality
Just days after Uber announced plans to resume testing its self-driving taxis, new information reveals that a whistleblower had warned the company about the technology's safety failures before the incident in Arizona last March, in which a pedestrian was struck and killed by one of Uber's vehicles, leading to the suspension of all testing activity. According to The Information, Robbie Miller, a manager in the testing-operations group, sent a cautionary email to a number of Uber's executives and lawyers, warning that the vehicles were "routinely in accidents resulting in damage. This is usually the result of poor behavior of the operator or the AV technology." The email appears to have been prompted by an incident in Pittsburgh: just a few days before Miller sent the message, an Uber prototype swerved completely off the road and onto the sidewalk, where it continued to drive. According to Miller's email, the episode was "essentially ignored" for days, until Miller raised it with other managers.
Tesla driver in fatal Autopilot crash ignored safety warnings
Following its investigation of a fatal Tesla Model S crash, the NTSB concluded in a 500-page report that the driver, Joshua Brown, ignored repeated Autopilot warnings to keep his hands on the wheel. "For the vast majority of the trip, the Autopilot hands-on state remained at 'hands required, not detected,'" the report states. Specifically, Brown was supposed to have his hands on the wheel for a 37-minute portion of the trip, but did so for just 25 seconds. At the same time, the NTSB appears to have debunked the truck driver's claim that Brown was watching a Harry Potter movie at the time of the crash: "No Harry Potter movie file was found on the hard drive of the [Chromebook] device," it states.
- Transportation > Ground > Road (1.00)
- Government > Regional Government > North America Government > United States Government (0.80)