If AI Is Like Fire, Let's Not Get Left With Its Ashes
Nor can AI explain how it reaches its conclusions. Like a lazy middle school student, even when the machine gets the right answer, it rarely shows its work, making it harder for humans to trust its methods. Worse still, this opacity can hide the instances when AI systems optimize for a goal that is not quite what their human creators had in mind. For example, one system designed to detect pneumonia in chest X-rays discovered that X-rays from one hospital were more likely than others to exhibit pneumonia because that hospital usually had sicker patients. The machine learned to look for the X-ray's hospital of origin rather than at the X-ray itself.
Jul-29-2022, 05:15:05 GMT