I think, therefore I code
To most of us, a 3-D-printed turtle just looks like a turtle: four legs, patterned skin, and a shell. But if you show it to a particular computer in a certain way, that object's not a turtle -- it's a gun. Objects or images that can fool artificial intelligence like this are called adversarial examples. Jessy Lin, a senior double-majoring in computer science and electrical engineering and in philosophy, believes that they're a serious problem, with the potential to trip up AI systems involved in driverless cars, facial recognition, and other applications. She and several other MIT students have formed a research group called LabSix, which creates adversarial examples in real-world settings -- such as the turtle identified as a rifle -- to show that they are legitimate concerns.
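To see how small input changes can flip a classifier's output, here is a minimal, purely illustrative sketch of the fast gradient sign method (FGSM), a standard way of constructing adversarial examples. The toy linear model, the class labels, and all names here are hypothetical -- this is not LabSix's actual method or model, just a demonstration of the underlying idea under simplified assumptions.

```python
import numpy as np

# Toy linear "classifier": score = w @ x; positive means class "turtle",
# negative means class "rifle". Purely illustrative, not a real vision model.
rng = np.random.default_rng(0)
w = rng.normal(size=100)
x = w / np.linalg.norm(w)  # an input the model confidently calls "turtle"

def predict(v):
    return "turtle" if w @ v > 0 else "rifle"

# FGSM: nudge every input dimension by epsilon in the direction that most
# increases the loss. For a linear score w @ x, that direction is sign(w).
epsilon = 0.2
x_adv = x - epsilon * np.sign(w)  # small per-pixel step against the "turtle" score

print(predict(x))      # the clean input is classified as "turtle"
print(predict(x_adv))  # the perturbed input flips to "rifle"
```

The point of the sketch is that the per-dimension change is bounded by a small epsilon, yet in high dimensions those many tiny nudges add up to a large change in the model's score -- which is why a patterned shell can tip an image classifier from "turtle" to "rifle".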
Aug-20-2019, 16:19:51 GMT