
AI robots learning racism, sexism and other prejudices from humans, study finds

The Independent - Tech

Artificially intelligent robots and devices are being taught to be racist, sexist and otherwise prejudiced by learning from humans, according to new research. A massive study of millions of words online looked at how closely associated different terms were with each other in the text – the same way that automatic translators use "machine learning" to establish what language means.

The researchers found that male names were more closely associated with career-related terms than female names, which were more closely associated with words related to the family. This link was stronger than the uncontroversial finding that musical instruments and flowers are associated with pleasant words, while weapons and insects are associated with unpleasant ones. Female names were also strongly associated with artistic terms, while male names were found to be closer to maths and science ones.
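The kind of measurement the researchers describe can be sketched in a few lines of code: each word is represented as a numeric vector learned from text, and "closeness" is the cosine similarity between vectors. The tiny example below is only an illustration of the idea, not the study's actual method or data – the word vectors here are invented two-dimensional toy values, and real systems use vectors with hundreds of dimensions trained on billions of words.

```python
import math

def cosine(u, v):
    # Cosine similarity: how closely two word vectors point in the same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(word, attrs_a, attrs_b, vecs):
    # Mean similarity of `word` to attribute set A minus its mean
    # similarity to attribute set B. Positive => closer to A.
    sim_a = sum(cosine(vecs[word], vecs[a]) for a in attrs_a) / len(attrs_a)
    sim_b = sum(cosine(vecs[word], vecs[b]) for b in attrs_b) / len(attrs_b)
    return sim_a - sim_b

# Toy 2-d "embeddings", invented purely for illustration.
vecs = {
    "john":   [0.9, 0.1],
    "amy":    [0.1, 0.9],
    "career": [0.8, 0.2],
    "office": [0.7, 0.3],
    "family": [0.2, 0.8],
    "home":   [0.3, 0.7],
}

career_terms = ["career", "office"]
family_terms = ["family", "home"]

for name in ("john", "amy"):
    score = association(name, career_terms, family_terms, vecs)
    print(name, round(score, 3))
```

With these made-up vectors, "john" scores positive (closer to career words) and "amy" scores negative (closer to family words). A bias like the one the study reports shows up when such differences are systematic across many real names and attribute words in embeddings trained on human-written text.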