Algorithmic bias--when seemingly innocuous programming takes on the prejudices either of its creators or of the data it is fed--causes everything from warped Google searches to qualified women being barred from medical school. Tay's embrace of humanity's worst attributes is one example. Recently, a Carnegie Mellon research team unearthed algorithmic bias in online ads: when the researchers simulated people searching for jobs online, Google showed ads for high-income jobs to men nearly six times as often as to equivalent women.
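The team's method was, in essence, a controlled experiment: spin up otherwise-identical simulated job-seekers that differ only in declared gender, then compare how often each group is shown a given ad. A minimal sketch of that comparison (the function name and all counts here are illustrative assumptions, not the study's actual code or data):

```python
from math import sqrt

def ad_rate_ratio(shown_a, trials_a, shown_b, trials_b):
    """Compare how often two simulated groups were shown an ad.

    Returns (rate ratio, two-proportion z-score).
    """
    p_a, p_b = shown_a / trials_a, shown_b / trials_b
    # Pooled proportion and standard error for a two-proportion z-test
    pooled = (shown_a + shown_b) / (trials_a + trials_b)
    se = sqrt(pooled * (1 - pooled) * (1 / trials_a + 1 / trials_b))
    return p_a / p_b, (p_a - p_b) / se

# Illustrative counts only: 1,000 simulated male and 1,000 simulated
# female job-seeking agents, with very different ad-impression counts.
ratio, z = ad_rate_ratio(shown_a=300, trials_a=1000, shown_b=50, trials_b=1000)
# ratio is roughly 6; z is far above conventional significance thresholds
```

The z-score matters because raw ratios can look dramatic by chance when impression counts are small; a significance test is what turns "men saw more ads" into evidence of bias.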
Most of the world's netizens know Google through its wildly popular consumer-facing products, like its search engine and the YouTube video-hosting platform. Yet Google's parent company Alphabet also operates a fascinating "think/do tank" called Jigsaw (formerly Google Ideas) that asks, "How can technology make the world safer?" Jigsaw is involved in an incredible array of projects, from fighting hate speech with deep learning to making the world's constitutions searchable (a project I was personally heavily involved in, building the technology infrastructure used to acquire, digitize, version and codify thousands of constitutions and amendments dating back more than 200 years). To most of us, distributed denial-of-service (DDoS) attacks are something we read about in the news periodically, when one of our favorite websites goes down. Yet in today's world of botnet-enabled mass DDoS attacks on free speech and the evolution of cyberwarfare, one Jigsaw project of particular interest is Project Shield, which offers free DDoS protection for news, human rights and election-monitoring websites, powered by Google's own global infrastructure.
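Project Shield's internals aren't public, but the general shape of a DDoS shield is a reverse proxy that sits in front of a fragile origin site, serves cached copies where it can, and refuses traffic from clients that exceed a reasonable request budget. A toy per-client token-bucket limiter sketches that last step (class and parameter names are illustrative assumptions, not Shield's design):

```python
import time
from collections import defaultdict

class TokenBucket:
    """Per-client rate limiter: allow bursts up to `capacity` requests,
    refilling `rate` tokens per second per client."""

    def __init__(self, rate=5.0, capacity=10.0):
        self.rate, self.capacity = rate, capacity
        self.tokens = defaultdict(lambda: capacity)  # client -> tokens left
        self.last = defaultdict(time.monotonic)      # client -> last refill

    def allow(self, client_ip):
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity
        self.tokens[client_ip] = min(
            self.capacity,
            self.tokens[client_ip] + (now - self.last[client_ip]) * self.rate)
        self.last[client_ip] = now
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True   # forward this request to the origin
        return False      # drop it: this client exhausted its budget

shield = TokenBucket(rate=5.0, capacity=10.0)
shield.allow("203.0.113.7")  # True until the client exhausts its burst budget
```

Per-client budgets alone won't stop a large botnet, where each bot stays under the limit; that is why a shield also needs caching and the sheer absorption capacity of infrastructure like Google's.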
Now a small subsidiary of Google named Jigsaw is about to release an entirely new type of response: a set of tools called Conversation AI. If it can find a path through the free-speech paradox, Jigsaw will have pulled off an unlikely coup: applying artificial intelligence to solve the very human problem of making people be nicer on the Internet. "Jigsaw recruits will hear stories about people being tortured for their passwords or of state-sponsored cyberbullying."
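Jigsaw hasn't published Conversation AI's models, but at its core this kind of tool is supervised text classification: learn from comments humans have labeled as harassing or acceptable, then score new ones. A toy Naive Bayes sketch on made-up labels (illustrative only; a production system would use far larger corpora and far stronger models):

```python
from collections import Counter, defaultdict
from math import log

def train(examples):
    """examples: iterable of (comment_text, label). Returns a model tuple."""
    word_counts = defaultdict(Counter)   # label -> word frequencies
    label_counts = Counter()             # label -> number of comments
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, label_counts, vocab

def classify(model, text):
    """Return the label with the highest log-probability for `text`."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label in label_counts:
        score = log(label_counts[label] / total)              # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            score += log((word_counts[label][w] + 1) / denom)  # Laplace-smoothed
        if score > best_score:
            best, best_score = label, score
    return best

# Tiny, entirely made-up training set:
comments = [
    ("you are an idiot and a loser", "toxic"),
    ("go away nobody wants you here", "toxic"),
    ("thanks for the thoughtful reply", "ok"),
    ("great point i learned something", "ok"),
]
model = train(comments)
```

Even this toy version hints at the free-speech paradox: the classifier has no notion of context or intent, only word statistics, so drawing the line between harassment and blunt disagreement is exactly where the hard human judgment remains.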