Opinion: When It Comes to AI, Better Safe Than Quick


Some people had a good laugh when Microsoft gave a chatbot its own Twitter account, only to watch it turn into a Holocaust-denying racist within 24 hours and get swiftly taken offline. The chatbot, called Tay, is a piece of software designed to converse with people without human involvement. It was equipped with artificial intelligence but proved easy for Twitter users to manipulate. Teaching such a bot is much like teaching an impressionable, unknowing child: artificial intelligence learns from the sum of its experiences, as human beings do. The more often a subject is discussed, an opinion expressed, or a particular wording used, the more likely the software is to treat it as normal and adopt it.
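To see why repetition matters, consider a deliberately crude sketch of frequency-based learning. This toy bot is an illustration of the general idea, not a description of how Tay actually worked: it simply counts the phrases it is exposed to and echoes whichever one it has seen most often, so a coordinated group of users can steer its output just by repeating themselves.

```python
from collections import Counter

class ToyChatBot:
    """A toy bot that 'learns' by counting the phrases it is exposed to
    and repeating the most frequent one. A simplified illustration of
    frequency-based learning, not Microsoft's actual system."""

    def __init__(self):
        self.phrase_counts = Counter()

    def learn(self, phrase):
        # Every exposure makes the phrase more likely to be repeated.
        self.phrase_counts[phrase.lower()] += 1

    def respond(self):
        # The bot treats its most common input as "normal" and uses it.
        if not self.phrase_counts:
            return ""
        return self.phrase_counts.most_common(1)[0][0]

bot = ToyChatBot()
# A handful of users repeating one message outweighs everything else.
for tweet in ["hello", "hello", "hello", "goodbye"]:
    bot.learn(tweet)
print(bot.respond())  # the majority phrase wins: "hello"
```

Real systems are far more sophisticated, but the vulnerability is the same in kind: whatever dominates the training input dominates the output.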
