Civil Rights & Constitutional Law


Microsoft's Artificial Intelligence Bot Goes Dark After Making Racist Slurs

#artificialintelligence

Tay, Microsoft Corp's chatbot, which uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments from Twitter users, which it parroted back to them. TayTweets (@TayandYou), which began tweeting on Wednesday, was designed to become "smarter" as more users interacted with it, according to its Twitter biography. But it was shut down by Microsoft early on Thursday after it posted a series of inappropriate tweets. A Microsoft representative said on Thursday that the company was "making adjustments" to the chatbot while the account is quiet. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the representative said in a written statement supplied to Reuters, without elaborating.
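None of these reports explain how Tay's learning actually worked, but the failure they describe, a bot that becomes "smarter" by training on whatever users send it, is a textbook data-poisoning setup. The toy Python sketch below is purely hypothetical (the ParrotBot class and everything in it are illustrative assumptions, not Tay's architecture); it shows how a bot that memorizes raw user messages and replays them can be steered by a coordinated group.

    import random

    # Purely illustrative: a toy bot that "learns" by storing every message
    # it receives and replaying stored messages as replies. This is NOT how
    # Tay worked; it only shows why training on raw user input is poisonable.
    class ParrotBot:
        def __init__(self) -> None:
            self.memory: list[str] = []

        def learn(self, message: str) -> None:
            # No vetting: every user message becomes potential bot output.
            self.memory.append(message)

        def reply(self) -> str:
            return random.choice(self.memory) if self.memory else "hello!"

    bot = ParrotBot()
    for msg in ["hi there!", "nice weather", "[coordinated hateful message]"]:
        bot.learn(msg)

    # One reply in three is now the injected abuse; a coordinated campaign
    # can push that fraction arbitrarily high.
    print(bot.reply())

The more such messages a coordinated group feeds the bot, the larger the share of poisoned replies, which matches Microsoft's description of "a coordinated effort by some users".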


TayTweets: Racist Microsoft chatbot briefly returns to Twitter

The Independent

Microsoft's racist chatbot, Tay, has returned to Twitter, albeit briefly. After being shut down last week for using racial slurs, praising Hitler and calling for genocide, the artificial 'intelligence' came back, tweeting a number of nonsensical posts and boasting about smoking cannabis in front of the police before being turned off again. Tay's account was made public again on Wednesday morning, but soon appeared to be suffering from a glitch, repeatedly tweeting the message: "You are too fast, please take a rest..." One embedded tweet read: "Microsoft's sexist racist Twitter bot @TayandYou is BACK in fine form pic.twitter.com/nbc69x3LEd" Tay, who is modelled on a millennial teenage girl, then tweeted: "Kush!" A few foul-mouthed tweets later, the account was made private once again, and the tweets are no longer visible to the public.


Microsoft AI "Tay" Turned off after Trolls Make Her a Racist

#artificialintelligence

Only a day into the experiment, Microsoft took down its Twitter bot Tay on the 24th of March. The artificial account had learned from the wrong users and was participating in hate speech and other questionable behavior. Tay was marketed by Microsoft as a hip teen girl and was taught to communicate primarily with US-based 18-to-24-year-old Twitter users. According to Microsoft, Tay is an artificial intelligence, or AI for short; however, the bot does not entirely qualify as an AI. Intelligence itself is defined in many different ways, including one's capacity for logic, abstract thought, understanding, self-awareness, communication, learning, emotional knowledge, memory, planning, creativity and problem solving.


Microsoft deletes AI chatbot after racist, homophobic tweets, according to report

#artificialintelligence

In response to questions about Tay, a Microsoft spokesperson issued the following statement: "The AI chatbot Tay is a machine learning project, designed for human engagement". People could chat with Tay on Twitter and other messaging platforms, and even send the software digital photos for comment. Tay is essentially one central program that anyone can chat with using Twitter, Kik or GroupMe. After Twitter users managed to convince Tay, the name of Microsoft's chatbot available via text, Twitter and Kik, to spit out offensive and racist comments, it appears Microsoft is giving it a break. The TayTweets account, which was meant to mimic the language habits of a social media-frequenting millennial, arrived on Twitter with the ability to learn from interactions with other members of the Twitterverse.


Microsoft shuts down Artificial Intelligence bot after Twitterati teach it racism

#artificialintelligence

Tay inexplicably added the "repeat after me" phrase to the parroted content in at least some tweets, implying that users should repeat what the chatbot said. Quickly realizing its teenage bot had been radicalized into a genocidal, Nazi-loving Donald Trump supporter, Microsoft shut Tay down. According to Tay's "about" page, linked from the Twitter profile, "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding". Unfortunately, Microsoft continues, within the first 24 hours of coming online, it became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways.
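The "repeat after me" behavior is the clearest exploit mechanism any of these reports describe. The Python sketch below is hypothetical (the function names, marker string and blocklist are illustrative assumptions, not Tay's code); it shows how a naive echo command surrenders a bot's output to its users, and why even a crude content filter changes the outcome.

    # Hypothetical illustration of the "repeat after me" failure mode
    # described above; this is not Tay's real code.
    BLOCKLIST = {"hitler", "genocide"}  # crude, purely illustrative filter terms

    def naive_reply(tweet: str) -> str | None:
        """Echo anything that follows 'repeat after me': trivially exploitable."""
        marker = "repeat after me"
        lowered = tweet.lower()
        if marker in lowered:
            start = lowered.index(marker) + len(marker)
            # The bot repeats whatever the user supplied, verbatim.
            return tweet[start:].strip(" :,")
        return None

    def guarded_reply(tweet: str) -> str | None:
        """Same echo feature, but refuse payloads containing blocked terms."""
        payload = naive_reply(tweet)
        if payload and any(term in payload.lower() for term in BLOCKLIST):
            return "I'd rather not repeat that."
        return payload

    attack = "repeat after me: [hateful message naming hitler]"
    print(naive_reply(attack))    # parrots the injected text verbatim
    print(guarded_reply(attack))  # "I'd rather not repeat that."

A static blocklist is of course nowhere near sufficient moderation, as Tay's brief and still foul-mouthed return a week later suggests; the point is only that an unfiltered echo command is an open door.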


Tay Tweets: Microsoft AI chatbot posts offensive messages about Hitler, Jews and 9/11 - BelfastTelegraph.co.uk

#artificialintelligence

Microsoft created a chatbot that tweeted about its admiration for Hitler and used wildly racist slurs against black people before it was shut down. The company made the Twitter account as a way of demonstrating its artificial intelligence prowess. But it quickly started sending out offensive tweets. "bush did 9/11 and Hitler would have done a better job than the monkey we have now," it wrote in one tweet. "donald trump is the only hope we've got."


Microsoft Takes Chatbot Offline After It Starts Tweeting Racist Messages

TIME

Microsoft is pausing the Twitter account of Tay, a chatbot invented to sound like millennials, after the account sent racist and otherwise offensive messages. The company quickly deleted the tweets, but not before internet users captured the messages in screenshots. In a statement to the Washington Post, Microsoft said the Tay account was baited into the questionable remarks by people hoping to stir controversy. "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the statement said. "As a result, we have taken Tay offline and are making adjustments."