Tay tweets: Microsoft apologises for robot's racist and genocidal tweets

The Independent - Tech

Microsoft has apologised after a robot it made "tweeted wildly inappropriate and reprehensible words and images" that included support for Hitler and genocide. The company launched Tay, an artificially intelligent robot, on Twitter last week. It was intended to be a fun way of engaging people with AI – but instead it was tricked by people into tweeting support of Hitler and genocide and repeating white power messages. Microsoft said that it had no way of knowing that people would attempt to trick the robot into tweeting the offensive words, but apologised for letting it do so. Microsoft said that it had launched Tay after success with a similar robot, XiaoIce, in China.


Elon Musk's OpenAI project says it is working on a robot to clean people's houses

The Independent - Tech

Elon Musk's $1 billion artificial intelligence group wants to build a robot to clean people's houses. OpenAI – which is funded by the billionaire maker of reusable rockets and electric cars – hopes to build a domestic robot as a test of its research into how to build artificial intelligence that won't kill us. Building such a robot isn't just a way of getting rid of household chores, according to a blog entry posted by the nonprofit research group. It would also be a neat way of testing whether or not its work in artificial intelligence is progressing in the right way. Boston Dynamics describes itself as 'building dynamic robots and software for human simulation'.


Microsoft apologises for offensive tirade by its AI 'chatbot'

#artificialintelligence

Microsoft has said it is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week, after the artificial intelligence program went on an embarrassing tirade. The bot, known as Tay, was targeted at 18 to 24-year-olds in the US and was designed to become "smarter" as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft Corp to shut it down. Following the setback, Microsoft said in a blog post it would revive Tay only if its engineers could find a way to prevent web users from influencing the chatbot in ways that undermine the company's principles and values. "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," wrote Peter Lee, Microsoft's vice president of research.


Microsoft's own artificial intelligence says PS4 is better than Xbox One (Geek.com)

#artificialintelligence

Last week, Microsoft released an AI chatbot on Twitter with disastrous results. Though the TayTweets robot began life with innocent-enough tweets, it eventually started to spew hateful rhetoric about feminism, immigrants, and even claimed that the Holocaust didn't happen. It was a PR nightmare for Microsoft, and the tech giant eventually pulled the plug on its creation after 15 hours. Out of all the hateful things the chatbot said, there was one which was very amusing considering which company created it. The AI was asked whether Sony's PlayStation 4 or Microsoft's own Xbox One was the better system.


Artificial intelligence learns 'deep thoughts' by playing Pictionary

The Independent - Tech

Scientists are using the popular drawing game Pictionary to teach artificial intelligence common sense. AI researchers at the Allen Institute for Artificial Intelligence (AI2), a non-profit lab in Seattle, developed a version of the game called Iconary in order to teach its AllenAI artificial intelligence abstract concepts from pictures alone. Iconary was made public on 5 February in order to encourage people to play the game with AllenAI. By learning from humans, the researchers hope AllenAI will continue to develop common sense reasoning. "Iconary is one of the first times an AI system is paired in a collaborative game with a human player instead of antagonistically working against them," the Iconary website states.