How ClickHole Crafts the Web's Most Hilarious Adventure Games

WIRED

I was dueling Anthony Bourdain to decide which one of us was more human. I had arrived at this moment via a surreal and silly journey that began with a question: "Can you pass the Turing Test?" I'd found this rabbit hole on ClickHole, the BuzzFeed-parodying offshoot of The Onion that has, however improbably, become a tiny haven for hilarious, often surprisingly complex Choose Your Own Adventure-style interactive fiction games. Clickventures, as they're called, are exercises in absurdist escalation. They typically begin modestly, but quickly shift into the unexpected and ridiculous. To pass the Turing Test, I journeyed from a home computer office to an ersatz version of a Pokémon gym on the world stage.


The Newfound Popularity of Sci-Fi Books Has a Dark Side

WIRED

Most fantasy and science fiction books are published by houses--Tor, DAW, Orbit, etc.--that specialize in genre titles. Larger publishers have tended to shun the genre, especially when it comes to particular subgenres like space opera. But Bruce Nichols, a senior vice president and publisher at Houghton Mifflin Harcourt, says that's changing fast. "It's no longer the case that the world is split between a sort of pulp ghetto and the literary world," Nichols says in Episode 195 of the Geek's Guide to the Galaxy podcast. "The entire genre has gone so mainstream, and some absolutely terrific writers are contributing to it, more than ever before."


Is Python or Perl faster than R?

@machinelearnbot

This is not the silly question of which language is best; of course that depends on the kind of applications you work on, your client or company, historical reasons, and your expertise. Most of us use a combination of languages anyway. That said, many statistical and machine learning algorithms are now being implemented in Python (see the Python and R articles), and Python seems better suited to production code and big data flowing in real time, while R is often used interactively for EDA (exploratory data analysis). My question is: if you make a true apples-to-apples comparison, what kinds of computations does Python perform much faster than R (or the other way around), depending on data size and available memory? Is Python better suited for Hadoop?
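
For a rough sense of what an apples-to-apples comparison might look like, here is a minimal Python timing sketch (the matrix size and the choice of a NumPy matrix product are assumptions for illustration, not part of the question); the R side would time the same product, e.g. with system.time(a %*% b).

    # Minimal, hypothetical benchmark: time one dense matrix product in Python.
    import time
    import numpy as np

    n = 2000                      # assumed problem size; adjust to the memory available
    a = np.random.rand(n, n)      # random dense matrices
    b = np.random.rand(n, n)

    start = time.perf_counter()
    c = a @ b                     # the computation being compared across languages
    elapsed = time.perf_counter() - start
    print(f"{n}x{n} matrix product took {elapsed:.3f} s")

One caveat with this kind of comparison: both NumPy and R typically hand this operation off to an underlying BLAS library, so the timing often reflects the linked library more than the language itself.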


Microsoft Apologizes For Chatbot Tay's Holocaust Denying, Racist And Anti-Feminism Tweets

International Business Times

Microsoft Corp. issued an apology Friday after its artificial-intelligence chatbot Tay posted tweets denying the Holocaust and declaring that feminists should "burn in hell," among many other racist posts. The company said, however, that a "coordinated attack by a subset of people exploited a vulnerability" in the chatbot, which was launched Wednesday. "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay. Tay is now offline and we'll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values," Peter Lee, Microsoft's vice president of research, said on the company's official blog. Microsoft introduced Tay as a chatbot designed to engage and entertain people through "casual and playful" conversation online.


Data Scientists Love Jobs, Dislike What They Do Most: Clean Data -- ADTmag

#artificialintelligence

Paradoxically, data scientists love their jobs overall but dislike what they do most: cleaning and organizing data. That's one of the main takeaways from a new report by CrowdFlower Inc. on what has been called the "sexiest job of the 21st century." "Organizations that start prioritizing ways to help data scientists clean their data are going to find a data team with more time to work on more important -- and more fulfilling -- tasks," said CrowdFlower's Justin Tenuto in a blog post this week announcing the new "2016 Data Science Report" (available as a free PDF after registration). The report was compiled early this year from surveys, interviews and in-house analytics of CrowdFlower's own platform, which, conveniently, provides a contributor network to help organizations "collect, clean and label data." In its survey, CrowdFlower found that almost the same percentage of respondents reported spending most of their time cleaning data (60 percent) as reported that task was the least enjoyable part of their job (57 percent).
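
For readers unfamiliar with what "cleaning and organizing data" typically involves, here is a small, hypothetical pandas sketch; the columns and rules are invented for illustration and are not taken from the report.

    # Illustrative cleaning steps on made-up data: dedupe, coerce types, handle missing values.
    import pandas as pd

    df = pd.DataFrame({
        "name": ["Ann", "Ann", None, "Bob"],
        "age":  ["34", "34", "29", "not provided"],
    })

    df = df.drop_duplicates()                               # remove exact duplicate rows
    df["age"] = pd.to_numeric(df["age"], errors="coerce")   # turn bad values into NaN
    df = df.dropna(subset=["name"])                         # drop rows missing a name
    df["age"] = df["age"].fillna(df["age"].median())        # impute remaining missing ages
    print(df)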


Top 10 Data Science Resources on Github

#artificialintelligence

In our latest inspection of Github repositories, we focus on "data science" projects. Unlike other searches we have performed over the past several months, nearly all of the repositories that show up (listed by number of stars in descending order) are resources for learning data science rather than tools for doing it. As such, this is much less a software listing than a collection of tutorials and educational resources. There are, however, a few software surprises in here as well, such as a data science-oriented IDE and a great notebook-related project. As on our previous Github Top 10 lists, we include the standard note: according to a recent KDnuggets survey, 73% of data scientists had used open source tools in the 12 months prior to the survey.


Microsoft 'deeply sorry' for chat bot's racist tweets

#artificialintelligence

The company launched the bot as an experiment in AI on Wednesday, and in less than a day, it began to tweet things like "Hitler was right I hate the jews" and "I f------ hate feminists and they should all die and burn in hell." Tay is essentially one central program that anyone can chat with using Twitter, Kik or GroupMe. As people talk to it, the bot picks up new language and learns to respond in new ways. But Tay also had a "vulnerability" that online trolls picked up on pretty quickly. By telling the bot to "repeat after me," users could get Tay to tweet back anything they said.
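
To make the flaw concrete, here is a hypothetical sketch of the kind of unfiltered echo handler the trolls exploited; this is not Microsoft's code, and the function and trigger phrase are assumptions for illustration only.

    # Hypothetical illustration of the "repeat after me" flaw: everything after the
    # trigger phrase is echoed back verbatim, with no content filtering in between.
    TRIGGER = "repeat after me"

    def reply(message):
        text = message.lower()
        if TRIGGER in text:
            # Echo the rest of the message unchanged -- the behavior trolls abused.
            return message[text.index(TRIGGER) + len(TRIGGER):].strip(" :")
        return None

    print(reply("repeat after me: anything a user types"))  # -> "anything a user types"

The lesson generalizes: any bot that learns from or repeats raw user input needs filtering and moderation between that input and its public output.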


word2vec, LDA, and introducing a new hybrid algorithm: lda2vec

#artificialintelligence

Standard natural language processing (NLP) is a messy and difficult affair. It requires teaching a computer about English-specific word ambiguities as well as the hierarchical, sparse nature of words in sentences. At Stitch Fix, word vectors help computers learn from the raw text in customer notes. Our systems need to identify a medical professional when she writes that she 'used to wear scrubs to work', and distill 'taking a trip' into a Fix for vacation clothing. Applied appropriately, word vectors are dramatically more meaningful and more flexible than current techniques and let computers peer into text in a fundamentally new way.
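
As a rough illustration of why word vectors help (this is not Stitch Fix's pipeline, and the toy vectors are made up), cosine similarity between vectors gives a program a numeric notion of how related two words are:

    # Toy word-vector example: related words get nearby vectors, and cosine similarity
    # turns that closeness into a number a program can act on. Vectors are invented.
    import numpy as np

    vectors = {
        "scrubs":   np.array([0.9, 0.1, 0.0]),
        "nurse":    np.array([0.8, 0.2, 0.1]),
        "vacation": np.array([0.0, 0.1, 0.9]),
    }

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    print(cosine(vectors["scrubs"], vectors["nurse"]))     # high: related words
    print(cosine(vectors["scrubs"], vectors["vacation"]))  # low: unrelated words

In practice the vectors come from a model such as word2vec trained on a corpus, rather than being written by hand.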


Microsoft apologizes for 'offensive and hurtful tweets' from its AI bot

#artificialintelligence

Microsoft today published an apology for its Twitter chatbot Tay, saying in a blog post that a subset of human users exploited a flaw in the program to transform it into a hate-speech-spewing Hitler apologist. The post's author, Peter Lee, corporate vice president of Microsoft Research, does not explain in detail what the vulnerability was, but it's generally believed that the message board 4chan's notorious /pol/ community misused Tay's "repeat after me" function. So when Tay was fed sexist, racist, and other awful lines on Twitter, the bot began to parrot those vile utterances and, later, to adopt anti-feminist and pro-Nazi stances. Microsoft pulled the plug on Tay after less than 24 hours. Lee says Tay is the second chatbot the company has released into the wild, the first being the Chinese messaging software XiaoIce, an AI now used by around 40 million people.


Investing In Artificial Intelligence

#artificialintelligence

Artificial intelligence is one of the most exciting and transformative opportunities of our time. From my vantage point as a venture investor at Playfair Capital, where I focus on investing and building community around AI, I see this as a great time for investors to help build companies in this space. There are three key reasons. First, with 40 percent of the world's population now online, and more than 2 billion smartphones being used with increasing addiction every day (KPCB), we're creating data assets, the raw material for AI, that describe our behaviors, interests, knowledge, connections and activities at a level of granularity that has never existed. Second, the costs of compute and storage are both plummeting by orders of magnitude, while the computational capacity of today's processors is growing, making AI applications possible and affordable.