Amazon's voice assistant Alexa has become a hugely popular and fast-growing business. In fact, David Limp, the Amazon senior vice president who oversees Alexa and all of Amazon's devices, says that Alexa is rapidly adding "skills," with more than 1,000 people working on it. On Tuesday, at Fortune's Brainstorm Tech conference, Limp spoke with Fortune's Adam Lashinsky about everything from the inspiration for Alexa (hint: think Star Trek) and the origin of its name to where the business is heading. Here is the lightly edited transcript.

Dave Limp: The device business is less about building hardware for customers and more about building services behind that hardware. So the original vision for Kindle was to deliver any book ever written in less than 60 seconds, and that was all about creating a cloud-based service with a great catalogue of books, great selection, and great prices. And as we've rolled out devices since then, everything from Fire TV to, as you mentioned, Echo and Alexa and everything in between, it's about creating that backend service that constantly improves and adds value for customers, and isn't just a gadget but instead a full end-to-end service that delivers what customers want.
The Mad Men era of advertising is long behind us, but some of its lessons are timeless: not least those imparted by David Ogilvy, the founder of Ogilvy & Mather, who was once described by Time magazine as "the most sought-after wizard in today's advertising industry". That was in 1962, but even today, 17 years after his death, Ogilvy remains one of advertising's most revered minds, universally acknowledged as the father of modern advertising and credited with pioneering a unique style of ad that didn't insult the intelligence of the individual. But what could today's advertising leaders, or indeed those just starting out on their advertising careers, learn from Ogilvy? The Drum decided to find out, teaming with IBM's Watson to analyse Ogilvy's myriad writings and talks to draw out insights and advice. A man of many words, Ogilvy became the authority on advertising in his day, penning a number of books on the subject and representing the industry in numerous TV and newspaper interviews.
It's a privilege to have Rama Akkiraju, IBM distinguished engineer and master inventor, participate as a Vision and Opportunity panelist at the 2016 Sentiment Analysis Symposium. I organize the symposium – this year's event takes place July 12 in New York – and recognize the many ways IBM has, over the years, expanded what's possible in the realm of what I'd characterize as "human data." "My team at IBM has been focused on developing technology to better understand people at a deeper level based on sentiment, emotion, attitude, and personality," said Rama. "With our work with Watson APIs – such as Tone Analyzer, Personality Insights, Emotion Analysis, and Sentiment Analysis – we're working to enable more compassion, engagement, and personalization in conversations across various channels." IBM's Marie Wallace, a 2014 sentiment symposium speaker, relates in a blog article that she "joined IBM in 2001 to build the next generation of NLP technology for IBM… the 3rd generation of IBM LanguageWare, which initially started back in the '80s." And I wrote, myself, in a 2008 InformationWeek article, BI at 50 Turns Back to the Future, about 1950s work by IBM researcher Hans Peter Luhn on the creation of business intelligence via text analysis.
A Tokyo-based nonprofit organization will begin offering online Japanese-language classes this month to children from abroad who need help keeping up in class at Japanese elementary and junior high schools. Youth Support Center's YSC Global School in Fussa, western Tokyo, is set to offer instruction by language education experts, delivered via personal computers or tablets, to young foreign nationals living anywhere in Japan. The NPO will cooperate with municipalities and schools that lack sufficient resources to teach Japanese to such children. Yuran Nakajima, 16, watched a lecture on a PC monitor at YSC's office in Fussa during a trial session in September. Three other students sat in a classroom elsewhere in the city, where the lesson was being taught.
Computational linguists and computer scientists, among them University of Texas professor Jason Baldridge, have been working for over fifty years toward algorithmic understanding of human language. Full understanding remains out of reach; they are, however, doing a pretty good job with important tasks such as entity recognition, relation extraction, topic modeling, and summarization. These tasks are accomplished via natural language processing (NLP) technologies that implement linguistic, statistical, and machine learning methods. Voice response systems and personal assistants -- Siri, Google Now, Microsoft Cortana, Amazon Alexa -- rely on NLP to interpret requests and formulate appropriate responses. Search and recommendation engines apply NLP, as do applications ranging from pharmaceutical drug discovery to national security counter-terrorism systems.
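To make "entity recognition" a little more concrete, here is a deliberately naive, rule-based sketch in Python that flags runs of capitalized words as candidate named entities. The function name and heuristic are my own invention for illustration only; production systems such as those behind Siri or Alexa rely on statistical and machine-learning models, not rules like this.

```python
import re

def naive_entity_recognition(text):
    """Toy entity spotter: treat runs of capitalized words as
    candidate named entities. A crude heuristic drops single
    capitalized words that merely begin a sentence, so some real
    names (and all-caps acronyms) will be missed."""
    # Runs of one or more Capitalized tokens, e.g. "Google Now".
    pattern = r"\b(?:[A-Z][a-z]+(?:\s[A-Z][a-z]+)*)\b"
    candidates = re.findall(pattern, text)
    # First word of each sentence, used to filter false positives.
    sentence_starts = {
        s.strip().split()[0]
        for s in re.split(r"[.!?]", text)
        if s.strip()
    }
    # Keep multi-word runs, and single words not at a sentence start.
    return [c for c in candidates
            if " " in c or c not in sentence_starts]

print(naive_entity_recognition(
    "Siri and Google Now rely on NLP. Amazon Alexa does too."))
# → ['Google Now', 'Amazon Alexa']
```

The gap between this toy and a real system is the point: statistical and machine-learning methods exist precisely because capitalization rules alone cannot distinguish names from ordinary sentence-initial words.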