The fraudulent claims made by IBM about Watson and AI. They are not doing "cognitive computing" no matter how many times they say they are.

#artificialintelligence

I was chatting with an old friend yesterday and he reminded me of a conversation we had nearly 50 years ago. I had tried to explain to him what I did for a living, and he was trying to understand why getting computers to understand language was more complicated than key word analysis. I explained that sentences are built from words, but that there are concepts underlying those sentences: people don't really use words in their minds except to get to the underlying ideas, and computers were having a hard time with that. Fifty years later, key words still dominate the thinking of people who try to get computers to deal with language. But this time, the key word people have deceived the general public by claiming that this is thinking, that AI is here, and that, by the way, we should be very afraid, or very excited, I forget which.
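As a rough illustration of the distinction the author is drawing, here is a minimal, hypothetical sketch (not anything IBM or Watson actually does): a key word matcher treats two sentences as similar only when they share surface tokens, so paraphrases that express the same underlying idea in different words score as unrelated.

```python
# Toy illustration: key word overlap misses shared meaning across paraphrases.
# Hypothetical example only; not any production system's method.

def keyword_overlap(a: str, b: str) -> float:
    """Jaccard overlap of the surface words in two sentences."""
    wa = set(w.strip(".,") for w in a.lower().split())
    wb = set(w.strip(".,") for w in b.lower().split())
    return len(wa & wb) / len(wa | wb)

s1 = "The physician prescribed medication for the illness."
s2 = "The doctor gave the patient drugs to treat the disease."

# Nearly the same idea, almost no shared key words:
print(keyword_overlap(s1, s2))  # ~0.08 -- the matcher sees them as unrelated
```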


Emotion Recognition and Sentiment Analysis Market to Reach $3.8 Billion by 2025

#artificialintelligence

Significant advances have been made during the past few years in the ability of artificial intelligence (AI) systems to recognize and analyze human emotion and sentiment, owing in large part to accelerated access to data (primarily social media feeds and digital video), cheaper compute power, and evolving deep learning capabilities combined with natural language processing (NLP) and computer vision. According to a new report from Tractica, these trends are beginning to drive growth in the market for sentiment and emotion analysis software. Tractica forecasts that worldwide revenue from sentiment and emotion analysis software will increase from $123 million in 2017 to $3.8 billion by 2025. The market intelligence firm anticipates that this growth will be driven by several key industries, including retail, advertising, business services, healthcare, and gaming, and its analysis identifies the top use case categories for sentiment and emotion analysis. "A better understanding of human emotion will help AI technology create more empathetic customer and healthcare experiences, drive our cars, enhance teaching methods, and figure out ways to build better products that meet our needs," says principal analyst Mark Beccue.
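For readers unfamiliar with what this category of software actually does, the sketch below runs a minimal sentiment-scoring pass with NLTK's VADER analyzer. It is chosen purely as an illustration and is not tied to any vendor or product in Tractica's report.

```python
# Minimal sentiment-analysis illustration using NLTK's VADER lexicon.
# Illustrative only; not the method of any vendor covered in the report.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()

reviews = [
    "The checkout process was fast and the staff were wonderful.",
    "Support kept me on hold for an hour and never solved the problem.",
]

for text in reviews:
    scores = analyzer.polarity_scores(text)  # neg/neu/pos plus a compound score
    label = ("positive" if scores["compound"] > 0.05
             else "negative" if scores["compound"] < -0.05
             else "neutral")
    print(f"{label:>8}  {scores['compound']:+.2f}  {text}")
```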


A man famous in China for his superhuman memory beat an AI in a facial recognition contest

#artificialintelligence

At a competition in China to see who is better at recognizing faces, man or machine, Wang Yuheng, representing the humans, emerged victorious. Wang is famous in China for his photographic memory. He successfully identified a specific glass of water out of 520 seemingly identical ones in a Chinese reality TV show. He also reportedly helped police crack a case by extracting "hidden clues" from surveillance camera footage, thanks to his exceptional observational skills. But Wang's competitor, "Mark," is no less smart.
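To ground what the machine side of such a contest is doing, here is a minimal, hedged sketch of face matching with the open-source face_recognition library (dlib-based embeddings). It only illustrates the general technique and is not the actual system ("Mark") used in the contest; the file names are hypothetical.

```python
# Minimal face-matching sketch using the open-source face_recognition library.
# Illustrative only; not the contest system. File names are hypothetical.
import face_recognition

known_image = face_recognition.load_image_file("reference.jpg")
probe_image = face_recognition.load_image_file("probe.jpg")

known_encodings = face_recognition.face_encodings(known_image)
probe_encodings = face_recognition.face_encodings(probe_image)

if known_encodings and probe_encodings:
    # Distance between 128-d embeddings; smaller means more similar faces.
    distance = face_recognition.face_distance([known_encodings[0]], probe_encodings[0])[0]
    match = face_recognition.compare_faces([known_encodings[0]], probe_encodings[0])[0]
    print(f"distance={distance:.3f}  same person? {match}")
else:
    print("No face detected in one of the images.")
```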


Affect-Driven Generation of Expressive Musical Performances

AAAI Conferences

These theories of musical perception and musical understanding are the basis of the system's computational model of musical knowledge. SaxEx is implemented in Noos (Arcos & Plaza 1997; 1996), a reflective object-centered representation language designed to support knowledge modeling of problem solving and learning. In our previous work on SaxEx (Arcos, López de Mántaras, & Serra 1998) we had not taken into account the possibility of exploiting the affective aspects of music to guide the retrieval step of the CBR process. In this paper, we discuss the introduction of labels of an affective nature (such as "calm", "tender", "aggressive", etc.) as a declarative bias in the Identify and Search subtasks of the Retrieval task (see Figure 2). Background: in this section, we briefly present the elements underlying SaxEx that are necessary to understand the system.
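As a rough sketch of the idea of using affective labels as a declarative bias in case-based retrieval, the hypothetical code below first identifies stored cases carrying the requested label and then searches them by feature similarity. It is an invented illustration with made-up names and features, not the actual SaxEx/Noos implementation.

```python
# Hypothetical sketch of affect-biased case retrieval in a CBR system.
# Not the actual SaxEx/Noos implementation; names and features are invented.
from dataclasses import dataclass

@dataclass
class Case:
    name: str
    affect: str             # affective label, e.g. "calm", "tender", "aggressive"
    features: list[float]   # toy feature vector standing in for musical analysis

def similarity(a: list[float], b: list[float]) -> float:
    """Simple inverse-distance similarity between feature vectors."""
    return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5)

def retrieve(cases: list[Case], query: list[float], affect_bias: str, k: int = 2) -> list[Case]:
    """Identify cases carrying the requested affective label, then search
    for the most similar ones -- the label acts as a declarative bias."""
    identified = [c for c in cases if c.affect == affect_bias] or cases
    return sorted(identified, key=lambda c: similarity(c.features, query), reverse=True)[:k]

case_base = [
    Case("phrase-A", "calm", [0.1, 0.2, 0.3]),
    Case("phrase-B", "aggressive", [0.9, 0.8, 0.7]),
    Case("phrase-C", "calm", [0.2, 0.1, 0.4]),
]

best = retrieve(case_base, query=[0.15, 0.15, 0.35], affect_bias="calm")
print([c.name for c in best])  # calm phrases ranked by feature similarity
```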