Emotion


Communication Re-Imagined with Emotion AI - ReadWrite

#artificialintelligence

There has long been a chasm between what we perceive artificial intelligence to be and what it can actually do. Our film, literature, and video game representations of "intelligent machines" depict AI as detached but highly intuitive interfaces. Emotion AI promises to re-imagine communication: as these artificial systems are integrated into our commerce, entertainment, and logistics networks, we are witnessing the emergence of emotional intelligence in machines, smarter systems with a better understanding of how humans feel and why they feel that way.


Realeyes raises $12.4 million to help brands detect emotion using AI on facial expressions

#artificialintelligence

Artificial emotional intelligence, or "emotion AI," is emerging as a key component of the broader AI movement. The general idea is this: it's all very well having machines that can understand and respond to natural-language questions, and even beat humans at games, but until they can decipher non-verbal cues such as vocal intonation, body language, and facial expressions, humans will always have the upper hand in understanding other humans. It's against that backdrop that countless companies are working to improve computer vision and voice analysis techniques, to help machines detect the intricate and finely balanced emotions of flesh-and-blood Homo sapiens. One of them is Realeyes, which helps big brands such as AT&T, Mars, Hershey's, and Coca-Cola gauge human emotions through the cameras of desktop computers and mobile devices. The London-based startup, founded in 2007, today announced a fresh $12.4 million round of funding from Draper Esprit; the venture arm of Japanese telecom giant NTT Docomo; Japanese VC fund Global Brain; Karma Ventures; and The Entrepreneurs Fund.


AI is coming: a short story of emotion recognition starring Game of Thrones Tooploox

#artificialintelligence

How difficult can it be to detect rectangles in an image? Well… it can be nearly impossible if there are no actual rectangles at all. The edges between a YouTube video's background and the overlaid GoT scenes are visible only because of an optical illusion called illusory contours. For this task, we had to rely a little on the incredible vision of humans and make some annotations. Thankfully, it was enough to manually mark the coordinates on a single frame per YouTube video.
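The trick described above — annotate one frame, reuse the coordinates everywhere — works because the overlay region does not move between frames. A minimal sketch of that idea, with hypothetical frame data (nested lists standing in for grayscale images) and made-up corner coordinates:

```python
# Hypothetical sketch: propagate one frame's manual annotation to a whole
# video. Since the overlaid scene never moves, a single pair of corner
# coordinates, marked by a human once, crops every frame.

def crop_region(frame, top_left, bottom_right):
    """Extract the annotated rectangle from one frame."""
    (y0, x0), (y1, x1) = top_left, bottom_right
    return [row[x0:x1] for row in frame[y0:y1]]

def crop_video(frames, top_left, bottom_right):
    """Apply the same manually marked coordinates to every frame."""
    return [crop_region(f, top_left, bottom_right) for f in frames]

# Toy 4x4 "video" of two frames; the overlay occupies rows 1-2, cols 1-2.
frames = [[[r * 10 + c for c in range(4)] for r in range(4)] for _ in range(2)]
clips = crop_video(frames, top_left=(1, 1), bottom_right=(3, 3))
print(clips[0])  # → [[11, 12], [21, 22]]
```

In a real pipeline the frames would come from a video decoder and the crop would be a NumPy slice, but the annotation-propagation logic is the same.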


We Need Emotional Intelligence with our Edtech Learning & Technology News

#artificialintelligence

In a groundbreaking Stanford study, researchers Jason Okonofua, David Paunesku, and Gregory Walton demonstrated through a series of three experiments that discipline problems are reduced when teachers think more empathically about student misbehavior. At the other end of the spectrum is the punitive mindset, in which teachers are encouraged to punish students, often as a result of zero-tolerance policies in schools.


Amazon is building a voice-activated wearable that can 'read human emotions' and suggest products

Daily Mail - Science & tech

Amazon already knows a lot about its users, thanks to the plethora of data gathered from Alexa-equipped devices and the millions of purchases made on its e-commerce site. But soon, the tech giant's AI could be able to do more than just predict users' morning commute or notify them when they run out of toilet paper. Amazon is in the process of developing a voice-activated wearable device that can recognize human emotions using a variety of signals, Bloomberg reported. The device would be worn on the wrist and could be equipped with microphones and voice-detection software that allow it to interpret human emotions.


Amazon is reportedly working on an Alexa-powered wearable that reads human emotions

USATODAY - Tech Top Stories

Amazon Echo devices are compatible with a multitude of smart home products. Amazon is reportedly developing a voice-activated wearable device that can recognize human emotions. If successful, the health product could help the company improve its targeted advertisements and make better product recommendations, reports Bloomberg. The unnamed device could also advise humans on how to better interact with others. A source showed Bloomberg internal Amazon documents that revealed a few details about the futuristic health and wellness product.


Emotion Recognition in Conversation: Research Challenges, Datasets, and Recent Advances

arXiv.org Artificial Intelligence

Emotion is intrinsic to humans, and emotion understanding is consequently a key part of human-like artificial intelligence (AI). Emotion recognition in conversation (ERC) is becoming an increasingly popular research frontier in natural language processing (NLP) because it can mine opinions from the plethora of publicly available conversational data on platforms such as Facebook, YouTube, Reddit, Twitter, and others. It also has potential applications in health-care systems (as a tool for psychological analysis), in education (understanding student frustration), and more. Additionally, ERC is extremely important for generating emotion-aware dialogues, which require an understanding of the user's emotions. Catering to these needs calls for effective and scalable conversational emotion-recognition algorithms. However, ERC is a difficult problem to solve because of several research challenges. In this paper, we discuss these challenges and shed light on recent research in the field. We also describe the drawbacks of these approaches and discuss why they fail to overcome the research challenges of ERC.


Comparing Emotion Recognition Tech: Microsoft, Neurodata Lab, Amazon, Affectiva

#artificialintelligence

Automated emotion recognition has been with us for some time. Ever since it entered the market, it has never stopped getting more accurate. Even the tech giants have joined the race, releasing their own emotion recognition software after smaller startups successfully did the same. We set out to compare the best-known algorithms. Emotions are subjective and variable, so when it comes to accuracy in emotion recognition, matters are not so clear-cut.
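One reason accuracy is "not so clear-cut" is that there is no single ground truth: human annotators themselves disagree about what emotion a face or voice conveys. For that reason, chance-corrected agreement measures such as Cohen's kappa are often used alongside raw accuracy. A minimal sketch (the labels and clips are made up for illustration; this is not from the article being summarized):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two emotion annotators."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Fraction of items where the two annotators agree outright.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    # Probability both annotators pick the same label purely by chance.
    expected = sum(freq_a[l] * freq_b[l] for l in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two annotators labelling the same five hypothetical clips.
a = ["happy", "sad", "angry", "happy", "neutral"]
b = ["happy", "sad", "happy", "happy", "neutral"]
print(round(cohens_kappa(a, b), 3))  # → 0.706
```

Here the annotators agree on 4 of 5 clips (80% raw agreement), yet the kappa of about 0.71 is noticeably lower, because some of that agreement would occur by chance; the same correction applies when scoring an algorithm against a single human labeller.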


The heart rate meme will help you express a staggering range of emotions

Mashable

It is probably safe to assume there is now a meme for every single human emotion. Here's one with a particularly broad scope: It's meant to be used for any scenario that makes your heart rate go up. Of course, this could be loads of things. Perhaps you're feeling anxious, or scared, or excited. Perhaps you are simply experiencing the natural physical effects of exercise.