What is It? How Can a Machine Exhibit It? "It's about thinking. The main theory is that emotions are nothing special. Each emotional state is a different style of thinking. So it's not a general theory of emotions, because the main idea is that each of the major emotions is quite different. They have different management organizations for how you are thinking you will proceed."
"Because the main point of the book [The Emotion Machine] is that it's trying to make theories of how thinking works. Our traditional idea is that there is something called 'thinking' and that it is contaminated, modulated or affected by emotions. What I am saying is that emotions aren't separate."
– Marvin Minsky, The Emotion Machine (2006).
No longer the realm of science fiction, computers can now recognize how we feel and analyze our real emotions, regardless of what we say. Seattle-based AI start-up SilverLogic Labs has developed an emotion recognition technology that analyzes and detects human emotions in ways that can predict how people will react or behave. The company claims that this technology can predict emotions and human motivators "more accurately than any other technology available."
Many people get frustrated with technology when it malfunctions or is counterintuitive. The last thing people might expect is for that same technology to pick up on their emotions and engage with them differently as a result. All of that is now changing. Computers are increasingly able to figure out what we're feeling, and it's big business. A recent report predicts that the global affective computing market will grow from $12.2 billion in 2016 to $53.98 billion by 2021.
Octavia, a humanoid robot designed to fight fires on Navy ships, has mastered an impressive range of facial expressions. When she's turned off, she looks like a human-size doll. She has a smooth white face with a snub nose. Her plastic eyebrows sit evenly on her forehead like two little capsized canoes. When she's on, however, her eyelids fly open and she begins to display emotion.
By its very nature, virtual reality is an immersive medium. But for Rama Allen, that bar is higher. The interactive artist and Executive Creative Director at The Mill has made a name for himself leading interdisciplinary teams of designers, filmmakers, coders, editors, engineers and VFX artists to create new kinds of cinematic experiences. At the inaugural Engadget Experience, a tech-art installation happening in LA next month, Allen will share some of his strangest creations, including a collaboration with an emotional AI; a VR experience that uses biometrics for levitation; a sculpting tool for the human voice; and a mixed-reality galactic journey to spread peace across the universe.
Dog owners swear that their furry best friend is in tune with their emotions. Now it seems this feeling of interspecies connection is real: dogs can smell your emotional state, and adopt your emotions as their own. Science had already shown that dogs can see and hear the signs of human emotions, says Biagio D'Aniello of the University of Naples "Federico II", Italy. But nobody had studied whether dogs could pick up on olfactory cues from humans. "The role of the olfactory system has been largely underestimated, maybe because our own species is more focused on the visual system," says D'Aniello.
Until recently, I believed emotional intelligence would remain one of our core human advantages after artificial intelligence takes over all tasks requiring memorization and logic. Over the past few years, I've focused my studies on emotionally intelligent algorithms, as it is the business of my startup, Inbot. While we humans continue to struggle to understand each other, emotionally intelligent AI has advanced rapidly. These algorithms already know what your desires, biases and emotional triggers are, based on your communication, friends and cultural context.
When the term "intelligence" comes up in regular conversation, most of us associate it with a person's capacity to acquire knowledge and new skills. But while IQ is useful, it's also clear that emotional intelligence (EQ) can be a difference maker in any professional role. Today's infographic comes from Aumann Bender & Associates, and it defines emotional intelligence while explaining the benefits of higher EQ in both qualitative and quantitative terms. In fact, emotional intelligence helps explain why, 70% of the time, a person with an average IQ can outperform a person with a higher one.
In RIOT 2, an interactive film by Karen Palmer, controlling these emotions is the key to your escape. Yet the ongoing melding of games and film into interactive narratives raises the question of how we should control these new experiences naturally. "Conversation, facial expression, intonation of our voice, physical gesture -- all of those are the natural language of human interaction," Palmer, originally from London, said. "In my opinion, fear is the most powerful emotion."
She shared her bold vision of making machines "emotion-aware" through understanding human emotions, gestures, conversations, facial expressions and tone of voice. As we design AI products and services, we need to embed in them not just high IQ but also high EQ, because emotions are a big part of human life. She talked about building the world's largest emotion repository to train and validate the company's emotion detection technology. Enterprises have lots of data but very little labeled data, and thus enterprise AI is mostly about learning from small data.
Developed using deep learning, the technology observes changes in tone, volume, speed, and voice quality, and uses these cues to recognize emotions such as anger, laughter, and arousal in recorded speech. "The addition of Emotion AI for speech builds on Affectiva's existing emotion recognition technology for facial expressions, making us the first AI company to allow for a person's emotions to be measured across face and speech," Rana el Kaliouby, co-founder and CEO of Affectiva, told Digital Trends. "We've set out to develop multi-modal Emotion AI that can detect emotion the way humans do, from multiple communication channels." Affectiva developed its voice recognition system by collecting naturalistic speech data from a variety of sources, including commercially available databases.
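To make the feature vocabulary above concrete: a minimal sketch of extracting the kinds of acoustic cues mentioned (volume, tone, speaking activity) from raw audio, using only NumPy. This is an illustration, not Affectiva's method: the function names (`frame_features`, `summarize`), the zero-crossing rate as a crude tone proxy, and the synthetic test signal are all assumptions made for the example; a real system would feed richer features into a trained deep model.

```python
import numpy as np

def frame_features(signal, frame_len=400, hop=160):
    """Per-frame RMS energy (a volume measure) and zero-crossing
    rate (a crude tone/pitch proxy) for a mono audio signal."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        rms = np.sqrt(np.mean(frame ** 2))
        # Each sign change contributes |diff| == 2, so divide by 2
        # to count crossings per sample.
        zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2
        feats.append((rms, zcr))
    return np.array(feats)

def summarize(feats):
    """Clip-level summary: mean/spread of volume and tone proxy,
    plus the fraction of high-energy frames (a speaking-activity
    proxy related to speed of delivery)."""
    rms, zcr = feats[:, 0], feats[:, 1]
    active = rms > 0.5 * rms.mean()
    return {
        "volume_mean": float(rms.mean()),
        "volume_std": float(rms.std()),
        "tone_mean": float(zcr.mean()),
        "activity_ratio": float(active.mean()),
    }

# Demo on a synthetic one-second "utterance": a 220 Hz tone with
# slow amplitude modulation, standing in for recorded speech.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
signal = 0.5 * np.sin(2 * np.pi * 220 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t))
summary = summarize(frame_features(signal))
print(summary)
```

In a full pipeline these clip-level summaries (or, more realistically, the frame sequences themselves) would be the input to a classifier trained on labeled speech, which is where the large labeled emotion repositories described above come in.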