What is It? How Can a Machine Exhibit It? "It's about thinking. The main theory is that emotions are nothing special. Each emotional state is a different style of thinking. So it's not a general theory of emotions, because the main idea is that each of the major emotions is quite different. They have different management organizations for how you are thinking you will proceed."
"Because the main point of the book [The Emotion Machine] is that it's trying to make theories of how thinking works. Our traditional idea is that there is something called 'thinking' and that it is contaminated, modulated or affected by emotions. What I am saying is that emotions aren't separate."
– Marvin Minsky, The Emotion Machine (book and draft), 2006.
Founded in 2012, Israeli startup Beyond Verbal has taken in $10.1 million across four rounds of funding to develop a technology that "analyzes emotions from vocal intonations". Like CrowdEmotion, nViso's technology tracks the movement of 43 facial muscles using a simple webcam and then uses AI to interpret the viewer's emotions. Another company uses a branch of artificial intelligence called natural language processing (NLP) to capture people's emotions, social concerns, thinking styles, psychology, and even their use of parts of speech. Yet another startup developed a technique to "read" human emotional state, called Transdermal Optical Imaging (TOI), which uses a conventional video camera to extract information from the blood flow beneath the skin of the face.
As the co-founder and CEO of Affectiva, el Kaliouby is on a mission to expand what we mean by "artificial intelligence" and create intelligent machines that understand our emotions. The new AI category el Kaliouby and her team at Affectiva are spearheading is "Emotion AI," defining a new market by pursuing two goals: allowing machines to adapt to human emotions in real time, and providing insights and analytics so organizations can understand how people engage emotionally in the digital world. Earlier, she had read Picard's Affective Computing, published in 1997, and became "super-fascinated by the idea that a computer can read people's emotions." For her dissertation, el Kaliouby used the autism research center's data to train a computer model to recognize complex mental states "with an accuracy and speed that are comparable to that of human recognition."
California nonprofit Worksafe, a worker safety advocacy group, recently made headlines when it reported that the injury rate at Tesla's Fremont, California, plant was more than 30 percent higher than the industry average in 2014 and 2015. A recent email Musk sent to employees indicates just how seriously he's taking the issue. Emotional intelligence, the ability to make emotions work for you instead of against you, is an essential quality of effective leaders. To personally meet every injured employee and actually learn how to perform the task that caused that person's injury is remarkable for the CEO of any company.
With the goal of making human thought and decision making a mechanical process, algorithms and networks have grown to form the basis of what is now known as artificial intelligence (AI). One of the major goals of AI is getting a computer to understand and subsequently communicate in natural languages, a field called natural language processing (NLP). Major tech giants are investing in the space and acquiring AI companies as well. A robot called Kismet, from MIT's Artificial Intelligence Lab, can interact by recognising human body language and tone of voice and responding according to that input.
Among them: the HomePod smart speaker (see "Apple Is Countering Amazon and Google with a Siri-Enabled Speaker") and new ways developers can build artificial intelligence into apps. Picard, an expert in using wearable devices and phone data to measure human emotions, is researching how data pulled from cell phones might help identify and perhaps even predict depression, a problem expected to be the second leading cause of disability in the world by 2020. Picard's lab doesn't use Apple phones for its research today, though she tells Cook she would like to. For their current study of student emotional health, her team can't get certain data from an iPhone that they need.
What are the natural extensions of machine learning (ML) and deep learning, as well as natural language processing (NLP) and affective computing (AC)? To extract insights from unstructured sources of data, NLP uses ML and AI to understand unstructured text and provide context to language. Sentiment thus plays an important role in decision making, and NLP offers the capability to convert human language into a machine-readable form and turn it into actionable insights. Rather than staffing a client or customer service helpline, organizations can deploy chatbots as a cheaper and easier mode of communication.
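As a toy illustration of how NLP can turn raw text into an actionable signal for a chatbot, the sketch below scores incoming messages against a tiny sentiment lexicon and routes clearly negative ones to a human agent. The word lists and the routing threshold are illustrative assumptions, not any specific product's logic:

```python
# Toy lexicon-based sentiment scorer. The lexicons and the routing
# rule are illustrative assumptions, not a production model.
POSITIVE = {"great", "love", "helpful", "fast", "excellent"}
NEGATIVE = {"broken", "slow", "refund", "angry", "terrible"}

def sentiment_score(text: str) -> int:
    """Return (# positive words) - (# negative words) for a message."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def route_message(text: str) -> str:
    """Escalate clearly negative messages to a human agent."""
    return "human_agent" if sentiment_score(text) < 0 else "chatbot"

print(route_message("the app is terrible and slow"))  # -> human_agent
print(route_message("love the new update very fast"))  # -> chatbot
```

Real systems replace the lexicon with a trained model, but the overall pipeline — score the text, then act on the score — is the same.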
Its software development kit (SDK) and cloud-based API allow developers to enrich digital experiences by adding "emotion awareness" to apps, from games to medical devices. To address these requirements, Affectiva has been partnering with robotics, AI, and marketing companies to augment its 30-person staff. Currently, Affectiva's emotion-recognition products can determine 20 facial expressions, seven emotional categories, and age, gender, and ethnicity with over 90% accuracy. As the pace of AI accelerates and social/camera apps like Instagram and Snapchat proliferate, el Kaliouby expects emotional AI to become ubiquitous within the next three years.
He's won an Academy Award for constructing lifelike animated faces for movies like King Kong and Spider-Man 2, work that continued at Sagar's Laboratory for Animate Technologies, where human movement is generated not by recordings of actual human movement but by neural networks. Potential applications of the technology are numerous, but Soul Machines decided to create an AI baby because babies are natural "learning machines," and because the company wants to explore social learning by training AI the same way humans raise children. Soul Machines' avatars can also respond to human emotion, which they perceive through cameras that track facial expressions. In case you needed things to get even more futuristic or sci-fi, Sagar said that in the future he may consider combining affective computing, avatars, and AI designed to mimic the tone, style, and word usage of people both alive and dead.
Healthcare AI expert Peter Borden, managing director at consulting and services firm Sapient Health, helps healthcare organizations apply innovative AI technologies to their ecosystems. In this Q&A with SearchHealthIT, Borden talks about how AI in healthcare helps with clinical trials, with customizing post-discharge instructions based on patients' personal characteristics, and with population health. How will new forms of AI in healthcare affect transitional care when patients leave the hospital for other settings? How could emotional intelligence help AI in healthcare applications?
Recently, I read an article in which Ben Hubl applied the Microsoft Cognitive Services Emotion API to analyze emotions in video of the last debate between Hillary Clinton and Donald Trump. The Emotion API returns percentages indicating what it thinks the emotion is, based on the picture. I used Python to get the results from the Microsoft Cognitive Services Video Emotion API, and then used Python's matplotlib and R's ggplot2 to make visualizations. The remaining emotions, like fear, contempt, and surprise, stayed low during the whole speech.
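Before plotting, the per-face scores returned by the Emotion API can be post-processed in a few lines. The sketch below assumes the API's documented response shape (a dictionary of confidence scores for eight emotions per detected face); the sample values are made up for illustration, not taken from the debate video:

```python
# Post-process Emotion API-style scores. Each detected face comes with
# confidence scores for eight emotions; the sample below is made up.
sample_scores = {
    "anger": 0.01, "contempt": 0.02, "disgust": 0.01, "fear": 0.005,
    "happiness": 0.60, "neutral": 0.30, "sadness": 0.05, "surprise": 0.005,
}

def to_percentages(scores: dict) -> dict:
    """Normalize raw confidences into percentages that sum to 100."""
    total = sum(scores.values())
    return {emotion: 100 * value / total for emotion, value in scores.items()}

def dominant_emotion(scores: dict) -> str:
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

percentages = to_percentages(sample_scores)
print(dominant_emotion(sample_scores))     # happiness
print(round(percentages["happiness"], 1))  # 60.0
```

Applied frame by frame over a video, `dominant_emotion` gives the per-frame label and `to_percentages` gives the series that matplotlib or ggplot2 can chart over time.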