
Hume AI is Teaching AI to Understand and Empathize with Human Emotions


On this podcast, Jason Stoughton is joined by Alan Cowen, CEO and Chief Scientist at Hume AI. As AI progresses, both in what it can do and how widely it is deployed, the ability to understand and empathize with our emotions remains a glaring hole in its capabilities. Jason and Alan talk about the state of the technology, unpack the hopes, dreams, and fears surrounding an AI that understands, and can potentially manipulate, our emotions, and discuss how Hume is not only advancing AI's capabilities in this area but also leading the way in ensuring that AI serves human well-being above all else.

Can AI Be Emotionally Intelligent?


Artificial intelligence (AI) and machine learning have transformed speech and language recognition technology. A new study published in IEEE Transactions on Affective Computing by researchers affiliated with the Japan Advanced Institute of Science and Technology (JAIST) and Osaka University demonstrates human-like, sentiment-sensing machine learning using physiological data. Emotional intelligence, or emotional quotient (EQ), refers to a person's ability to understand and manage emotions in order to build relationships, resolve conflicts, manage stress, and handle other such activities. Practitioners of applied AI and machine learning are striving to integrate more human-like traits, such as EQ, into conversational AI chatbots, virtual assistants, and similar systems for customer service, sales, and other functions. According to Allied Market Research, the worldwide conversational AI market is projected to reach $32.6 billion by 2030, with a compound annual growth rate of 20 percent during 2021-2030.

The Perils Of AI Emotion Recognition - AI Summary


New AI tools purport to be able to identify human emotion in images and speech patterns. How it works: Emotion recognition software is meant to do just that -- apply decades-old psychological research about how humans express emotions to recognize those expressions in images, video, or even speech. A multidisciplinary team led by University of Cambridge professor Alexa Hagerty recently produced the Emojify Project, which allows users on the web to try out emotion recognition tech for themselves. What they're saying: In a piece published earlier this week in Nature, AI ethicist Kate Crawford argued the technology should be regulated because it can draw "faulty assumptions about internal states and capabilities from external appearances, with the aim of extracting more about a person than they choose to reveal." Last week my Axios colleague Ina Fried broke a story about a digital civil rights group asking Spotify to abandon a technology it has patented to detect emotion, gender and age using speech recognition.

Companies are using AI to monitor your mood during sales calls. Zoom might be next.


Virtual sales meetings have made it tougher than ever for salespeople to read the room. So, some well-funded tech providers are stepping in with a bold sales pitch of their own: that AI can not only help sellers communicate better, but also detect the "emotional state" of a deal -- and of the people they're selling to. In fact, while AI researchers have attempted for decades to instill human emotion into otherwise cold and calculating robotic machines, sales and customer service software companies including Uniphore and Sybill are building products that use AI in an attempt to help humans understand and respond to human emotion. Virtual meeting powerhouse Zoom also plans to provide similar features in the future. "It's very hard to build rapport in a relationship in that type of environment," said Tim Harris, director of Product Marketing at Uniphore, regarding virtual meetings.

Emotion Recognition From Speech


Understanding emotions from the voice is a natural instinct for human beings, but automating emotion recognition from speech without relying on any language or linguistic information remains an uphill grind. In the research work presented here, based on the input speech, I am trying to predict one of six types of emotion (sad, neutral, happy, fear, angry, disgust). The diagram given below explains how emotion recognition from speech works. Audio features are extracted from the input speech, and those features are then passed to the emotion recognition model, which predicts one of the six emotions for the given input. Most of the smart devices, voice assistants, and robots in the world today are not smart enough to understand emotions.
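The two-stage pipeline described above (feature extraction followed by classification) can be sketched in a few lines of Python. This is a minimal illustration, not the author's actual system: real speech emotion recognition typically uses richer features such as MFCCs and a trained neural network, whereas here the features are simple frame energy and zero-crossing-rate statistics, and the classifier weights are random placeholders standing in for a trained model.

```python
import numpy as np

EMOTIONS = ["sad", "neutral", "happy", "fear", "angry", "disgust"]

def extract_features(signal, frame_len=400, hop=160):
    """Slice the waveform into frames and summarize two simple
    per-frame features: short-time energy and zero-crossing rate."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len, hop)]
    energy = [float(np.mean(f ** 2)) for f in frames]
    zcr = [float(np.mean(np.abs(np.diff(np.sign(f))) > 0)) for f in frames]
    # A fixed-length feature vector: mean and spread of each feature.
    return np.array([np.mean(energy), np.std(energy),
                     np.mean(zcr), np.std(zcr)])

def predict_emotion(features, weights, bias):
    """Linear classifier: score each emotion, return the argmax label."""
    scores = weights @ features + bias
    return EMOTIONS[int(np.argmax(scores))]

# Synthetic one-second "utterance": a 220 Hz tone plus noise (16 kHz).
rng = np.random.default_rng(0)
t = np.arange(16000) / 16000.0
signal = np.sin(2 * np.pi * 220 * t) + 0.05 * rng.standard_normal(16000)

feats = extract_features(signal)
W = rng.standard_normal((len(EMOTIONS), feats.size))  # placeholder weights
b = np.zeros(len(EMOTIONS))
label = predict_emotion(feats, W, b)
```

In a real system, `extract_features` would be replaced by a proper acoustic front end and `W`, `b` by parameters learned from a labeled emotional-speech corpus; the structure of the pipeline, however, is exactly as the paragraph describes.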

Affective Computing: AI with Fear, Thought as the Means


It is not that the fear circuit in the brain is unknown, but what has to arrive there to determine fear? If something is about to be lost, broken, or damaged, or there is some imminent danger and fear floods in, how is the external world able to influence the brain? The senses take in external stimuli, which are integrated into a uniform unit that goes to the memory. It is from the memory that the unit goes to the destination for the feeling effect. It is when this unit reaches a spot in that destination that fear happens.

Artificial emotional intelligence could change senior users' perceptions of social robots


Socially assistive robots (SARs) are a class of robotic systems specifically designed to help vulnerable or older users complete everyday activities. In addition to increasing their independence, these robots could stimulate users mentally and offer basic emotional support. To support users most effectively, however, these robots should be able to engage in meaningful social interactions, identifying users' emotions and responding appropriately to them. This could ultimately increase users' trust in the robots, while also promoting their emotional wellbeing. Researchers at the University of Denver, DreamFace Technologies, and the University of Colorado have recently carried out a small pilot study exploring how older adults' perceptions of socially assistive robots change depending on whether the robots have artificial emotional intelligence or not.

Your essential guide to improving emotional intelligence at work


Have you ever wondered what kinds of habits and behaviors might benefit your work life? Is there a quality common among inspiring and helpful team members and leaders? Emotional intelligence is a way of keeping yourself in check and cultivating mindfulness that you can apply to your work environment. The following page will teach you how to develop emotional intelligence and explain how it can help you in everyday situations. Originally coined in 1990 by psychology professors John D. Mayer and Peter Salovey, the term "emotional intelligence" has entered the public vocabulary in recent years.

Good Old-Fashioned Artificial Intelligence and other weird and wonderful AI trivia


"The first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control," a line from mathematician I.J. Good famously quoted by Oxford philosopher Nick Bostrom. His book, Superintelligence, is a crystal ball on AI's timeline and the future of humanity. Inarguably, artificial intelligence has become an integral part of our lives. Here, we look at the AI breakthroughs that precipitated this paradigm shift. In 1956, John McCarthy, one of the founding fathers of AI, coined the term "artificial intelligence" during the Dartmouth workshop.

AI-based artistic representation of emotions from EEG signals: a discussion on fairness, inclusion, and aesthetics

While Artificial Intelligence (AI) technologies are being progressively developed, artists and researchers are investigating their role in artistic practices. In this work, we present an AI-based Brain-Computer Interface (BCI) in which humans and machines interact to express feelings artistically. This system and its production of images give opportunities to reflect on the complexities and range of human emotions and their expressions. In this discussion, we seek to understand the dynamics of this interaction to reach better co-existence in fairness, inclusion, and aesthetics.