Emotion


Why artificial intelligence is learning emotional intelligence

#artificialintelligence

I started transforming businesses with technology 35 years ago. It was as true then as it is now that the biggest risk to mitigate is the resistance of people and organizations to change. It is often cited that three in four transformation programmes fail to achieve their intended goals because people are not prepared to adopt new processes and technology. Mitigating these risks and helping people learn new technology-enabled processes has been good for the consulting industry, and it continues to be one of the keys to successful programmes. With artificial intelligence (AI), change management and process reengineering are being reinvented. What was once a one-way street has become a two-way street: we can now teach technology to relate to people, as much as we train people to use technology. Going forward, getting this human-centric design right will be the biggest factor in the success or failure of AI-driven transformations. The hardest part of a digital transformation is us, the 'analog' humans. In traditional technology-led transformations, the goal of change management has been to teach people to use technology, dedicating leadership and resources to ensure compliance and adoption. The direction is one way: people learn how technology works in order to give it commands or interpret results.


A Simulated Emotional Expression Robot (SEER)

#artificialintelligence

"SEER" is a compact humanoid robot developed as results of deep research on gaze and human facial expression. The robot is able to focus the gaze directions on a certain point, without being fooled by the movement of the neck. As a result, the robot seems as if it has its own intentions in following and paying attention to its surrounding people and environment. Using a camera censor, whilst tracking eyes it has interactive gaze. In addition, by drawing the curve of the eyebrow using soft elastic wire, I were able to enrich the expression of the robot as if it lives with emotions.


Emotional Intelligence - Our Technology Needs It With Care.

#artificialintelligence

Many years ago, when I had a hairstyle that needed a comb, my parents ran a manufacturing plant with a wholesale front end integrated into the building. It followed the usual arc of three or so years of hit-and-miss income (i.e. losses to break-even), which was subsidised by my father still holding down his job at an engineering plant; he'd finish his work at 5pm and go into the business until late at night. The older generation had a work ethic that defies my definition of hard work: they must have worked about 70 hours a week as a norm, with 90 hours seeming more likely. This isn't an exaggeration; I can remember as a boy that they'd be out there seven days a week to get the business going. They left home at 7am and came home as late as 11pm sometimes, so the maths aren't unrealistic.


Goats like it best when you smile, new research shows

Mashable

You should be happy when you meet a hoofed friend. Especially if that friend is a goat. Because, it turns out, goats can read human moods and are more drawn to people who look happy. According to a new study recently published in the journal Royal Society Open Science, goats prefer "positive human emotional facial expressions." That's smiling faces, to you and me.


Deep Emotion: A Computational Model of Emotion Using Deep Neural Networks

arXiv.org Artificial Intelligence

Emotions are central to human intelligence. They are closely tied to the appraisal of internal bodily states and external stimuli, which helps us respond quickly to the environment, and they play an important role in decision-making. The social aspect of emotions is equally significant. If the mechanism of emotions were elucidated, we could therefore advance toward an essential understanding of our natural intelligence. In this study, a computational model of emotions is proposed to elucidate that mechanism. From the viewpoint of partner robots, such a model may also help us build robots that can have empathy for humans: to understand and sympathize with people's feelings, robots need emotions of their own, which may allow them to be accepted in human society. The proposed model is implemented using deep neural networks consisting of three modules that interact with each other. Simulation results reveal that the proposed model exhibits reasonable behavior as a basic mechanism of emotion.
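The summary says only that the model comprises three interacting deep-network modules, without specifying how they are wired. As a purely illustrative sketch (the module roles of appraisal, bodily state, and decision, the interaction scheme, and all dimensions below are assumptions, not the paper's actual design), one way to make three PyTorch modules interact at every time step:

```python
# Hypothetical sketch of three interacting neural modules, loosely inspired
# by the paper's description. The roles and dimensions are assumptions.
import torch
import torch.nn as nn

class EmotionModel(nn.Module):
    def __init__(self, stim_dim=16, body_dim=8, hidden=32, n_actions=4):
        super().__init__()
        self.hidden = hidden
        # Appraisal module: evaluates external stimuli.
        self.appraisal = nn.GRUCell(stim_dim + hidden, hidden)
        # Interoception module: tracks the internal bodily state.
        self.body = nn.GRUCell(body_dim + hidden, hidden)
        # Decision module: combines both states to select a response.
        self.decision = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions),
        )

    def forward(self, stimuli, body_signals):
        # stimuli: (T, B, stim_dim); body_signals: (T, B, body_dim)
        T, B = stimuli.shape[0], stimuli.shape[1]
        h_a = stimuli.new_zeros(B, self.hidden)
        h_b = stimuli.new_zeros(B, self.hidden)
        outputs = []
        for t in range(T):
            # Each module receives the other's previous hidden state,
            # so the two recurrent modules interact at every step.
            h_a = self.appraisal(torch.cat([stimuli[t], h_b], dim=-1), h_a)
            h_b = self.body(torch.cat([body_signals[t], h_a], dim=-1), h_b)
            outputs.append(self.decision(torch.cat([h_a, h_b], dim=-1)))
        return torch.stack(outputs)  # (T, B, n_actions)

model = EmotionModel()
logits = model(torch.randn(5, 2, 16), torch.randn(5, 2, 8))
print(logits.shape)  # torch.Size([5, 2, 4])
```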


Emotion recognition using wireless signals

Communications of the ACM

This paper demonstrates a new technology that can infer a person's emotions from RF signals reflected off his body. EQ-Radio transmits an RF signal and analyzes its reflections off a person's body to recognize his emotional state (happy, sad, etc.). The key enabler underlying EQ-Radio is a new algorithm for extracting individual heartbeats from the wireless signal at an accuracy comparable to on-body ECG monitors. The resulting beats are then used to compute emotion-dependent features, which feed a machine-learning emotion classifier. We describe the design and implementation of EQ-Radio, and demonstrate through a user study that its emotion recognition accuracy is on par with state-of-the-art emotion recognition systems that require the person to be hooked to an ECG monitor. Emotion recognition is an emerging field that has attracted much interest from both industry and the research community.13 If we can build machines that sense our emotions, they would enable smart homes that react to our moods and adjust the lighting or music accordingly. Moviemakers would have better tools to evaluate user experience. Advertisers would learn customer reactions immediately. Computers would automatically detect symptoms of depression, anxiety, and bipolar disorder, allowing early response to such conditions. More broadly, machines would no longer be limited to explicit commands, and could interact with people in a manner closer to how we interact with each other. Existing approaches for inferring a person's emotions either rely on audiovisual cues, such as images and audio clips,22,42,48 or require the person to wear physiological sensors like an electrocardiogram (ECG) monitor.7
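The pipeline described (segment heartbeats from the reflected signal, compute emotion-dependent features, feed a classifier) can be sketched end to end. Note that EQ-Radio's actual beat-extraction algorithm is far more sophisticated than the simple peak picking below, which is only a stand-in; the sampling rate, feature set, and classifier choice are likewise assumptions for illustration:

```python
# Illustrative sketch of the beats -> features -> classifier pipeline.
# Everything here is a simplified stand-in for EQ-Radio's real algorithms.
import numpy as np
from scipy.signal import find_peaks
from sklearn.svm import SVC

FS = 100  # assumed sampling rate of the RF phase signal, in Hz

def ibi_features(rf_trace, fs=FS):
    """Crude beat segmentation plus emotion-dependent IBI features."""
    # Stand-in for heartbeat extraction: pick peaks at physiologically
    # plausible spacing (>= 0.4 s apart, i.e. <= 150 bpm).
    peaks, _ = find_peaks(rf_trace, distance=int(0.4 * fs))
    ibis = np.diff(peaks) / fs  # inter-beat intervals in seconds
    # Classic heart-rate-variability features (assumed feature set).
    sdnn = np.std(ibis)
    rmssd = np.sqrt(np.mean(np.diff(ibis) ** 2))
    return np.array([ibis.mean(), sdnn, rmssd])

# Toy training data: synthetic "heartbeat" traces with labels such as
# 0 = happy, 1 = sad; the real system uses a much richer feature vector.
rng = np.random.default_rng(0)
X = np.array([ibi_features(np.sin(2 * np.pi * f * np.arange(0, 30, 1 / FS))
                           + 0.1 * rng.standard_normal(30 * FS))
              for f in (1.0, 1.0, 1.5, 1.5)])
y = np.array([0, 0, 1, 1])
clf = SVC().fit(X, y)
print(clf.predict(X))
```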


AI systems dealing with human emotions: what the future will be like with emotional machines

#artificialintelligence

When I was a teenager, I used to enjoy watching a weekly television programme called "Lost in Space". In this science fiction series, a robot with a male identity was one of the main characters. He spoke perfect English, but his speech sounded monotonic and dull, devoid of inflexions and of variations in volume and tone. This depiction of robots as sounding devoid of human emotion was not uncommon at that time; perhaps film producers felt the need to reinforce the differences between humans and machines. However, in the more recent film Her (released in 2013), the main character, Theodore, uses an AI operating system that speaks in natural, conversational language.


AI is cool, but EI is still critical for top-performing HR service centers. - Neocase

#artificialintelligence

In an October 2017 interview with Knowledge@Wharton, Apoorv Saxena, lead product manager at Google and co-founder of the AI Frontiers conference, said, "now computers are able to transcribe human speech better than humans." Did you read that right? Did it say that a computer can understand a human better than a human can? With all the hype around AI, it's easy to forget about another type of intelligence that will continue to be a critical component of service, including HR Service Delivery. That other form of intelligence is Emotional Intelligence (EI).


Emotion Recognition and Sentiment Analysis Market to Reach $3.8 Billion by 2025

#artificialintelligence

Significant advances have been made during the past few years in the ability of artificial intelligence (AI) systems to recognize and analyze human emotion and sentiment, owing in large part to accelerated access to data (primarily social media feeds and digital video), cheaper compute power, and evolving deep learning capabilities combined with natural language processing (NLP) and computer vision. According to a new report from Tractica, these trends are beginning to drive growth in the market for sentiment and emotion analysis software. Tractica forecasts that worldwide revenue from sentiment and emotion analysis software will increase from $123 million in 2017 to $3.8 billion by 2025. The market intelligence firm anticipates that this growth will be driven by several key industries, including retail, advertising, business services, healthcare, and gaming, and its analysis identifies the top use case categories for sentiment and emotion analysis. "A better understanding of human emotion will help AI technology create more empathetic customer and healthcare experiences, drive our cars, enhance teaching methods, and figure out ways to build better products that meet our needs," says principal analyst Mark Beccue.
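For context, growing from $123 million in 2017 to $3.8 billion by 2025 implies a compound annual growth rate of roughly 53% per year. A quick check of the arithmetic:

```python
# Implied CAGR of the Tractica forecast above:
# $123M in 2017 growing to $3.8B by 2025 (eight years).
start, end, years = 123e6, 3.8e9, 2025 - 2017
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 53.5% per year
```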