Robots take baby steps toward emotion and empathy at CES

The Japan Times

LAS VEGAS – The robot called Forpheus does more than play a mean game of table tennis. It can read body language to gauge its opponent's ability, and offer advice and encouragement. "It will try to understand your mood and your playing ability and predict a bit about your next shot," said Keith Kersten of Japan-based Omron Automation, which developed Forpheus to showcase its technology. "We don't sell ping pong robots, but we are using Forpheus to show how technology works with people," he said. Forpheus is among several devices shown at this week's Consumer Electronics Show that highlight how robots can become more like humans by acquiring "emotional intelligence" and empathy.

Inspiring Leadership through Emotional Intelligence

Coursera

I have never regretted enrolling in the Inspiring Leadership through Emotional Intelligence course. It has given me new knowledge, new ideas, and a broader perspective on life in general. It helped me understand emotional, social, and cognitive intelligence and their applicability to my personal life, work, and relationships, not to mention dealing with chronic stress as a leader and the need for renewal. Professor Boyatzis is a brilliant teacher.

Why you need emotional intelligence


Emotional intelligence taps into a fundamental element of human behavior that is distinct from your intellect. There is no known connection between IQ and emotional intelligence; you simply can't predict emotional intelligence based on how smart someone is. Intelligence is your ability to learn, and it's the same at age 15 as it is at age 50. Emotional intelligence, on the other hand, is a flexible set of skills that can be acquired and improved with practice. Although some people are naturally more emotionally intelligent than others, you can develop high emotional intelligence even if you aren't born with it.

10 ways Artificial Intelligence can change the way you use smartphones


"Digital Me" sitting on the device: Smartphones will be an extension of the user, capable of recognizing them and predicting their next move. They will understand who you are, what you want, when you want it, and how you want it done, and will execute tasks on your authority.

User authentication: Password-based, simple authentication is becoming too complex and less effective, resulting in weak security, poor user experience, and a high cost of ownership. Security technology combined with machine learning, biometrics, and user-behavior analysis will improve usability and self-service capabilities.

Emotion recognition: Emotion-sensing systems and affective computing allow smartphones to detect, analyze, process, and respond to people's emotional states and moods.

Response to Sloman's Review of Affective Computing

AI Magazine

Sloman was one of the first in the AI community to write about the role of emotion in computing (Sloman and Croucher 1981), and I value his insight into theories of emotional and intelligent systems. Alas, Sloman's review dwells largely on some details related to unknown features of human emotion; hence, I don't think the review captures the flavor of the book. However, he does raise interesting points, as well as potential misunderstandings, both of which I am grateful for the opportunity to comment on. Sloman writes that I "welcome emotion detectors in a wide range of contexts and relationships, for example, teacher and pupil." This might sound innocuous, but its presumption of the existence of emotion detectors is not.

Ubiquitous Computing and Sensing

AI Magazine

Some experts are likely to be fiercely critical because of omissions or errors. Others with tunnel vision are likely to miss the point. Rosalind Picard, with considerable courage, addresses a broad collection of themes, including the nature of motivation, emotions, and feeling; the detection of emotional and other affective states and processes; the nature of intelligence and the relationships between intelligence and emotions; the physiology of the brain and other aspects of human physiology relevant to affective states; requirements for effective human-computer interfaces in a wide range of situations; wearable devices with a range of sensing and communication functions; philosophical and ethical issues relating to computers of the future; and a brief encounter with theology. This is a book with a bold vision. Some readers will find it inspiring and mind-stretching.

Report on the 2013 Affective Computing and Intelligent Interaction Conference (ACII 2013)

AI Magazine

Under the auspices of the Humaine Association (now called the Association for the Advancement of Affective Computing, AAAC), the ACII conference series has become an important international forum for research on affective human-machine interaction and intelligent affective systems. Affect is a phenomenon of substantial importance in most if not all of human activities. This ACII conference therefore strived to emphasize the humanistic side of affective computing by promoting research at the crossroads between engineering and human sciences, including biological, social, and cultural aspects of human life. This has been exemplified by conference topics as varied as computerized psychological emotional modeling; art and cinema studies; gaming; learning; depression, stress, and anxiety management; robots, avatars, and virtual worlds; social media analysis; pattern recognition, classification, and data mining; real-time and embedded affective systems; and others. All have in common affect and emotions, with an emphasis on a computational view of emotion.

Leadership and Emotional Intelligence

Coursera

Organizations are teams of teams. By definition, a manager gets work done not only through their own resources and efforts, but also through others. In other words, you are required to work effectively with people outside your team: individuals and groups both inside and outside the organization. You have to influence people at different levels and functions, build collaborative relationships wherever possible, negotiate wisely, handle difficult conversations, and make decisions in the face of uncertainty and complexity.

AI And The Future Of Customer Care


Robert Weideman, Executive Vice President and General Manager of Nuance Enterprise, notes: "When you think of conversational AI, you need to think of a person. Literally, we're trying to mimic a human agent. Consumers also expect conversations they have to flow from one channel to another – so they don't have to backtrack or repeat themselves." Understanding emotions is key to this, and the technology is not light years away. Futurist Ray Kurzweil, a leading AI scientist, said in an interview with Wired that when a machine understands that kind of complex natural language, it becomes, in effect, conscious, and that he believes this moment will arrive in 2029, when machines will have full "emotional intelligence, being funny, getting the joke, being sexy, being loving, understanding human emotion. That's actually the most complex thing we do."

Affectiva CEO: AI needs emotional intelligence to facilitate human-robot interaction


Affectiva, one in a series of companies to come out of MIT's Media Lab whose work revolves around affective computing, used to be best known for sensing emotion in videos. It recently expanded into emotion detection in audio with the Speech API for companies making robots and AI assistants. Affective computing, the use of machines to understand and respond to human emotion, has many practical uses. In addition to Affectiva, the Media Lab nurtured Koko, a bot that detects words used on chat apps like Kik to recognize people who need emotional support, and Cogito, whose AI is used by the U.S. Department of Veterans Affairs to analyze the voices of military veterans with PTSD to determine if they need immediate help. Then there's Jibo, a home robot that mimics human emotion on its five-inch LED face, which Time magazine recently declared one of the best inventions of 2017.