Identifying perceived emotions from people's walking style

#artificialintelligence

A team of researchers at the University of North Carolina at Chapel Hill and the University of Maryland at College Park has recently developed a new deep learning model that can identify people's emotions based on their walking styles. Their approach, outlined in a paper pre-published on arXiv, works by extracting an individual's gait from an RGB video of them walking, then analyzing it and classifying it as one of four emotions: happy, sad, angry, or neutral. "Emotions play a significant role in our lives, defining our experiences, and shaping how we view the world and interact with other humans," Tanmay Randhavane, one of the primary researchers and a graduate student at UNC, told TechXplore. "Perceiving the emotions of other people helps us understand their behavior and decide our actions toward them. For example, people communicate very differently with someone they perceive to be angry and hostile than they do with someone they perceive to be calm and contented."
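
The articles in this digest describe the pipeline only at a high level, so here is a minimal sketch of how such a system could be wired together: a pose estimator turns RGB frames into per-frame joint positions, hand-crafted posture and movement cues are computed from the joint sequence, and a classifier maps those cues to one of the four labels. Every name here (the pose_estimator callable, the joint indices, the feature choices, and the nearest-centroid stand-in for the paper's learned classifier) is an illustrative assumption, not the authors' actual code.

```python
# Sketch of a gait-to-emotion pipeline (illustrative, not the paper's code).
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]
HEAD, NECK, L_FOOT, R_FOOT = 0, 1, 2, 3  # hypothetical joint indices

def extract_gait(frames, pose_estimator):
    """Stack per-frame (J, 3) joint estimates into a (T, J, 3) gait sequence.

    `pose_estimator` is assumed to map one RGB frame to a (J, 3) array
    of 3D joint coordinates.
    """
    return np.stack([pose_estimator(f) for f in frames])

def gait_features(seq):
    """Compute simple posture/movement cues from a (T, J, 3) sequence."""
    # Maximum distance between feet across the clip, a proxy for stride length.
    stride = np.linalg.norm(seq[:, L_FOOT] - seq[:, R_FOOT], axis=-1).max()
    # Mean vertical drop of the head relative to the neck (slumped posture).
    head_drop = (seq[:, NECK, 1] - seq[:, HEAD, 1]).mean()
    # Mean frame-to-frame displacement of the neck, a proxy for walking speed.
    speed = np.linalg.norm(np.diff(seq[:, NECK], axis=0), axis=-1).mean()
    return np.array([stride, head_drop, speed])

def classify(features, centroids):
    """Nearest-centroid stand-in for the paper's learned classifier.

    `centroids` is a (4, 3) array of per-emotion feature centroids.
    """
    dists = np.linalg.norm(centroids - features, axis=1)
    return EMOTIONS[int(np.argmin(dists))]
```

In the actual work, the hand-crafted cues are combined with features learned by a deep network; the nearest-centroid step above is only a placeholder for that learned classifier.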


AI classifies people's emotions from the way they walk

#artificialintelligence

The way you walk says a lot about how you're feeling at any given moment. When you're downtrodden or depressed, for example, you're more likely to slump your shoulders than when you're contented or upset. Leveraging this somatic lexicon, researchers at the University of North Carolina at Chapel Hill and the University of Maryland recently investigated a machine learning method that can identify a person's perceived emotion, valence (e.g., negative or positive), and arousal (calm or energetic) from their gait alone. The researchers claim this approach -- which they believe is the first of its kind -- achieved 80.07% accuracy in preliminary experiments. "Emotions play a large role in our lives, defining our experiences and shaping how we view the world and interact with other humans," wrote the coauthors.
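
As a small illustration of the valence and arousal outputs this article mentions, the sketch below maps the four discrete labels onto the standard valence-arousal circumplex. The specific coordinates are an assumption for illustration, not values taken from the paper.

```python
# Hypothetical mapping from discrete labels to the valence/arousal
# dimensions described in the article (coordinates are illustrative).
VALENCE_AROUSAL = {
    "happy":   (+1, +1),  # positive valence, energetic
    "angry":   (-1, +1),  # negative valence, energetic
    "sad":     (-1, -1),  # negative valence, calm/low energy
    "neutral": ( 0,  0),  # midpoint of both axes
}

def perceived_affect(emotion_label):
    """Return the (valence, arousal) pair for a predicted emotion label."""
    valence, arousal = VALENCE_AROUSAL[emotion_label]
    return {"valence": valence, "arousal": arousal}
```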


An AI has learned to predict people's moods from the way they walk – Fanatical Futurist by International Keynote Speaker Matthew Griffin

#artificialintelligence

Connect, download a free E-Book, watch a keynote, or browse my blog. Recently I discussed how Artificial Intelligence (AI) is helping analyse and diagnose everything from cancer and depression to dementia and PTSD, among many other things, including a person's personality and their criminal intent, using nothing more than a clever app. Now, in a next step (excuse the pun), a team of researchers has figured out how to categorise people's emotions using AI from the way they walk. And the tech could be used to gauge everything from the mood of shoppers to the emotional state and mental health of an entire population. It's also not the only tech that can do this – you might be surprised to learn that AI can also turn your home Wi-Fi router into a radar spy that can analyse the state of your emotions and your health.


Modeling the Experience of Emotion

arXiv.org Artificial Intelligence

Affective computing has proven to be a viable field of research, comprising a large number of multidisciplinary researchers whose work is widely published. The majority of this work consists of computational models of emotion recognition, computational modeling of the causal factors of emotion, and emotion expression through rendered and robotic faces. A smaller part is concerned with modeling the effects of emotion, formal modeling of cognitive appraisal theory, and models of emergent emotions. Part of the motivation for affective computing as a field is to better understand emotional processes through computational modeling. One of the four major topics in affective computing is computers that have emotions (the others are recognizing, expressing, and understanding emotions). A critical and neglected aspect of having emotions is the experience of emotion (Barrett, Mesquita, Ochsner, and Gross, 2007): what does the content of an emotional episode look like, how does this content change over time, and when do we call the episode emotional? Few modeling efforts have these topics as their primary focus. The launch of a journal on synthetic emotions should motivate research initiatives in this direction, and this research should have a measurable impact on emotion research in psychology. I show that a good way to do so is to investigate the psychological core of what an emotion is: an experience. I present ideas on how the experience of emotion could be modeled and provide evidence that several computational models of emotion are already addressing the issue.


How Long Until a Robot Cries? - Issue 60: Searches

Nautilus

When Angelica Lim bakes macaroons, she has her own kitchen helper, Naoki. Her assistant is only good at the repetitive tasks, like sifting flour, but he makes the job more fun. Naoki is very cute, just under two feet tall. He's white, mostly, with blue highlights, and has speakers where his ears should be. The little round circle of a mouth that gives him a surprised expression is actually a camera, and his eyes are infrared receivers and transmitters. "I just love robots," said Lim in 2013, at the time a Ph.D. student in the Department of Intelligent Science and Technology at Kyoto University in Japan.