There's a new AI that can guess how you feel just by watching you walk
Is it possible to tell how someone is feeling from their gait alone? That's what scientists at the University of North Carolina at Chapel Hill and the University of Maryland at College Park have taught a computer to do. Using deep learning, their software analyzes a video of someone walking, converts it into a 3D model, and extracts the gait. A neural network then identifies the dominant motion and matches it to a particular feeling, based on the data it was trained on. According to their research paper, published in June on arXiv, the model can identify four emotions (happy, sad, angry, and neutral) with about 80% accuracy.
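To make the pipeline concrete, here is a minimal sketch of the classification step: aggregate motion features are computed from a sequence of 3D joint positions and fed to a softmax classifier over the four emotion labels. This is illustrative only, not the authors' model; the feature choices, joint count, and the random stand-in weights are all assumptions.

```python
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def gait_features(pose_seq):
    # pose_seq: (T, J, 3) array of 3D joint positions over T frames.
    # Illustrative aggregate features: per-joint mean and std of speed.
    vel = np.diff(pose_seq, axis=0)           # frame-to-frame joint motion
    speed = np.linalg.norm(vel, axis=-1)      # (T-1, J) joint speeds
    return np.concatenate([speed.mean(axis=0), speed.std(axis=0)])

def classify(features, W, b):
    # Linear layer + softmax over the four emotion classes.
    logits = W @ features + b
    e = np.exp(logits - logits.max())
    probs = e / e.sum()
    return EMOTIONS[int(np.argmax(probs))], probs

rng = np.random.default_rng(0)
pose_seq = rng.normal(size=(48, 16, 3))       # 48 frames, 16 joints (hypothetical)
feats = gait_features(pose_seq)               # 32-dim feature vector
W = rng.normal(size=(4, feats.size)) * 0.1    # stand-in for trained weights
b = np.zeros(4)
label, probs = classify(feats, W, b)
print(label, probs.round(3))
```

In the actual system a deep network learns the gait representation end to end rather than using hand-picked statistics, but the overall shape, pose sequence in, probability over emotion labels out, is the same.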
Jul-22-2019, 06:05:14 GMT
- Country:
- North America > United States
- Maryland (0.27)
- North Carolina (0.30)
- Genre:
- Research Report (0.60)
- Industry:
- Education > Educational Setting > Higher Education (0.66)