The Surprising Things Algorithms Can Glean About You From Photos


Algorithms can also identify the emotion you're feeling in a photo or video. Decades ago, Paul Ekman, professor emeritus at the University of California–San Francisco, observed that people around the world made distinct facial expressions, some of them lasting less than a second, in response to specific emotionally charged situations. More recently, Ekman served as an adviser to a San Diego company called Emotient, acquired by Apple in 2016, which developed software to identify emotional sentiments from camera feeds in real time. With a single high-resolution camera, Emotient's algorithms can simultaneously "read" the emotional microexpressions on the faces of 400 people gathered in an area, say, a lecture hall or shopping mall. Emotient is working on adapting its algorithms for use in hospitals to detect pain on patients' faces.

Ken Denman, Emotient: our micro-expressions reveal what you're really thinking


"Since the beginning of time we've failed to understand how people feel," Ken Denman, president and CEO of engagement and emotion analysis firm Emotient, told the audience at WIRED Retail. "The reality is, we've just been guessing." WIRED Retail is our annual exploration of the ever-changing world of commerce, featuring leading technologists, entrepreneurs and creatives innovating in sectors as diverse as robotics, virtual reality and the future of home delivery.

How Apple stomped on Intel's plans to make RealSense emotionally smart


Intel has grand plans for computers that will recognize human emotion using its RealSense 3D camera, but Apple appears to have dealt it a setback. RealSense uses a combination of infrared, laser and optical cameras to measure depth and track motion. It's been used on a drone that can navigate its own way through a forest, for example. It can also detect changes in facial expressions, and Intel wanted to give RealSense the ability to read human emotions by combining it with an emotion recognition technology developed by Emotient. Emotient's plug-in allowed RealSense to detect whether people are happy or sad by analyzing movement in their lips, eyes and cheeks.

Google Glass could read your emotions - and send them to advertisers


A new app for Google Glass claims to be able to tell you exactly how someone is feeling. Called Emotient, it can tell whether a person is happy, sad, angry or confused, and can monitor an entire room of people at once. The firm says the technology could even be used by advertisers to gauge reactions to their products. The Emotient software processes facial expressions and provides an aggregate emotional read-out, measuring overall sentiment (positive, negative or neutral); primary emotions (joy, surprise, sadness, fear, disgust, contempt and anger); and advanced emotions (frustration and confusion). It detects and processes the anonymous facial expressions of individuals and groups in the Glass wearer's view to produce this aggregate sentiment read-out; it does not store video or images.
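The kind of aggregate read-out described above can be sketched in a few lines: per-face scores over the primary emotions are pooled into a single group-level sentiment. All names, scores and thresholds below are illustrative assumptions, not Emotient's actual API or method.

```python
from statistics import mean

# The seven primary emotions named in the article, plus a rough split
# into positive and negative valence (an assumption for this sketch).
PRIMARY = ("joy", "surprise", "sadness", "fear", "disgust", "contempt", "anger")
POSITIVE = {"joy", "surprise"}

def aggregate_sentiment(faces):
    """faces: list of dicts mapping each primary emotion to a 0-1 score.

    Returns the overall group sentiment: positive, negative or neutral.
    """
    valences = []
    for scores in faces:
        # Take each face's dominant primary emotion and assign a valence.
        top = max(PRIMARY, key=lambda e: scores.get(e, 0.0))
        valences.append(1.0 if top in POSITIVE else -1.0)
    avg = mean(valences)
    if avg > 0.2:
        return "positive"
    if avg < -0.2:
        return "negative"
    return "neutral"

# A toy "room" of three faces with per-emotion scores.
room = [
    {"joy": 0.8, "anger": 0.1},
    {"sadness": 0.6, "joy": 0.2},
    {"joy": 0.7, "surprise": 0.5},
]
print(aggregate_sentiment(room))  # two of three faces read positive
```

Averaging dominant-emotion valences is only one plausible pooling rule; a real system would likely weight by detection confidence and track expressions over time rather than per frame.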

Google Creates Fail-Safe for Stopping Dangerous A.I.


Companies have started to explore how artificial intelligence (A.I.) and robotics could be useful to their customers. Facebook deployed an algorithm that can sort through thousands of posts per second in order to deliver the best content for its users, whereas Apple acquired an artificial intelligence startup named Emotient back in January to potentially use Emotient's technology to support facial recognition features on a future version of the iPhone. However, what would happen if these helpful programs went rogue? Google DeepMind, a subsidiary of the tech giant specializing in A.I. research, in conjunction with the Future of Humanity Institute published a study explaining that it would take more than just unplugging a computer to stop a malfunctioning program. Basically, researchers designing these algorithms would need to install something called an "interruption policy, " according to Popular Science.