
Emotional AI Makes Your Car Really Know How You Feel

#artificialintelligence

Imagine if your car could pull itself over when you're drowsy or nauseous, or adjust the temperature and music when gridlock is stressing you out. Maybe it could even refuse to start if it knows you're intoxicated. With advanced driver-assistance systems (ADAS) already in place and autonomous vehicles on the horizon, a lot of work is being done on sensing and machine learning to help vehicles better understand the roads and the world around them. But Boston-based startup Affectiva thinks more needs to be done around the internal world of the car, specifically the emotional state of the driver. Affectiva has built its business model around creating "emotional AI": algorithms capable of recognizing human emotional states.


Aptiv Partners With Affectiva To Enable Next-Gen In-Vehicle Experience

#artificialintelligence

Aptiv has signed a commercial partnership agreement with Affectiva to deliver innovative, scalable software that enhances perception capabilities in advanced safety solutions and reimagines the future of the in-cabin experience. Affectiva is a Boston-based MIT Media Lab spin-off and a leader in Human Perception artificial intelligence (AI). The new software, which aims to enhance the in-vehicle experience, will be built on deep learning architectures, the company noted. Aptiv and Affectiva will work closely to commercialise advanced sensing solutions for OEM and fleet customers, and to further support the partnership, Aptiv has made a minority investment in Affectiva. Affectiva's patented software is the first multi-modal interior sensing solution to unobtrusively identify complex cognitive states of vehicle occupants in real time, Aptiv said.
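
As a rough illustration only, here is a minimal sketch of what a multi-modal interior sensing model can look like. It assumes a hypothetical setup (this is not Affectiva's actual architecture) in which a face-image embedding and an audio-feature embedding are fused to predict an occupant state:

    # Hypothetical sketch: fuse a face-image embedding and an audio embedding
    # to predict an occupant state such as "drowsy" or "distracted".
    import torch
    import torch.nn as nn

    class CabinStateClassifier(nn.Module):
        def __init__(self, n_states=4):
            super().__init__()
            # Small CNN over a 64x64 grayscale face crop.
            self.face_encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Flatten(), nn.Linear(32 * 16 * 16, 64),
            )
            # MLP over a vector of audio features (e.g. 40 averaged MFCCs).
            self.audio_encoder = nn.Sequential(nn.Linear(40, 64), nn.ReLU())
            # Fusion head: concatenate both embeddings, predict a state.
            self.head = nn.Linear(64 + 64, n_states)

        def forward(self, face, audio):
            z = torch.cat([self.face_encoder(face), self.audio_encoder(audio)], dim=1)
            return self.head(z)

    model = CabinStateClassifier()
    logits = model(torch.randn(1, 1, 64, 64), torch.randn(1, 40))
    print(logits.shape)  # torch.Size([1, 4])

A production system would add temporal modeling and occupant tracking, but the fusion-of-modalities idea is the core of "multi-modal" sensing as described above.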



Affectiva raises $14 million to bring apps, robots emotional intelligence

#artificialintelligence

Affectiva, a startup developing "emotion recognition technology" that can read people's moods from facial expressions captured in digital videos, raised $14 million in a Series D round of funding led by Fenox Venture Capital. According to co-founder Rana el Kaliouby, the Waltham, Mass.-based company wants its technology to become the de facto means of adding emotional intelligence and empathy to any interactive product, and the best way for organizations to obtain unvarnished insights about customers, patients or constituents. She explained that Affectiva uses computer vision and deep learning to analyze facial expressions and other non-verbal cues in visual content online, but not the language or conversations in a video. The company's technology ingests digital images--including video in chat applications, live-streamed or recorded videos, or even GIFs--typically through simple webcams. Its system first categorizes facial expressions and then maps them to a number of emotional states, such as happy, sad, nervous, interested or surprised.
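
The pipeline described above (ingest webcam frames, find faces, map each expression to an emotional state) can be sketched roughly as follows. This is a hypothetical illustration using OpenCV's stock face detector and a placeholder classifier, not Affectiva's software:

    # Hypothetical sketch of an emotion-recognition pipeline (not Affectiva's
    # code): grab frames from a webcam, find faces, map each face crop to a label.
    import cv2
    import numpy as np

    EMOTIONS = ["happy", "sad", "nervous", "interested", "surprised"]

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def classify_emotion(face_crop: np.ndarray) -> str:
        """Placeholder: a real system would run a trained deep network here."""
        scores = np.random.rand(len(EMOTIONS))   # stand-in for model outputs
        return EMOTIONS[int(np.argmax(scores))]

    cap = cv2.VideoCapture(0)                    # default webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Step 1: find regions of the frame that contain a face.
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
            # Step 2: map the facial expression to an emotional state.
            label = classify_emotion(gray[y:y + h, x:x + w])
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("emotion", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()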


Deep learning tools help users dig into advanced analytics data

#artificialintelligence

At Twitter Inc., Hugo Larochelle's job is to develop an understanding of how users of the social network are connected to each other and what interests them, in order to categorize and promote content that includes tweets, images and videos. To help accomplish that, he and his fellow data analysts use an emerging technology: deep learning tools. As Larochelle, a research scientist at Twitter, explained during a presentation at the Deep Learning Summit in Boston this month, deep learning is a category of machine learning that seeks to tackle complex problems, such as interpreting images or natural language text. He and other proponents say deep learning techniques -- which lean heavily on the use of neural networks -- are more useful than traditional machine learning when data analytics applications involve unstructured data or require subjective interpretations. And deep learning is quickly becoming a hot field in the realm of advanced data analytics.
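
To make the contrast with traditional machine learning concrete, here is a minimal, hypothetical sketch of the deep learning approach to unstructured data: a small neural network that learns its own representation of raw tweet text and classifies it into content categories, rather than relying on hand-engineered features (the names and dimensions below are illustrative, not Twitter's system):

    # Hypothetical sketch: a tiny neural text classifier for tweet topics.
    import torch
    import torch.nn as nn

    class TweetTopicClassifier(nn.Module):
        def __init__(self, vocab_size=10000, embed_dim=64, n_topics=5):
            super().__init__()
            # EmbeddingBag averages learned word embeddings over each tweet,
            # so the representation is learned rather than hand-crafted.
            self.embed = nn.EmbeddingBag(vocab_size, embed_dim)
            self.classifier = nn.Sequential(
                nn.Linear(embed_dim, 128), nn.ReLU(), nn.Linear(128, n_topics))

        def forward(self, token_ids, offsets):
            return self.classifier(self.embed(token_ids, offsets))

    model = TweetTopicClassifier()
    # Two tweets, given as token ids flattened into one tensor plus offsets.
    tokens = torch.tensor([5, 42, 7, 99, 3, 18])
    offsets = torch.tensor([0, 4])        # tweet 1 = first 4 ids, tweet 2 = rest
    print(model(tokens, offsets).shape)   # torch.Size([2, 5])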