At NVIDIA's recent GPU Technology Conference, we sat down with Modar (JR) Alaoui, CEO of Eyeris Technologies, to discuss giving automobiles the ability to interpret human emotions. Eyeris is bringing to market a feature absent from the automobile's 130-year history: the ability of the car to monitor and register the driver's emotional state. The product reads emotions from the driver's face and also performs age identification, gender identification, eye tracking, and gaze estimation for both drivers and passengers. "Our goal is to put our software in the back of every camera in the world," said Alaoui.
Did you know that passengers invariably show a fear reaction when a car's brakes are applied? That is just one of the things facial-monitoring company Eyeris learned while developing its Emovu Driver Monitoring System (DMS). Using a combination of cameras, graphics processing, and deep learning, Emovu analyzes a car's occupants, determining from facial movement which of seven emotions they are feeling. Modar (JR) Alaoui, CEO of Eyeris, demonstrated the company's in-car technology during NVIDIA's GTC developer conference, offering a few ideas of how monitoring drivers' emotions can lead to safer driving. The company used deep learning to train its Emovu software to recognize facial expressions.
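Eyeris has not published Emovu's internals, but the final stage of a seven-emotion pipeline like the one described can be sketched in a few lines. Everything below is an assumption for illustration: the label set (Ekman's six basic emotions plus neutral, a common choice in facial-expression research), the feature vector standing in for a deep network's face embedding, and the randomly initialized class weights.

```python
import math
import random

# Hypothetical label set; Eyeris has not disclosed Emovu's actual classes.
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

def softmax(logits):
    """Convert raw class scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_expression(features, weights):
    """Score a face-feature vector against each emotion class.

    `features` stands in for the embedding a deep network would extract
    from a camera frame; `weights` is a 7 x len(features) matrix of
    class weights (random here, learned in a real system).
    """
    logits = [sum(w * f for w, f in zip(row, features)) for row in weights]
    probs = softmax(logits)
    best = max(range(len(EMOTIONS)), key=lambda i: probs[i])
    return EMOTIONS[best], probs

# Illustrative use with random stand-in values.
random.seed(0)
features = [random.random() for _ in range(16)]
weights = [[random.uniform(-1, 1) for _ in range(16)] for _ in EMOTIONS]
label, probs = classify_expression(features, weights)
```

In a production system the hand-rolled linear scorer would be replaced by a trained convolutional network running on the car's GPU, but the output contract is the same: one probability per emotion, updated frame by frame.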
Artificial intelligence was on everyone's lips this week at CES, the annual technology extravaganza formerly known as the Consumer Electronics Show. From Samsung's Neon avatars and LG's smart washing machine to Intel's Tiger Lake processors and the gun-detecting PATSCAN, AI seemed to be everywhere. Samsung's research subsidiary, STAR Labs, unveiled its latest AI project, called Neon. Similar to a chatbot, Neon generates a photorealistic digital avatar that interacts with people in real time. The South Korean technology giant plans to weave the Neons into people's day-to-day lives, where the avatars will play the roles of doctors, personal trainers, and TV anchors delivering the evening news.
Cars are getting smarter, and while much of the focus is on seeing the road ahead, they are also set to begin analyzing drivers and passengers. This week at CES, the international consumer electronics show in Las Vegas, a host of startups are showing off inward-facing cameras that watch and analyze drivers, passengers, and objects in cars. Carmakers say the cameras will boost safety, but privacy campaigners warn they could be used to make money by analyzing every movement: tracking a passenger's gaze to see which ads they are looking at, or monitoring occupants' emotions through their facial expressions. In a demonstration during an interview in San Jose, California, on December 28, 2018, occupants inside a car were seen on a monitor using technology by Silicon Valley company Eyeris, which uses cameras and AI to track drivers and passengers for safety benefits. Carmakers could also gather anonymized data and sell it.
"Cooperation in the form of generosity has been observed to be contagious, with receipt of donations positively influencing recipients' subsequent generosity," the researchers wrote. It plays into our survival instincts: I'll scratch your back, you scratch mine. Constantly evaluating our own and other people's social standing is a hard workout for our brains (we have the largest cerebral cortex relative to body size of all mammals), as we have evolved to process social and communication norms. These findings could also be useful for artificial intelligence systems, such as self-driving cars or other machine-learning agents that need to interact with other bots. In these one-off interactions, they will "need to self-manage their behavior but at the same time cooperate with others in their environment."