Eyeris Brings Emotion Recognition to Automobiles

#artificialintelligence

At NVIDIA's recent GPU Technology Conference, we sat down with Modar (JR) Alaoui, CEO of Eyeris Technologies, to discuss giving automobiles the ability to interpret human emotions. Eyeris is bringing to market a feature that has not been present in the 130-year history of the automobile: the ability of the car to monitor and register the emotional state of the driver. The product will be able to record the emotions shown on the driver's face and perform age identification, gender identification, eye tracking, and gaze estimation for drivers and passengers. "Our goal is to put our software in the back of every camera in the world," said Alaoui.


Feeling sad, angry? Your future car will know - Roadshow

#artificialintelligence

Did you know that passengers invariably show a fear reaction when the brakes are applied in a car? That is just one of the things facial monitoring company Eyeris learned while developing its Emovu Driver Monitoring System (DMS). Using a combination of cameras, graphics processing and deep learning, Emovu analyzes the passengers in a car, determining from facial movement which of seven emotions those passengers are feeling. Modar JR Alaoui, CEO of Eyeris, demonstrated the company's in-car technology during Nvidia's GTC developer conference, putting forth a few ideas of how monitoring drivers' emotions can lead to safer driving. The company used deep learning to train its Emovu software to recognize facial expressions.
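
Eyeris has not published the Emovu architecture, so the following is only a minimal sketch of how a deep-learning model can map a cropped face image to one of seven emotion classes. The label set, input size, and layer sizes are assumptions for illustration, not Eyeris's design.

```python
# Minimal sketch of a 7-class facial-expression classifier (illustrative only;
# Eyeris has not published its Emovu architecture, so every detail here is assumed).
import torch
import torch.nn as nn

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]  # assumed label set

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        # Small convolutional feature extractor for 48x48 grayscale face crops.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 12 -> 6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN()
    face_crop = torch.randn(1, 1, 48, 48)           # a single detected face, already cropped and resized
    probs = torch.softmax(model(face_crop), dim=1)  # per-emotion probabilities
    print(EMOTIONS[int(probs.argmax())])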


Move aside, backseat driver! New tech at CES monitors...

Daily Mail

Cars are getting smarter - and while many focus on seeing the road ahead, they are also set to begin analyzing drivers and passengers. This week at CES, the international consumer electronics show in Las Vegas, a host of startup companies are showing off inward-facing cameras that watch and analyze drivers, passengers and objects in cars. Carmakers say they will boost safety - but privacy campaigners warn they could be used to make money by analyzing every movement, even tracking a passenger's gaze to see which ads they are looking at and monitoring people's emotions through their facial expressions. In a demonstration by Silicon Valley company Eyeris in San Jose, California, in December 2018, occupants inside a car were shown on a monitor as cameras and AI tracked drivers and passengers for safety purposes. Carmakers could gather anonymized data and sell it.


Artificial Intelligence in Autonomous Driving

#artificialintelligence

The development of the most advanced driver assistance systems (ADAS) in the industry should be based on integrated and open platforms. A complete solution is required for development, simulation, prototyping, and implementation to enable smarter, more sophisticated ADAS and to pave the way for the autonomous car. This article summarizes the current status of deep neural network (DNN) architectures running on a "supercomputer on wheels", integrated into platforms that will drive the future of autonomous vehicles. Deep learning is the most popular approach to developing AI: it is a way to enable machines to recognize and understand the world in which they are intended to operate.
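
The article does not name a specific network, but the sketch below illustrates the kind of DNN perception workload such an in-vehicle GPU platform runs: a pretrained object detector applied to a camera frame. The choice of a torchvision Faster R-CNN and the confidence threshold are assumptions for illustration only.

```python
# Illustrative only: one kind of DNN workload an ADAS compute platform might run on GPU.
# The specific model (a torchvision Faster R-CNN) and the 0.5 threshold are assumptions,
# not the architecture described in the article.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).to(device).eval()

# A stand-in for one camera frame (3 x H x W, values in [0, 1]).
frame = torch.rand(3, 720, 1280, device=device)

with torch.no_grad():
    detections = model([frame])[0]

# Keep only confident detections, e.g. vehicles and pedestrians in the scene.
for label, score, box in zip(detections["labels"], detections["scores"], detections["boxes"]):
    if score > 0.5:
        print(weights.meta["categories"][int(label)], f"{score:.2f}", box.tolist())
```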


Identifying Traffic Signs with Deep Learning

@machinelearnbot

Successful detection and classification of traffic signs is one of the important problems to be solved if we want self-driving cars. The idea is to make automobiles smart enough to require the least possible human interaction for successful automation. The swift rise of deep learning to dominance over classical machine learning methods, complemented by advances in GPUs (graphics processing units), has been astonishing in fields such as image recognition, NLP, and self-driving cars.
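
As a rough illustration of the approach the post describes, here is a minimal PyTorch sketch of a convolutional traffic-sign classifier. The 43-class setup mirrors the common GTSRB benchmark and the layer sizes are assumptions, not details taken from the post.

```python
# Minimal sketch of a traffic-sign classifier in PyTorch (illustrative; the 43-class
# setup mirrors the common GTSRB benchmark and is an assumption, not from the post).
import torch
import torch.nn as nn

NUM_SIGN_CLASSES = 43  # e.g. GTSRB defines 43 sign categories

class TrafficSignNet(nn.Module):
    def __init__(self, num_classes: int = NUM_SIGN_CLASSES):
        super().__init__()
        # Convolutional layers learn edge/colour/shape features from 32x32 RGB sign crops.
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    # Training on a GPU is what makes this practical at dataset scale.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model = TrafficSignNet().to(device)
    batch = torch.randn(8, 3, 32, 32, device=device)  # a batch of cropped sign images
    logits = model(batch)
    print(logits.shape)  # torch.Size([8, 43])
```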