SEMAINE-3.1 – semaine

#artificialintelligence

Sensitive Artificial Listeners (SAL) are virtual dialogue partners who, despite their very limited verbal understanding, aim to engage the user in conversation by paying attention to the user's emotions and non-verbal expressions. Each SAL character has its own emotionally defined personality and attempts to draw the user towards its dominant emotion through a combination of verbal and non-verbal expression. The SEMAINE project has created an open-source implementation of fully autonomous SAL characters, combining state-of-the-art emotion-oriented technology into a real-time interactive system. The SAL characters capture the user's voice from a microphone and analyse it with a combination of speech recognition and vocal emotion recognition.
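The interaction loop described above can be sketched roughly as follows. This is an illustrative assumption of how such a system might be structured, not the actual SEMAINE codebase; all class and function names here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SALCharacter:
    """Hypothetical SAL-style character with a fixed emotional personality."""
    name: str
    dominant_emotion: str

    def respond(self, transcript: str, user_emotion: str) -> str:
        # Verbal understanding is limited: the reply is driven mainly by the
        # detected emotion, steering the user toward the character's own
        # dominant emotion.
        if user_emotion == self.dominant_emotion:
            return f"{self.name}: I knew you'd feel {user_emotion} too!"
        return (f"{self.name}: You sound {user_emotion}, "
                f"but let's be {self.dominant_emotion}.")

def analyze_audio(audio_chunk: bytes) -> tuple[str, str]:
    # Placeholder for the combined speech recognition and vocal emotion
    # recognition stage; a real system would run ASR and an acoustic
    # emotion classifier on the microphone input here.
    return "hello there", "neutral"

poppy = SALCharacter("Poppy", dominant_emotion="happy")
transcript, emotion = analyze_audio(b"\x00" * 320)
print(poppy.respond(transcript, emotion))
```

The stubs stand in for the recognition components; swapping them for real recognizers would leave the character logic unchanged.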


What Makes Automatic Emotion Detection So Powerful?

#artificialintelligence

The one and only reason why businesses are turning to automatic emotion detection is you! Emotion-sensing technologies are expanding rapidly: market researchers estimate that the Emotion Detection & Recognition (EDR) business will grow at a compound annual growth rate (CAGR) of 27.20–39.9%. One of the most common ways to automatically recognize emotions is via facial detection in photos and videos, and the list of software packages and APIs that let you do this keeps getting longer.
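A face-based recognition pipeline of the kind these APIs offer typically has two stages: locate faces, then score each face against a set of emotion labels. The sketch below is a minimal illustration with stub functions standing in for a real face detector and a trained emotion classifier; the names and the label set are assumptions for demonstration only.

```python
# Minimal sketch of a face-based emotion recognition pipeline.
# detect_faces and classify_emotion are stubs standing in for a real
# detector and a trained classifier.

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

def detect_faces(image) -> list[tuple[int, int, int, int]]:
    # Stub: a real detector returns one (x, y, w, h) bounding box per face.
    return [(40, 40, 128, 128)]

def classify_emotion(face_crop) -> dict[str, float]:
    # Stub: a real classifier returns a probability per emotion label.
    return {e: (0.7 if e == "happiness" else 0.05) for e in EMOTIONS}

def recognize_emotions(image) -> list[str]:
    """Return the top-scoring emotion label for each detected face."""
    results = []
    for (x, y, w, h) in detect_faces(image):
        scores = classify_emotion(image)  # a real pipeline classifies the crop
        results.append(max(scores, key=scores.get))
    return results

print(recognize_emotions(object()))  # → ['happiness'] with these stubs
```

Most commercial emotion APIs follow this same shape, returning per-face bounding boxes alongside a score for each emotion label.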


AI Series - Part Two - Programming Emotions

#artificialintelligence

Well done kickstarting the Azure Cognitive Services Emotion API. Remember that the Emotion API (Project Oxford) is still in the preview stage, so not all of your images will be processed successfully (of roughly ten happiness images tested, only one was processed).
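The Emotion API was consumed as a simple REST endpoint. The sketch below only constructs such a request without sending it; the endpoint URL and subscription key are placeholders, and since the preview API has since been retired, treat this purely as an illustration of the request shape.

```python
import json
import urllib.request

# Placeholder endpoint and key; the Project Oxford preview endpoint has
# since been retired, so this URL is illustrative and will not resolve.
ENDPOINT = "https://api.example.com/emotion/v1.0/recognize"
API_KEY = "YOUR-SUBSCRIPTION-KEY"

# The preview API accepted a JSON body containing the image URL.
body = json.dumps({"url": "https://example.com/photo.jpg"}).encode("utf-8")

req = urllib.request.Request(
    ENDPOINT,
    data=body,
    headers={
        "Content-Type": "application/json",
        # Header scheme used by Azure Cognitive Services for API keys.
        "Ocp-Apim-Subscription-Key": API_KEY,
    },
    method="POST",
)

# Only constructed here, not sent; a real call would be
# urllib.request.urlopen(req) and would return per-face emotion scores.
print(req.get_method(), req.full_url)
```

The response for each detected face was a set of scores across emotions (happiness, sadness, anger, and so on), which explains why some images simply came back with no usable result during the preview.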


Russia claims to be building A.I. that can feel "true" emotion - Automation Watch

#artificialintelligence

For developers of artificial intelligence, few goals are more grandiose than creating a machine capable of feeling "real emotion". And according to Moscow-based Professor Alexei Samsonovich, Russia is on the verge of doing just that. Samsonovich recently announced that he expects a breakthrough in the next several years that will see the rise of "free-thinking" machines capable of understanding human emotions as well as feeling their own. The human brain is staggeringly complex, and machines are currently not capable of expressing what we would consider "human emotion". The announcement hints at robots ultimately being able to understand narratives of thinking, as well as being developed enough to foster narratives of their own.