Machine learning is a hot topic these days, and getting started can be fast and easy. In this video post, I walk through the steps to build a simple Universal Windows Platform (UWP) application that connects to Microsoft Cognitive Services and the Emotion API. Microsoft Cognitive Services are a set of APIs that let your apps leverage powerful machine-learning algorithms with just a few lines of code. They work across many devices and platforms, including iOS, Android, and Windows, are easy to set up, and keep improving.
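To give a feel for how little code is involved, here is a minimal Python sketch of building a request to the Emotion API's `recognize` endpoint. The subscription key and image URL are placeholders, and the region in the endpoint URL is an assumption; the walkthrough itself uses a UWP app rather than Python.

```python
import json

# Assumed regional endpoint for the Emotion API (v1.0); yours may differ.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

def build_emotion_request(subscription_key, image_url):
    """Return the (headers, body) pair for a POST to the Emotion API.

    Cognitive Services authenticates via the Ocp-Apim-Subscription-Key
    header; the body is a JSON object pointing at a publicly reachable
    image URL.
    """
    headers = {
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    }
    body = json.dumps({"url": image_url})
    return headers, body

# Placeholder values for illustration only.
headers, body = build_emotion_request("YOUR_KEY", "https://example.com/face.jpg")
```

Sending the actual POST (for example with `requests.post(ENDPOINT, headers=headers, data=body)`) returns a JSON array of detected faces, each with per-emotion confidence scores.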
AI bias is in the news, and it's a hard problem to solve. When AI engages with humans, how does it know what they really mean? In other words, why is it hard for AI to detect human bias? One reason is that humans often do not say what they really mean, due to factors such as cognitive dissonance. Cognitive dissonance refers to a situation involving conflicting attitudes, beliefs, or behaviours. It produces a feeling of mental discomfort, which leads a person to alter one of those attitudes, beliefs, or behaviours to reduce the discomfort and restore balance.
The past 15 years have witnessed rapid growth in computational models of emotion and affective architectures. Researchers in cognitive science, AI, HCI, robotics, and gaming are developing 'models of emotion' both for theoretical research into the nature of emotion and for a range of applied purposes: to create more believable and effective synthetic characters and robots, and to enhance human-computer interaction. Yet in spite of the many stand-alone emotion models, and the numerous affective agent and robot architectures developed to date, there is a lack of consistency, and a lack of clarity, regarding what exactly it means to 'model emotions'. 'Emotion modeling' can mean the dynamic generation of emotion via black-box models that map specific stimuli onto associated emotions. It can also mean generating facial expressions, gestures, or movements that depict specific emotions in synthetic agents or robots.
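The first sense of 'emotion modeling', a black-box mapping from stimuli to emotions, can be illustrated with a toy sketch. The event names, emotion labels, and intensities below are invented for illustration and are not drawn from any published architecture:

```python
# A toy "black-box" emotion generator: a direct lookup from stimulus
# events to (emotion, intensity) pairs, with no internal appraisal
# process explaining *why* a stimulus yields that emotion.
STIMULUS_TO_EMOTION = {
    "goal_achieved":   ("joy", 0.9),
    "goal_blocked":    ("anger", 0.7),
    "threat_detected": ("fear", 0.8),
    "loss_of_object":  ("sadness", 0.6),
}

def generate_emotion(stimulus):
    """Map a stimulus event to an (emotion, intensity) pair.

    Unknown stimuli fall back to a neutral state: a pure lookup table
    has no mechanism for reasoning about novel events.
    """
    return STIMULUS_TO_EMOTION.get(stimulus, ("neutral", 0.0))

print(generate_emotion("goal_blocked"))   # ('anger', 0.7)
print(generate_emotion("sudden_noise"))   # ('neutral', 0.0)
```

The contrast with process models is visible even at this scale: the table encodes only input-output pairs, whereas an appraisal-based model would derive the emotion from variables such as goal relevance and coping potential.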