If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Researchers have combined speech and facial recognition data to improve the emotion detection abilities of AIs. The ability to recognise emotions is a longstanding goal of AI researchers. Accurate recognition would enable things such as detecting tiredness at the wheel, anger that could lead to a crime being committed, or perhaps even signs of sadness or depression at suicide hotspots. The nuances in how people speak and move their facial muscles to express moods have presented a challenge. In a paper (PDF) posted on arXiv, researchers at the University of Science and Technology of China in Hefei report some progress.
Spy movies, with their paraphernalia of cool gadgets and technologies, have always enticed audiences. In these movies, we have seen the use of a polygraph to detect whether somebody is being truthful or not. Needless to say, the polygraph is a multi-billion-dollar industry and plays a crucial role in crime adjudication. Polygraphs do not have any "intelligence" built into them. They are simple machines that do what they were designed to do: measure vital signs like blood pressure and pulse, from which a conclusion is drawn.
Systems that can classify a person's emotion from their voice and facial tics alone are a longstanding goal of some AI researchers. Firms like Affectiva, which recently launched a product that scans drivers' faces and voices to monitor their mood, are moving the needle in the right direction. But considerable challenges remain, owing to nuances in speech and muscle movements. Researchers at the University of Science and Technology of China in Hefei claim to have made progress, though. In a paper published on the preprint server arXiv.org this week ("Deep Fusion: An Attention Guided Factorized Bilinear Pooling for Audio-video Emotion Recognition"), they describe an AI system that can recognize a person's emotional state with state-of-the-art accuracy on a popular benchmark.
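The factorized bilinear pooling named in that title is, at its core, a compact way of modelling interactions between an audio feature vector and a video feature vector before classifying the fused result. Below is a minimal NumPy sketch of generic multimodal factorized bilinear fusion; the dimensions and random weights are placeholders, and the attention mechanism that guides the pooling in the paper is omitted, so this is an illustration of the general idea rather than the authors' exact architecture.

```python
import numpy as np

def factorized_bilinear_pooling(audio_feat, video_feat, U, V, k):
    """Generic multimodal factorized bilinear (MFB) fusion.

    audio_feat: (m,) audio embedding, e.g. from a speech encoder
    video_feat: (n,) video embedding, e.g. from a facial-expression CNN
    U:          (m, o * k) projection matrix for the audio branch
    V:          (n, o * k) projection matrix for the video branch
    k:          factor size; o is the dimension of the fused output
    """
    joint = (audio_feat @ U) * (video_feat @ V)      # element-wise interaction, shape (o * k,)
    joint = joint.reshape(-1, k).sum(axis=1)         # sum-pool over the k factors -> shape (o,)
    joint = np.sign(joint) * np.sqrt(np.abs(joint))  # power normalization
    return joint / (np.linalg.norm(joint) + 1e-12)   # L2 normalization

# Toy example with random features and weights (illustrative dimensions only).
rng = np.random.default_rng(0)
m, n, o, k = 128, 256, 64, 5
audio = rng.standard_normal(m)
video = rng.standard_normal(n)
fused = factorized_bilinear_pooling(audio, video,
                                    rng.standard_normal((m, o * k)),
                                    rng.standard_normal((n, o * k)), k)
print(fused.shape)  # (64,) -- in a real system this would feed an emotion classifier
```

In a full system the projections would be learned end to end rather than sampled at random, and, as the paper's title suggests, an attention module would weight the audio and video features before they are fused.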
"Every aspect of our lives will be transformed [by AI]", potentially "the biggest event in the history of our civilization" -Stephen Hawking We are already seeing the tremendous inroads that Artificial Intelligence (AI) has made in virtually every industry. Despite AI's rapid expansion, the Artificial Intelligence technology itself is still evolving. AI points towards a future where machines not only do physical work, as they have done since the industrial revolution, but also the "thinking" work – planning, strategizing, prioritizing and making decisions. In fact, the definition of what is considered Artificial Intelligence keeps shifting. What used to be called AI even several years ago is now just widely used and familiar technology, and no longer resides under the AI umbrella.
It was with a strangely deflated feeling in his gut that Harvard biologist Mohammed AlQuraishi made his way to Cancun for a scientific conference in December. Strange because a major advance had just been made in his field, something that might normally make him happy. Deflated because the advance hadn't been made by him or by any of his fellow academic researchers. It had been made by a machine. DeepMind, an AI company that Google bought in 2014, had outperformed all the researchers who'd submitted entries to the Critical Assessment of Structure Prediction (CASP) conference, which is basically a fancy science contest for grown-ups.
Sentiment Analysis is already widely used by different companies to gauge consumer mood towards their product or brand in the digital world. In the offline world, however, users also interact with brands and products in retail stores, showrooms, etc., and measuring users' reactions automatically in such settings has remained a challenging task. Emotion detection from facial expressions using AI can be a viable alternative for automatically measuring consumers' engagement with their content and brands. In this post, we will discuss how such a technology can be used to solve a variety of real-world use-cases effectively. Car manufacturers around the world are increasingly focusing on making cars more personal and safe for us to drive.
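As a rough illustration of the engagement-measurement idea above, the sketch below aggregates per-frame emotion scores into a coarse summary. It assumes some upstream facial-expression classifier has already produced probabilities for each captured frame; the emotion labels, the positive/negative grouping, and the scoring rule are all illustrative choices, not any particular vendor's method.

```python
from collections import Counter

# Hypothetical per-frame output of a facial-expression classifier
# (probabilities over a small set of emotion labels).
frames = [
    {"happy": 0.70, "neutral": 0.20, "surprised": 0.05, "sad": 0.03, "angry": 0.02},
    {"happy": 0.10, "neutral": 0.75, "surprised": 0.05, "sad": 0.05, "angry": 0.05},
    {"happy": 0.05, "neutral": 0.15, "surprised": 0.60, "sad": 0.10, "angry": 0.10},
]

POSITIVE = {"happy", "surprised"}   # illustrative grouping
NEGATIVE = {"sad", "angry"}

def summarize_engagement(frames):
    """Reduce per-frame emotion probabilities to a coarse engagement summary."""
    dominant = [max(f, key=f.get) for f in frames]  # top emotion in each frame
    positive = sum(f[e] for f in frames for e in POSITIVE) / len(frames)
    negative = sum(f[e] for f in frames for e in NEGATIVE) / len(frames)
    return {
        "dominant_counts": Counter(dominant),
        "avg_positive": round(positive, 3),
        "avg_negative": round(negative, 3),
    }

print(summarize_engagement(frames))
```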
As humans, you and I can look at these two chats and determine that in the first the person appears to be sincere, while in the other the person comes off as sarcastic and cold, simply due to the way the message was punctuated. This may seem fairly obvious. Yet to chatbots and natural language processing algorithms, these two responses tend to appear identical. Taken in the literal sense, there is no reason to assume that the two people reacted any differently. Strip away that understanding of social cues, and the true intention of the message becomes much harder to discern.
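A toy example makes the point concrete: many text-processing pipelines lowercase input and strip punctuation before doing anything else, so two replies that read very differently to a human collapse into identical token sequences. The replies below are invented for illustration; the takeaway is only that the normalization step discards the cue.

```python
import re

def normalize(text):
    """Typical first steps of a simple NLP pipeline: lowercase, drop punctuation, split."""
    return re.sub(r"[^\w\s]", "", text.lower()).split()

sincere   = "Sounds great, thank you!!"
sarcastic = "sounds great. thank you."

print(normalize(sincere))                           # ['sounds', 'great', 'thank', 'you']
print(normalize(sarcastic))                         # ['sounds', 'great', 'thank', 'you']
print(normalize(sincere) == normalize(sarcastic))   # True -- the punctuation cue is gone
```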
In Katherine Cross' short story "Machine of Loving Grace" -- the final installment in our Better Worlds anthology -- Alexandra and Phoebe must deal with their creation Ami, an artificial intelligence that was designed to moderate online communities, as it fights fire with fire. Cross is a sociologist and a gaming and social critic who is working on her PhD at the City University of New York Graduate Center, specializing in the study of gender and online harassment. Her work has appeared in The Establishment, The Guardian, Gamasutra, Time magazine, and The Verge. The Verge spoke with Cross about how artificial intelligence requires empathy and the importance of moderating online spaces. This interview has been lightly edited for clarity.
IBM's most advanced AI machine challenges a record-setting human debate champion. Aristotle broke ground by claiming that the art of persuasion (rhetoric) can be learned. Although Aristotle was right, he may never have envisioned a day when a machine could also be taught to argue. It took 2,300 years for it to happen, but it happened. On Monday, February 11, I had a front-row seat for the first live debate between human and machine. For six years, scientists at IBM Research have been working on the next big milestone for artificial intelligence (AI).
You've probably heard that computers don't understand human emotions well, and that people should therefore focus less on basic skills and more on social and emotional learning. Artificial intelligence (AI) is actually already brilliant at understanding and engaging (even manipulating) people's emotions and social interactions in powerful ways. Facebook is a giant social and emotional learning engine. Facebook has many of your emotional memories (your photos and videos), it knows who you care about socially (the friends you interact with), and it knows what you prefer (by what you "like"). It brings these three things together at an incredible scale to decide what goes into your feeds to engage you both socially and emotionally.