If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Aptiv has signed a commercial partnership agreement with Affectiva to deliver innovative, scalable software that enhances perception capabilities in advanced safety solutions and reimagines the future of the in-cabin experience. Affectiva is a Boston-based MIT Media Lab spin-off and a leader in Human Perception artificial intelligence (AI). The new software, which aims to enhance the in-vehicle experience, will be built on deep learning architectures, the company noted. Aptiv and Affectiva will work closely to commercialise advanced sensing solutions for OEM and fleet customers, and to further support the partnership, Aptiv has made a minority investment in Affectiva. Affectiva's patented software is the first multi-modal interior sensing solution to unobtrusively identify complex cognitive states of vehicle occupants in real time, Aptiv said.
Getting an autonomous vehicle to drive safely under idealized road conditions has technically been possible for a while now, but for the real world, the cars are going to have to learn to drive a little bit more like us. That's where Comma.ai, a startup founded by notorious iPhone hacker George Hotz, comes in. Rather than teaching its computer systems what a tree or a stop sign looks like, Comma.ai's Openpilot technology analyzes the patterns of everyday drivers to train its self-driving models. The company is pulling in millions of miles of driving data from a dashcam app called Chffr and a plug-in module called Panda, then aggregating that data to create an autonomous system that mimics human drivers.
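The approach described above, training a driving policy to imitate logged human behavior rather than hand-coding rules, is commonly called behavioral cloning. Below is a minimal, hypothetical sketch of that idea (not Comma.ai's actual Openpilot code): a linear steering policy is fit by gradient descent to pairs of logged observations and human steering corrections.

```python
# Hypothetical behavioral-cloning sketch: learn to steer like logged
# human drivers. Observations and actions here are invented toy data,
# not real Chffr/Panda telemetry.

def fit_linear_policy(observations, actions, lr=0.1, epochs=500):
    """Fit steer = w * obs + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(observations)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(observations, actions):
            err = (w * x + b) - y          # policy output vs. human action
            grad_w += 2 * err * x / n
            grad_b += 2 * err / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy log: lane offset in meters -> human steering correction in degrees.
obs = [-1.0, -0.5, 0.0, 0.5, 1.0]
act = [2.0, 1.0, 0.0, -1.0, -2.0]   # drivers steer back toward the center

w, b = fit_linear_policy(obs, act)
steer = w * 0.25 + b                 # policy's correction for a 0.25 m drift
```

In practice the model is a deep network over camera frames rather than a one-parameter line, but the training signal is the same: minimize the gap between the policy's output and what the human actually did.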
SoftBank Robotics today announced that its robot Pepper will now use emotion recognition AI from Affectiva to interpret and respond to human activity. Pepper is about four feet tall, gets around on wheels, and has a tablet in the center of its chest. The humanoid robot made its debut in 2015 and was designed to interact with people. Cameras and microphones help Pepper recognize human emotions, like hostility or joy, and respond appropriately with a smile or indications of sadness. This type of intelligence likely comes in handy in the environments where Pepper operates, like banks, hotels, and Pizza Huts in some parts of Asia.
Our planet is an amazing place, full of life that defies expectations at every turn. There are other animals on Earth aside from humans that exhibit BOTH intelligence and sentience, in every way you might choose to interpret those definitions. Is intelligence unique to Earth? We may never know for sure, but science so far has shown us that it is not unique to humanity. Consider the bottlenose dolphin, a creature that shares a similarly large and complex brain with humans, and that is capable of understanding numerical continuity and perhaps even of discriminating between numbers.
Nuance Communications is already well known for its tech industry innovations. It's been at the forefront of speech recognition software and has also made substantial inroads into the automotive industry. In fact, you'll find its Dragon Drive software in more than a few cars out there on the roads. But the company also works in a stack of other business sectors including healthcare, telecommunications, financial services and even retail. Now, though, the company is working with Affectiva, an MIT Media Lab spin-off and a leading provider of AI software that detects complex and nuanced human emotions and cognitive states from face and voice.
In January of 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed: "By 2022, your personal device will know more about your emotional state than your own family." Just two months later, a landmark study from Ohio State University claimed that its algorithm was now better at detecting emotions than people are. AI systems and devices will soon recognize, interpret, process, and simulate human emotions. With companies like Affectiva, BeyondVerbal and Sensay providing plug-and-play sentiment analysis software, the affective computing market is estimated to grow to $41 billion by 2022, as firms like Amazon, Google, Facebook, and Apple race to decode their users' emotions. Emotional inputs will create a shift from data-driven, IQ-heavy interactions to deep, EQ-guided experiences, giving brands the opportunity to connect to customers on a much deeper, more personal level.
As I watched from my seat, I could hear the audience gasp at times, and I couldn't help but notice a couple of things: for one, there was this foregone assumption that AI is out to get us; and two, this field is still so incredibly dominated by men, white men specifically. Other than myself, there were two other women featured, compared to about a dozen men. But it wasn't just the numbers; it was the total air time. The majority of the time, the voice on screen was male. I vowed that on stage that night, I would make my voice heard.
When a CIA-backed venture capital fund took an interest in Rana el Kaliouby's face-scanning technology for detecting emotions, the computer scientist and her colleagues did some soul-searching, and then turned down the money. "We're not interested in applications where you're spying on people," said el Kaliouby, the CEO and co-founder of the Boston startup Affectiva. The company has trained its artificial intelligence systems to recognize whether individuals are happy or sad, tired or angry, using a photographic repository of more than 6 million faces. Recent advances in AI-powered computer vision have accelerated the race for self-driving cars and powered the increasingly sophisticated photo-tagging features found on Facebook and Google.
Imagine if your car could pull itself over when you're drowsy or nauseous, or adjust the temperature and music when gridlock is stressing you out. Maybe it could even refuse to start if it knows you're intoxicated. With ADAS already in place and autonomous vehicles on the horizon, a lot of work is being done around sensing and machine learning to help vehicles better understand the roads and the world around them. But Boston-based startup Affectiva thinks more needs to be done around the internal world of the car, specifically the emotional state of the driver. Affectiva has built its business model around creating "emotional AI": algorithms capable of recognizing human emotional states.
Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions. Well, artificial emotional intelligence, which is also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an "a," not an "e"). In psychology, an "affect" is a term used to describe the experience of feeling or emotion. If you've seen "Solo: A Star Wars Story", then you've seen the poster child for artificial emotional intelligence: L3-37. Lando Calrissian's droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion.