Artificial (Emotional) Intelligence

Communications of the ACM

Anyone who has been frustrated asking questions of Siri or Alexa--and then annoyed at the digital assistant's tone-deaf responses--knows how dumb these supposedly intelligent assistants are, at least when it comes to emotional intelligence. "Even your dog knows when you're getting frustrated with it," says Rosalind Picard, director of Affective Computing Research at the Massachusetts Institute of Technology (MIT) Media Lab. "Siri doesn't yet have the intelligence of a dog," she says. Yet developing that kind of intelligence--in particular, the ability to recognize human emotions and then respond appropriately--is essential to the true success of digital assistants and the many other artificial intelligences (AIs) we interact with every day. Whether we're giving voice commands to a GPS navigator, trying to get help from an automated phone support line, or working with a robot or chatbot, we need them to really understand us if we're to take these AIs seriously.


Study will ask 10,000 New Yorkers to share life's data

Daily Mail - Science & tech

Wanted: 10,000 New Yorkers interested in advancing science by sharing a trove of personal information, from cellphone locations and credit-card swipes to blood samples and life-changing events. Researchers are gearing up to start recruiting participants from across the city early next year for a study so sweeping it's called 'The Human Project.' It aims to channel different data streams into a river of insight on health, aging, education and many other aspects of human life. 'That's what we're all about: putting the holistic picture together,' says project director Dr Paul Glimcher, a New York University neural science, economics and psychology professor.


Handling Representation Changes by Autistic Reasoning

AAAI Conferences

We investigate the patterns of autistic reasoning under conditions that require a change in the representation of domain knowledge. The formalism of nonmonotonic default logic is used to simulate autistic decision-making while learning how to adjust an action to an environment that forces a new representation structure. Our main finding is that while individuals with autism may be able to process single default rules, they have a characteristic difficulty in cases with nontrivial representation changes, where multiple default rules conflict. We evaluate our hypothesis that the skill of representation adjustment can be advanced by learning default reasoning patterns via a set of exercises.
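
To make the single-rule versus conflicting-rules distinction concrete, here is a minimal sketch of Reiter-style default rules: one applicable default yields a single unambiguous conclusion, while two conflicting defaults make the outcome depend on which rule is tried first. The rule names and domain literals are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of Reiter-style default rules (illustrative only; the
# rules and domain literals are hypothetical, not from the paper).

from dataclasses import dataclass

@dataclass(frozen=True)
class Default:
    prerequisite: str   # must already be believed
    justification: str  # must be consistent with current beliefs
    conclusion: str     # added if the rule fires

def neg(lit: str) -> str:
    return lit[1:] if lit.startswith("~") else "~" + lit

def apply_defaults(facts: set[str], defaults: list[Default]) -> set[str]:
    """Naively apply defaults in order until none fires.

    With a single applicable default this yields the unique extension;
    with conflicting defaults the result depends on rule order, which is
    one way to see why such cases demand a representation change.
    """
    beliefs = set(facts)
    changed = True
    while changed:
        changed = False
        for d in defaults:
            if (d.prerequisite in beliefs
                    and neg(d.justification) not in beliefs
                    and d.conclusion not in beliefs
                    and neg(d.conclusion) not in beliefs):
                beliefs.add(d.conclusion)
                changed = True
    return beliefs

# Single default: "things on the table are normally reachable".
single = [Default("on_table", "reachable", "reachable")]
print(apply_defaults({"on_table"}, single))
# -> {'on_table', 'reachable'}

# Conflicting defaults after the environment changed: "toys should
# normally be grabbed" now clashes with "fragile things should
# normally not be grabbed".
conflicting = [
    Default("toy", "grab", "grab"),
    Default("fragile", "~grab", "~grab"),
]
print(apply_defaults({"toy", "fragile"}, conflicting))
# Whichever rule fires first blocks the other, so the outcome is
# order-dependent: the two rules generate two different extensions.
```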


Companion-Based Ambient Robust Intelligence (CARING)

AAAI Conferences

We present a Companion-based Ambient Robust INtelliGence (CARING) system, for communication with, and support of, clients with traumatic brain injury (TBI) or amyotrophic lateral sclerosis (ALS). A central component of this system is an artificial companion, combined with a range of elements for ambient intelligence. The companion acts as a personalized intermediary for multi-party communication between the client, the environment (e.g., a Smart Home), caregivers and health professionals. CARING is based on tightly coupled systems drawing from natural language processing, speech recognition and adaptation, deep language understanding and constraint-based knowledge representation and reasoning. A major innovation of the system is its ability to adapt and accommodate different interfaces associated with different client capabilities and needs. The system will use, as a proxy, different interaction requirements of clients (e.g., Brain-Computer Interfaces) at different stages of ALS progression and with different types of TBI impairments. Ultimately, this technology is expected to improve the quality of life for clients through conversation with a computer.
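
As a rough illustration of that interface-adaptation idea, the sketch below routes a client's input through whichever channel matches their current capabilities (speech, keyboard, or a Brain-Computer Interface). All class and profile names here are hypothetical assumptions for illustration and do not describe CARING's actual implementation.

```python
# Illustrative sketch only: a companion-style intermediary that selects an
# input interface from a client capability profile. Names are hypothetical.

from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class ClientProfile:
    can_speak: bool
    can_type: bool
    has_bci: bool          # Brain-Computer Interface available

class InputInterface(ABC):
    @abstractmethod
    def read_utterance(self) -> str: ...

class SpeechInterface(InputInterface):
    def read_utterance(self) -> str:
        # Placeholder for speech recognition with speaker adaptation.
        return input("[speech] ")

class TextInterface(InputInterface):
    def read_utterance(self) -> str:
        return input("[keyboard] ")

class BCIInterface(InputInterface):
    def read_utterance(self) -> str:
        # Placeholder: a real BCI would decode a selection from neural signals.
        return input("[BCI selection] ")

def select_interface(profile: ClientProfile) -> InputInterface:
    """Pick the least restrictive interface the client can still use."""
    if profile.can_speak:
        return SpeechInterface()
    if profile.can_type:
        return TextInterface()
    if profile.has_bci:
        return BCIInterface()
    raise ValueError("No usable interface for this client profile")

class Companion:
    """Intermediary between the client and caregivers / smart-home devices."""
    def __init__(self, profile: ClientProfile):
        self.interface = select_interface(profile)

    def relay(self) -> str:
        utterance = self.interface.read_utterance()
        # A real system would apply language understanding here and
        # dispatch to the smart home, caregivers, or health professionals.
        return f"Relaying to care team: {utterance!r}"

# Example: a client at a later ALS stage who can no longer speak or type.
companion = Companion(ClientProfile(can_speak=False, can_type=False, has_bci=True))
print(type(companion.interface).__name__)  # -> BCIInterface
```

The point of the sketch is only the selection step: as capabilities change (e.g., speech is lost as ALS progresses), the companion swaps the interface while the rest of the communication pipeline stays the same.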