Artificial intelligence can influence your travel plans, anticipate your online orders and improve efficiency around the house. In the entertainment sphere, there's now an AI technology designed to heighten what you feel while watching a movie. Researchers at the MIT Media Lab trained a machine to manipulate the emotional responses of video viewers. Their findings were published in collaboration with strategic consulting firm McKinsey & Company. The MIT team examined the neural responses of thousands of people as they watched movies, videos, online features and television programs in sections.
Researchers have found that dogs tend to match their emotional state to the emotional sounds they hear from humans and other dogs. Dogs responded similarly to emotional sounds from both species, and expressed negative emotional states when they heard negative emotional sounds. The research indicates that dogs can distinguish between positive and negative emotional sounds and are sensitive to human emotions. Emotional contagion, defined as an automatic and unconscious matching of emotional states between two individuals, is a basic component of empathy. It has been demonstrated in animal species ranging from primates to rodents.
Jeffrey was the CEO of a hedge fund, and he was upset about some poor trades that Tom, one of his portfolio managers, had made. He called Tom into his office. "Those trades were a terrible idea! What were you thinking?" The conversation quickly went downhill. With that first question, it would have been hard for it to go any other way. Why was it a bad way to start? "What were you thinking?" is a past-focused question. When Tom explains his thinking to Jeffrey, he'll reinforce his mistake and sound defensive, because his thinking was problematic and led to poor results. He doesn't necessarily think the same way now, of course. Tom will explain why he made that trade, Jeffrey will get angry at his poor judgment, and they'll both leave the conversation frustrated and disheartened (which is, predictably, exactly what happened). What could Jeffrey have done differently? A better choice would have been to avoid talking about the past and, instead, ask Tom about the future: "How will you do it differently next ...
AI assistants may be called "personal," but they definitely aren't personable. Never mind their obviously fake personalities; these intelligent chatbots are intelligent in only the factual sense. Huawei, however, wants AI assistants to grow beyond that and become something more relatable, more approachable, more human-like. In other words, it wants its AI to have some EI, emotional intelligence, as well, to help identify human emotions and, if needed, console its users. Considering what Huawei is going through, it might be in need of some of that emotional support itself.