In January 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed: "By 2022, your personal device will know more about your emotional state than your own family." Just two months later, a landmark study from Ohio State University claimed that its algorithm was now better at detecting emotions than people are. AI systems and devices will soon recognize, interpret, process, and simulate human emotions. With companies like Affectiva, BeyondVerbal and Sensay providing plug-and-play sentiment analysis software, the affective computing market is estimated to grow to $41 billion by 2022, as firms like Amazon, Google, Facebook, and Apple race to decode their users' emotions. Emotional inputs will create a shift from data-driven, IQ-heavy interactions to deep, EQ-guided experiences, giving brands the opportunity to connect with customers on a much deeper, more personal level.
Implanting a microchip into your brain to unlock its full potential may sound like the plot of the latest science fiction blockbuster, but the futuristic technology could become a reality within 15 years, according to Bryan Johnson, founder of the startup Kernel, which is working on such a device. The chips will allow people to buy and delete memories, and will soon be as popular as smartphones, Mr Johnson claims. Kernel is currently working on prototypes of a brain implant device for medical use in humans.
Scientists have developed a revolutionary device that lets paralyzed people use their thoughts to perform tasks they never thought they'd be able to do again. A brain-computer interface called BrainGate has demonstrated how it can help paralyzed people in the past, but new research shows that the device can be hooked up to a basic tablet to send text messages, check the weather forecast and even play a digital piano. To use the tablet, volunteers with severe paralysis had a chip the size of a baby aspirin implanted into the brain's motor cortex, the region responsible for the planning, control and execution of voluntary movements.
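The article does not describe BrainGate's decoding algorithm, but a common approach in the brain-computer interface literature is a linear decoder that maps neural firing rates recorded by the implanted electrodes to intended cursor velocity. The sketch below is illustrative only, with simulated data; the channel count, bin size, and decoder choice are assumptions, not details from the study.

```python
import numpy as np

# Illustrative sketch: fit a linear decoder from simulated multi-electrode
# firing rates to 2-D cursor velocities, then decode a new time bin.
# All numbers here are assumptions for the demo, not BrainGate's actual setup.

rng = np.random.default_rng(0)
n_channels, n_bins = 96, 500  # a 96-channel array, 500 training time bins

# Simulated training data: spike counts per bin, paired with the cursor
# velocities the participant intended during calibration.
firing_rates = rng.poisson(lam=5.0, size=(n_bins, n_channels)).astype(float)
true_weights = rng.normal(size=(n_channels, 2))
velocities = firing_rates @ true_weights + rng.normal(scale=0.5, size=(n_bins, 2))

# Fit the decoder with ordinary least squares (real systems typically use
# regularized or Kalman-filter decoders).
weights, *_ = np.linalg.lstsq(firing_rates, velocities, rcond=None)

# At run time, each new bin of firing rates is mapped to a cursor velocity,
# which drives the pointer on the tablet.
new_bin = rng.poisson(lam=5.0, size=(1, n_channels)).astype(float)
vx, vy = (new_bin @ weights)[0]
print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")
```

Once the cursor can be steered this way, the tablet itself needs no modification: the decoded velocities are simply fed to it as standard pointer input.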
Researchers at MIT's Computer Science and Artificial Intelligence Laboratory have created a device that can read human emotions using wireless signals. The EQ-Radio reads subtle changes in breathing and heart rhythms to figure out if a person is happy, excited, angry or sad. The device measures heartbeats like an ECG monitor with a margin of error of 0.3 percent. It then analyzes the waveforms within each heartbeat to determine the person's emotion. "Our work shows that wireless signals can capture information about human behavior that is not always visible to the naked eye," project lead Dina Katabi, who co-wrote a paper on the topic with PhD students Mingmin Zhao and Fadel Adib, said in a statement Tuesday.
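The pipeline the article describes, recovering individual heartbeats from the reflected wireless signal and then classifying the person's emotion, can be caricatured with a toy rule-based classifier over inter-beat intervals. The thresholds and the mapping below (faster-than-baseline heartbeat as high arousal, higher beat-to-beat variability as positive valence) are invented for illustration; EQ-Radio actually trains a machine-learning classifier on richer heartbeat-waveform and breathing features.

```python
from statistics import mean, pstdev

def classify_emotion(ibis, baseline_ibi):
    """Toy valence/arousal classifier over inter-beat intervals (seconds).

    Assumptions for the demo (not from the EQ-Radio paper): a heartbeat
    faster than the person's baseline signals high arousal, and higher
    beat-to-beat variability signals positive valence. The four labels
    are the ones the article reports.
    """
    arousal_high = mean(ibis) < baseline_ibi   # heart beating faster than usual
    valence_pos = pstdev(ibis) > 0.03          # noticeable beat-to-beat variation

    if arousal_high and valence_pos:
        return "excited"
    if arousal_high:
        return "angry"
    if valence_pos:
        return "happy"
    return "sad"

# Example: intervals extracted from the reflected signal while at rest --
# slower than baseline and varied, so the toy rules label it "happy".
resting = [0.84, 0.95, 0.88, 0.92, 0.86]
print(classify_emotion(resting, baseline_ibi=0.80))  # -> happy
```

The hard part of the real system is upstream of this step: isolating each heartbeat from the radio reflection precisely enough (the reported 0.3 percent margin of error) that the waveform within each beat can serve as a feature.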