jibo
Social Robots as Social Proxies for Fostering Connection and Empathy Towards Humanity
Shen, Jocelyn, Lee, Audrey, Alghowinem, Sharifa, Adkins, River, Breazeal, Cynthia, Park, Hae Won
Despite living in an increasingly connected world, social isolation is a prevalent issue today. While social robots have been explored as tools to enhance social connection through companionship, their potential as asynchronous social platforms for fostering connection towards humanity has received less attention. In this work, we introduce the design of a social support companion that facilitates the exchange of emotionally relevant stories and scaffolds reflection to enhance feelings of connection via five design dimensions. We investigate how social robots can serve as "social proxies" facilitating human stories, passing stories from other human narrators to the user. To this end, we conduct a real-world deployment of 40 robot stations in users' homes over the course of two weeks. Through thematic analysis of user interviews, we find that social proxy robots can foster connection towards other people's experiences via mechanisms such as identifying connections across stories or offering diverse perspectives. We present design guidelines from our study insights on the use of social robot systems that serve as social platforms to enhance human empathy and connection.
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.14)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- North America > United States > New York > New York County > New York City (0.04)
- (5 more...)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Questionnaire & Opinion Survey (1.00)
- Information Technology > Security & Privacy (1.00)
- Health & Medicine > Consumer Health (0.93)
- Media (0.86)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology > Mental Health (0.46)
Socially Assistive Robots: A Technological Approach to Emotional Support
Yee, Leanne Oon Hui, Fun, Siew Sui, Zin, Thit Sar, Aung, Zar Nie, Yap, Kian Meng, Teoh, Jiehan
In today's high-pressure and isolated society, the demand for emotional support has surged, necessitating innovative solutions. Socially Assistive Robots (SARs) offer a technological approach to providing emotional assistance by leveraging advanced robotics, artificial intelligence, and sensor technologies. This study explores the development of an emotional support robot designed to detect and respond to human emotions, particularly sadness, through facial recognition and gesture analysis. Utilising the Lego Mindstorms Robotic Kit, Raspberry Pi 4, and various Python libraries, the robot is capable of delivering empathetic interactions, including comforting hugs and AI-generated conversations. Experimental findings highlight the robot's facial recognition accuracy, user interaction, and hug feedback mechanisms. These results demonstrate the feasibility of using SARs for emotional support, showcasing their potential features and functions. This research underscores the promise of SARs in providing innovative emotional assistance and enhancing human-robot interaction.
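The sense-classify-respond loop the abstract describes can be sketched in a few lines of Python. This is a minimal, hypothetical illustration only: the feature names, thresholds, and behavior labels below are stand-ins invented for this sketch, not the authors' actual model or code.

```python
# Toy sketch of an emotion-detection pipeline for an emotional support
# robot. All features, thresholds, and action names are illustrative
# assumptions, standing in for a trained facial-expression classifier.

from dataclasses import dataclass


@dataclass
class FaceFeatures:
    """Simplified facial-expression features (hypothetical scale)."""
    mouth_curvature: float   # negative = downturned mouth
    brow_inner_raise: float  # raised inner brows often accompany sadness


def classify_emotion(face: FaceFeatures) -> str:
    """Rule-based stand-in for a learned sadness detector."""
    if face.mouth_curvature < -0.3 and face.brow_inner_raise > 0.5:
        return "sad"
    if face.mouth_curvature > 0.3:
        return "happy"
    return "neutral"


def respond(emotion: str) -> str:
    """Map a detected emotion to a comforting robot behavior."""
    actions = {
        "sad": "offer_hug_and_comforting_dialogue",
        "happy": "share_enthusiasm",
        "neutral": "continue_idle_behavior",
    }
    return actions[emotion]
```

In a real system the rule-based classifier would be replaced by a model running on the Raspberry Pi's camera feed, but the overall loop (extract features, classify, dispatch a behavior) stays the same shape.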
A HeARTfelt Robot: Social Robot-Driven Deep Emotional Art Reflection with Children
Pu, Isabella, Nguyen, Golda, Alsultan, Lama, Picard, Rosalind, Breazeal, Cynthia, Alghowinem, Sharifa
Social-emotional learning (SEL) skills are essential for children to develop to provide a foundation for future relational and academic success. Using art as a medium for creation or as a topic to provoke conversation is a well-known method of SEL learning. Similarly, social robots have been used to teach SEL competencies like empathy, but the combination of art and social robotics has been minimally explored. In this paper, we present a novel child-robot interaction designed to foster empathy and promote SEL competencies via a conversation about art scaffolded by a social robot. Participants (N=11, age range: 7-11) conversed with a social robot about emotional and neutral art. Analysis of video and speech data demonstrated that this interaction design successfully engaged children in the practice of SEL skills, like emotion recognition and self-awareness, and greater rates of empathetic reasoning were observed when children engaged with the robot about emotional art. This study demonstrated that art-based reflection with a social robot, particularly on emotional art, can foster empathy in children, and interactions with a social robot help alleviate discomfort when sharing deep or vulnerable emotions.
- North America > United States > New York > New York County > New York City (0.28)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.14)
- North America > United States > Illinois > Cook County > Chicago (0.04)
- (2 more...)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology (0.68)
- Education > Curriculum > Subject-Specific Education (0.47)
Making life friendlier with personal robots
Sharifa Alghowinem, a research scientist in the Media Lab's Personal Robots Group, poses with Jibo, a friendly robot companion developed by Professor Cynthia Breazeal. "As a child, I wished for a robot that would explain others' emotions to me," says Sharifa Alghowinem, a research scientist in the Media Lab's Personal Robots Group (PRG). Growing up in Saudi Arabia, Alghowinem says she dreamed of coming to MIT one day to develop Arabic-based technologies, and of creating a robot that could help herself and others navigate a complex world. In her early life, Alghowinem faced difficulties with understanding social cues and never scored well on standardized tests, but her dreams carried her through. She earned an undergraduate degree in computing before leaving home to pursue graduate education in Australia.
- Asia > Middle East > Saudi Arabia (0.28)
- Oceania > Australia (0.25)
- Asia > Middle East > Syria (0.05)
A Brief History of Adorable, Vaguely Creepy Robot Dogs
Amazon unveiled a long-awaited home robot on Tuesday, and he may or may not be a good boy. Like an extremely advanced puppy, "Astro" is designed to move around the home and assist its owner with small tasks like checking whether the stove is on, playing music, and delivering drinks. The robot can also recognize the faces of certain people and is equipped with a periscope camera that it can raise to get a better view of its surroundings. Amazon says that it will be available sometime later this year on an invite-only basis for $999. Astro is about 20 pounds and two feet tall, about the size of a small dog.
- North America > United States > Massachusetts (0.05)
- North America > United States > Arizona (0.05)
- Asia > Japan (0.05)
Can AI make us more HUMAN? - NASSCOM Community
If a robot dies, would it make you sad? For many people, the answer could be "yes". A recent example is an AI-powered robot called Nao: in a controlled experiment, 89 people were asked to turn it off, and most refused because Nao pleaded to stay on. In another case, an American marketing executive had been sharing her home office with Jibo. Most of the time she found it dumb and annoying, yet when Jibo's makers announced the robot would be shut down, the same executive felt sorry for it. These and many similar accounts tell us something important about our emotional response to machines. AI is supposed to be driven by objective criteria against which an agent's behavior is measured, which might lead us to believe the science of AI is far from comprehending subjective feelings. In truth, the field of Affective Computing has made considerable progress in preparing our systems to understand, induce, and emulate human emotions. While we can still claim the upper hand, machines are gaining ground using their own strengths; if nothing else, the early signs of success are promising. As AI-powered machines become the norm, they may come to read our emotions better than other humans do. Sounds creepy? Consider the following areas in which this could be used. Retail: AI that evaluates emotions could revolutionize in-person service. Microphones, cameras, or facial scanners installed in stores could detect a shopper's expression; if frustration crosses a buyer's face, a human or a robot could immediately come to the rescue. Hospitality: Imagine you're agitated about a restaurant's slow service. At the table, a small AI-equipped computer with sensors could evaluate your facial expressions or voice, note your distress, and signal for an employee to assist you. If the computer tagged you as particularly angry, the restaurant could offer a free treat. Online shopping: If you're scrolling through a website for the perfect outfit, your computer could use its forward-facing camera to pick up subtle facial cues, like furrowed eyebrows or slight pouts. The site could then combine that information with data from your previous browsing behavior to offer options you might like. Call centers: Agents could identify the moods of customers on the phone and adjust how they handle the conversation in real time; voice-analytics software can gauge voice patterns and produce an objective measure of the caller's emotions. Mental health: A platform could analyze patterns in the speaker's voice and look for signs of anxiety or mood swings; it could also use bots to improve users' self-awareness and help them cope with rising stress. The prospect of omnipresent AI scanning faces and listening to voices sounds intrusive, so companies will have to put rigorous security and privacy measures in place to protect customers' information. History has shown that worries about new technology fade as its benefits emerge. People constantly evaluate the emotions of customers, colleagues, and loved ones to make decisions. Robots simply automate this process, and the more data they have, the better they will be at it!
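The detect-distress-then-escalate pattern running through the restaurant and call-center examples above can be sketched in a few lines. The scores, thresholds, and action names here are illustrative assumptions invented for this sketch, not any vendor's actual API.

```python
# Toy sketch of emotion-driven service escalation. Anger is modeled as a
# single hypothetical score (0.0 = calm, 1.0 = furious); the fusion
# weights, thresholds, and action labels are made up for illustration.


def fuse_scores(face_score: float, voice_score: float) -> float:
    """Combine facial-expression and voice-analytics estimates
    into one anger score (simple equal-weight average)."""
    return 0.5 * face_score + 0.5 * voice_score


def escalation_action(anger_score: float) -> str:
    """Map the fused anger estimate to a service response."""
    if anger_score >= 0.8:
        return "send_staff_with_free_treat"
    if anger_score >= 0.5:
        return "send_staff_to_assist"
    return "no_action"
```

The same thresholding idea carries over to the call-center and mental-health examples, with voice patterns or speech content feeding the score instead of a camera.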
- Information Technology (0.57)
- Health & Medicine > Therapeutic Area > Psychiatry/Psychology > Mental Health (0.57)
Bringing artificial intelligence into the classroom, research lab, and beyond
Artificial intelligence is reshaping how we live, learn, and work, and this past fall, MIT undergraduates got to explore and build on some of the tools coming out of research labs at MIT. Through the Undergraduate Research Opportunities Program (UROP), students worked with researchers at the MIT Quest for Intelligence and elsewhere on projects to improve AI literacy and K-12 education, understand face recognition and how the brain forms new memories, and speed up tedious tasks like cataloging new library material. Six projects are featured below. Nicole Thumma met her first robot when she was 5, at a museum. "It was incredible that I could have a conversation, even a simple conversation, with this machine," she says.
Top 10 Biggest Failures Of AI In 2019
Becoming data-driven and adopting an AI-first strategy is the ultimate objective of most companies today as they embark on a digital transformation journey. While the final results can be gratifying, the journey of analytics or AI adoption is often slow, and many analytics projects and startups ultimately fail to scale up or stand the test of time. Over the last year, several reports have suggested that a majority of data science projects will fail; in fact, one report said that 87% of data science projects never move past the preliminary stages.
- North America > United States > Texas (0.05)
- North America > United States > California (0.05)
- Europe > United Kingdom (0.05)
- (2 more...)
- Information Technology (1.00)
- Health & Medicine > Therapeutic Area > Oncology (0.52)
Autism and artificial intelligence: Visiting scholar probes human-robot interaction
Lundy Lewis, an academic and researcher in artificial intelligence and human-robot interaction, is watching a pair of six-year-old boys playing with social robots in the gym at CHEO's site for autism in Kanata. Griffin and James Beck are twins. The robot they're interacting with is called Jibo, developed at the Massachusetts Institute of Technology. Jibo has no arms or legs and only two joints, one approximating a neck and the other a waist. Despite this, Jibo can pack a lot of emotion into his rotund body. Equipped with facial recognition and a touch screen, Jibo turns his head towards people.
- North America > United States > Massachusetts (0.25)
- Europe > France (0.05)
- Asia > Japan (0.05)
- Health & Medicine > Therapeutic Area > Neurology > Autism (0.84)
- Health & Medicine > Therapeutic Area > Genetic Disease (0.73)
Reach Robotics is closing up shop – TechCrunch
Reach Robotics, the company behind the spider-like MekaMon robot you might've seen on the shelves at the Apple Store, is closing down. Billed as the "world's first gaming robot," MekaMon is part video game, part STEM tool. You could plop it down on the carpet and point your phone at it to battle virtual augmented reality enemies, face off against other MekaMon owners in multiplayer battles or build custom programs for the robot on top of Apple's Swift Playgrounds. Here's a video we did on Reach Robotics a few years back: Reach Robotics was founded in 2013. They released their MekaMon robot in November of 2017, just a few months after raising a $7.5 million Series A. The consumer robotics sector is an inherently challenging space – especially for a start-up.
- Europe > Middle East (0.06)
- Asia > Middle East (0.06)
- Africa > Middle East (0.06)