Machine learning enhances non-verbal communication in online classrooms
June 21, 2021--Researchers in the Center for Research on Entertainment and Learning (CREL) at the University of California San Diego have developed a system that analyzes and tracks eye movements to enhance teaching in tomorrow's virtual classrooms, and perhaps future virtual concert halls.

UC San Diego music and computer science professor Shlomo Dubnov, an expert in computer music who directs the Qualcomm Institute-based CREL, began developing the new tool to address a downside of teaching music over Zoom during the COVID-19 pandemic.

"In a music classroom, non-verbal communication such as facial affect and body gestures is critical to keep students on task, coordinate musical flow and communicate improvisational ideas," said Dubnov. "Unfortunately, this non-verbal aspect of teaching and learning is dramatically hampered in the virtual classroom, where you don't inhabit the same physical space."

To overcome the problem, Dubnov and Ph.D. student Ross Greer recently published a conference paper on a system that uses eye tracking and machine learning to let an educator make 'eye contact' with individual students or performers in disparate locations, and to let each student know when he or she is the focus of the teacher's attention.
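The core idea of detecting who the teacher is looking at can be illustrated with a minimal sketch. The snippet below maps an eye tracker's estimated gaze point (normalized screen coordinates) to a student tile in a Zoom-style video grid; the grid layout, function name, and student names are illustrative assumptions for this article, not the method from the published paper.

```python
# Hypothetical sketch: classify which student tile a gaze point falls in.
# Assumes the eye tracker reports gaze normalized to [0, 1] across the
# screen, and that student videos are laid out in a row-major grid.

def gaze_to_student(gaze_x, gaze_y, students, rows, cols):
    """Return the student whose video tile contains the gaze point.

    gaze_x, gaze_y: gaze position normalized to [0, 1] over the screen.
    students: names in row-major tile order.
    rows, cols: dimensions of the video grid.
    """
    # Clamp to the last row/column so gaze at the screen edge still maps
    # to a valid tile.
    col = min(int(gaze_x * cols), cols - 1)
    row = min(int(gaze_y * rows), rows - 1)
    idx = row * cols + col
    return students[idx] if idx < len(students) else None

# Example: a 2x2 grid of four students; a gaze point in the lower-right
# quadrant selects the fourth tile.
students = ["Ada", "Ben", "Cara", "Dev"]
print(gaze_to_student(0.8, 0.7, students, rows=2, cols=2))  # prints "Dev"
```

In a full system, the gaze estimate itself would come from a machine-learning model trained on webcam video, and the result would be sent to the selected student's client so their interface can signal that they hold the teacher's attention.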