AI that understands speech by looking as well as hearing


People use AI for a wide range of speech recognition and understanding tasks, from enabling smart speakers to developing tools for people who are hard of hearing or who have speech impairments. But often these speech understanding systems don't work well in the everyday situations where we need them most: when multiple people are speaking simultaneously or when there's lots of background noise. Even sophisticated noise-suppression techniques are often no match for, say, the sound of the ocean during a family beach trip or the background chatter of a bustling street market. One reason people can understand speech better than AI in these situations is that we use not just our ears but also our eyes. We might see someone's mouth moving and intuitively know that the voice we're hearing must be coming from her, for example.