Oxford University's lip-reading AI is more accurate than humans, but still has a way to go

#artificialintelligence 

Even professional lip-readers can figure out only 20% to 60% of what a person is saying. The slight movements of a person's lips at the speed of natural speech are immensely difficult to interpret reliably, especially from a distance or when the lips are partially obscured. And lip-reading isn't just a plot point in NCIS: it's an essential tool that helps the hearing-impaired understand the world, and, if automated reliably, it could help millions of people. A new paper (pdf) from the University of Oxford (with funding from Alphabet's DeepMind) details an artificial intelligence system, called LipNet, that watches video of a person speaking and matches text to the movement of their mouth with 93.4% accuracy. The previous state-of-the-art system operated word by word and achieved an accuracy of 79.6%.
