AI-equipped backpack designed for the blind 'speaks' to provide users with directions

Daily Mail - Science & tech 

Researchers at the University of Georgia have developed a wearable AI system that can help visually impaired people navigate the world around them. Users receive audio directions and advisories through a Bluetooth-enabled earphone, while a battery in a fanny pack provides about eight hours of power.

Intel, which supplied the processing power for the prototype device, says it is superior to other high-tech visual-assistance programs, which 'lack the depth perception necessary to facilitate independent navigation.'

Jagadish Mahendran, an AI developer at the University of Georgia's Institute for Artificial Intelligence, was inspired to create the system by a visually impaired friend. 'I was struck by the irony that, while I have been teaching robots to see, there are many people who cannot see and need help,' he said.

Using OpenCV's Artificial Intelligence Kit, he developed a program that runs on a laptop small enough to stow in a backpack, linked to Luxonis OAK-D spatial AI cameras in a vest jacket that provide obstacle and depth information.
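To illustrate the kind of processing described above, here is a minimal, purely hypothetical sketch of how depth data from a stereo camera such as the OAK-D might be turned into a spoken-style advisory. The function name, threshold, and millimetre depth format are assumptions for illustration, not details of Mahendran's actual system:

```python
import numpy as np

def nearest_obstacle_warning(depth_mm, threshold_mm=1500):
    """Given a depth map in millimetres (as a stereo depth camera might
    produce), return an advisory string if an obstacle is closer than the
    threshold, or None if the path is clear.

    Zero values are treated as invalid pixels (no stereo match), which is
    a common convention for stereo depth output.
    """
    valid = depth_mm[depth_mm > 0]          # discard invalid (zero) pixels
    if valid.size == 0:
        return None                          # no usable depth data this frame
    nearest = int(valid.min())               # closest valid point, in mm
    if nearest < threshold_mm:
        return f"Obstacle ahead, about {nearest / 1000:.1f} metres"
    return None

# Example: a 4x4 depth frame that is clear except for one object at 0.9 m
frame = np.full((4, 4), 3000, dtype=np.int32)
frame[2, 1] = 900
print(nearest_obstacle_warning(frame))  # → Obstacle ahead, about 0.9 metres
```

A real system would run logic like this on every camera frame and route the resulting string to a text-to-speech engine feeding the Bluetooth earphone.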