Language Learning

This shuttle bus will serve people with vision, hearing, and physical impairments, and drive itself


It's been 15 years since a degenerative eye disease forced Erich Manser to stop driving. Today, he commutes to his job as an accessibility consultant via commuter trains and city buses, but he sometimes has trouble locating empty seats and must ask strangers for guidance. A step toward solving Manser's predicament could arrive as soon as next year. Manser's employer, IBM, and an independent carmaker called Local Motors are developing a self-driving, electric shuttle bus that combines artificial intelligence, augmented reality, and smartphone apps to serve people with vision, hearing, physical, and cognitive disabilities. The buses, dubbed "Olli," are designed to transport people around neighborhoods at speeds below 35 miles per hour and will be sold to cities, counties, airports, companies, and universities.

'SignAloud' gloves translate sign language movements into spoken English

Daily Mail

For people living in a world without sound, sign language can make sure their points of view are heard. But outside of the deaf and hard-of-hearing communities, this gesture-based language can lose its meaning. Now a pair of entrepreneurial technology students in the US has designed a pair of gloves, called 'SignAloud', to break down the communication barriers by translating hand gestures into speech. The gloves use embedded sensors to monitor the position and movement of the user's hands, while a central computer analyses the data and converts the gestures to spoken English.
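
The article only outlines the SignAloud data flow (glove sensors feed a central computer, which matches the readings to a known gesture and voices the result). The following is a minimal, hypothetical Python sketch of that kind of pipeline, not the students' actual code: the sensor layout, the template values, and the nearest-neighbour matching are all illustrative assumptions, and the speech step is a simple stand-in.

```python
import math

# Hypothetical: each glove frame is a flat feature vector of sensor readings
# (e.g. flex values for five fingers plus a few motion-sensor axes), normalised to [0, 1].
GESTURE_TEMPLATES = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1, 0.5, 0.2, 0.0],
    "thank you": [0.9, 0.8, 0.9, 0.8, 0.9, 0.1, 0.4, 0.2],
    "yes":       [0.9, 0.9, 0.9, 0.9, 0.9, 0.0, 0.0, 0.1],
}


def euclidean(a, b):
    """Distance between two sensor feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def classify(frame, threshold=0.6):
    """Return the closest known gesture, or None if nothing is near enough."""
    label, dist = min(
        ((name, euclidean(frame, template)) for name, template in GESTURE_TEMPLATES.items()),
        key=lambda pair: pair[1],
    )
    return label if dist <= threshold else None


def speak(text):
    """Stand-in for text-to-speech; a real system would hand `text` to a speech synthesiser."""
    print(f"[speaking] {text}")


if __name__ == "__main__":
    # Simulated frame streamed from the gloves (assumption about the transport and format).
    incoming_frame = [0.12, 0.09, 0.11, 0.10, 0.08, 0.48, 0.22, 0.02]
    word = classify(incoming_frame)
    if word is not None:
        speak(word)
```

In practice a system like this would replace the fixed templates with a trained classifier and feed continuous sensor streams rather than single frames, but the shape of the pipeline, read sensors, classify the gesture, then synthesise speech, stays the same.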