More than 20% of U.S. adults live with some form of disability, according to a September 2015 report released by the U.S. Centers for Disease Control and Prevention. The latest generation of smartphones, tablets, and personal computers is equipped with accessibility features that make using these devices easier, or at least less onerous, for those who have sight, speech, or hearing impairments. These enhancements include functions such as screen-reading technology (which reads text aloud when the user passes a finger over it); screen-flashing notifications that alert hearing-impaired users to incoming calls or messages; and voice control of basic functions for those who are unable to physically manipulate the phone or computing device's controls. Other technologies that can help people with disabilities are already on the market or on their way to it, and not all of them are focused simply on providing access to computers or smartphones. Whatever accessibility has been achieved so far, most market participants agree more needs to be done to help those with disabilities fully experience our increasingly digital world.
"Connected to other part," my iPhone says to me as I stand somewhere in London's Soho, trying to decipher the letter on the top of a bus stop. "Hello?" says an American woman, reminding me of Scarlett Johansson's disembodied artificially intelligent character from the sci-fi film Her. "Hey, er … can you give me a hand by reading the letter on the bus stop?" "Sure … can you move your phone a bit more up, and to the left … Ya!" I thank her, end the session, pull up Citymapper and navigate my way onto the 453 going to New Cross. I have a little bit of vision, but only enough to see motion and movement.
If the measure of progress in technology is that devices should become ever smaller and more capable, then OrCam Technologies is on a roll. The Israeli firm's OrCam MyEye, which fits on the arm of a pair of glasses, is far more powerful and much smaller than its predecessor. With new AI-based Smart Reading software released in July, the device not only "reads" text and labels but also identifies people by name and describes other important aspects of the visual world. It also interacts with its users, principally people who are blind or visually impaired, by means of an AI-based smart voice assistant. At the upcoming Sight Tech Global virtual event, we're pleased to announce that OrCam's co-founder and co-CEO, Professor Amnon Shashua, will be a featured speaker.
Mobile devices have become incredibly popular for their ability to weave modern conveniences such as Internet access and social networking into the fabric of daily life. For people with disabilities, however, these devices have the potential to unlock unprecedented new possibilities for communication, navigation and independence. This emergence of mobile "assistive" technologies, influenced heavily by the passage of the Americans with Disabilities Act (ADA) 25 years ago, marks a major step forward for people with disabilities. The U.S. Congress passed the ADA in July 1990 as a civil rights law to protect people with disabilities from discrimination. The act requires that businesses, schools and government agencies follow certain requirements to ensure people have equal access to their services and facilities.
For blind and visually impaired people like me, accessibility is the difference between being able to use a website and clicking off it. Screen readers allow blind and visually impaired people to use computers, phones and tablets independently. A screen reader is software paired with a Text To Speech (TTS) engine, which converts the text the screen reader captures into spoken audio. In this way, screen readers turn everything displayed on screen into a format that blind users can process: they read the content aloud and let people navigate it using touch gestures and shortcut keys.
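The screen reader pipeline described above — walk the on-screen content in order, then hand each piece of text to a TTS engine — can be sketched in simplified form. This is a hypothetical illustration only: the `UINode` tree and the `speak` callback are stand-ins for a real platform's accessibility API and TTS engine, not any actual library.

```python
# Minimal, hypothetical sketch of a screen reader's core loop:
# walk an accessibility tree depth-first, collect readable text,
# and hand each piece to a TTS engine. UINode and speak() are
# illustrative stand-ins, not a real platform accessibility API.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class UINode:
    role: str                       # e.g. "heading", "button", "text"
    text: str = ""                  # visible label or alt text
    children: List["UINode"] = field(default_factory=list)

def read_screen(root: UINode, speak: Callable[[str], None]) -> None:
    """Announce each node's role and text in document order,
    the way a screen reader moves through a page or app."""
    if root.text:
        speak(f"{root.role}: {root.text}")
    for child in root.children:
        read_screen(child, speak)

# Usage: a fake screen with a heading and a button.
screen = UINode("window", children=[
    UINode("heading", "Welcome"),
    UINode("button", "Sign in"),
])

utterances: List[str] = []
read_screen(screen, utterances.append)  # a real TTS engine would speak here
print(utterances)  # ['heading: Welcome', 'button: Sign in']
```

In a real screen reader the `speak` callback would be a TTS engine, and the tree would come from the operating system's accessibility layer; collecting the utterances in a list here simply makes the traversal order visible.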