Google on Thursday began rolling out new Android capabilities designed to help users with speech and motor impairments navigate their devices and communicate with others. The new features were among about a dozen updates to the mobile operating system announced that day. The new tools, called Camera Switches and Project Activate, use an Android phone's front-facing camera and machine learning to detect face and eye gestures. They effectively turn the front-facing camera into a switch -- an adaptive tool that replaces a keyboard, mouse, or touchscreen. Camera Switches is a feature within the Android Accessibility Suite that lets users navigate their phone with eye movements and facial gestures.
[Image: Screenshots of Lookout's modes, including "Explore," "Shopping," and "Quick read," and of Lookout detecting a dog in the camera frame.] Google has launched its Lookout app, which uses artificial intelligence (AI) to help visually impaired users by letting them point their phone at objects and receive verbal feedback. Lookout uses similar underlying technology to Google Lens, Google said in a blog post, providing spoken feedback, earcons, and other continuous signals to the user. It also functions in the same way as Lens -- receiving information and providing feedback based on what is captured by the device's rear camera. Google said the app assists users in situations such as exploring a new space for the first time, reading text and documents, and completing daily routines such as cooking, cleaning, and shopping.
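The flow described above -- classify what the rear camera sees, then read the result aloud -- can be sketched in miniature. This is a hypothetical illustration only: `detect_objects` is a stand-in for Google's on-device vision model, and the function names and confidence threshold are invented for the example.

```python
# Hypothetical sketch of a Lookout-style loop: a camera frame is classified,
# low-confidence labels are filtered out, and the result becomes the verbal
# announcement read aloud to the user. detect_objects is a stand-in for a
# real on-device vision model, not Google's actual API.

def detect_objects(frame):
    """Stand-in for an on-device vision model; returns (label, confidence) pairs."""
    # A real app would run an ML model over the camera frame here.
    return frame.get("labels", [])

def announce(frame, min_confidence=0.6):
    """Build the spoken feedback from labels the model is reasonably sure about."""
    labels = [label for label, conf in detect_objects(frame) if conf >= min_confidence]
    if not labels:
        return "Nothing recognized."
    return "Detected: " + ", ".join(labels)

# Example frame, mirroring the screenshot of Lookout detecting a dog:
frame = {"labels": [("dog", 0.92), ("grass", 0.40)]}
print(announce(frame))  # the low-confidence "grass" label is filtered out
```

In a shipping app the returned string would be handed to a text-to-speech engine rather than printed, with earcons as a faster non-verbal cue.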
Google launched its Lookout app in 2019 to help people who are blind or have low vision navigate the world with their phones, but it was only available on Pixel phones with the language set to English. Today, the company is rolling out an update that not only adds French, Italian, German, and Spanish to the list of supported languages, but also brings two new modes, a more accessible design, and broader Android compatibility. The first of the new modes is Food Label, which helps users identify packaged foods by pointing their cameras at the label. Lookout will guide you to position the product so it can be recognized by its packaging or barcode. According to Scott Adams, product manager for Google's Accessibility Engineering, this lets Lookout "distinguish between a can of corn and a can of green beans," for example. The other mode is Scan Document, which, as its name suggests, can take a snapshot of a letter or other document and read it aloud.
Google is testing a new accessibility feature on Android phones that lets users control functions with facial gestures instead of touching the screen. Part of the Android Accessibility Suite app, it allows users to connect an external device and control a number of functions, like 'scroll forward,' 'scroll back,' 'home,' and 'notifications.' The update is intended to help people with disabilities and mobility issues that may make it harder for them to use a regular touchscreen device. Users' gestures are scanned by the phone's camera and assigned to an action of their choice. Right now the list of usable facial gestures includes 'Open Mouth,' 'Smile,' 'Raise Eyebrows,' 'Look Left,' 'Look Right,' and 'Look Up.'
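The gesture-to-action assignment described above is essentially a user-configurable dispatch table. The sketch below is a hypothetical illustration of that idea: the gesture and action names come from the article, but the `CameraSwitchConfig` class and its methods are invented for the example and are not Google's actual implementation.

```python
# Hypothetical sketch of a gesture-to-action dispatch table, modeled on the
# mapping the article describes. Gesture and action names are from the article;
# the class and method names are illustrative only.

ACTIONS = {"scroll forward", "scroll back", "home", "notifications"}
GESTURES = {"Open Mouth", "Smile", "Raise Eyebrows",
            "Look Left", "Look Right", "Look Up"}

class CameraSwitchConfig:
    """Maps detected facial gestures to navigation actions of the user's choice."""

    def __init__(self):
        self._bindings = {}  # gesture name -> action name

    def assign_gesture(self, gesture, action):
        """Bind a supported gesture to a supported action."""
        if gesture not in GESTURES:
            raise ValueError(f"Unknown gesture: {gesture}")
        if action not in ACTIONS:
            raise ValueError(f"Unknown action: {action}")
        self._bindings[gesture] = action

    def dispatch(self, detected_gesture):
        """Return the action bound to a detected gesture, or None if unbound."""
        return self._bindings.get(detected_gesture)

# Example: one way a user might configure their switches.
config = CameraSwitchConfig()
config.assign_gesture("Look Left", "scroll back")
config.assign_gesture("Look Right", "scroll forward")
config.assign_gesture("Open Mouth", "home")
```

Keeping the mapping in a plain table like this is what makes the feature adaptable: the camera-based gesture detector and any external switch hardware can feed the same dispatch step.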
The imperative to improve smartphone use for people with limited motor capabilities has resulted in some truly cool -- and hopefully helpful -- new features. On Thursday, Google announced an expansion of its accessibility settings as well as a new app that lets people navigate their phones with facial gestures. The feature within the Android Accessibility Suite is called Camera Switches. Previously, Google let users who could not navigate their phones via touchscreen connect a manual switch device to scroll and select. Now, the new "switch" is an Android phone's camera and the user's face.