Jason Cipriani is based out of beautiful Colorado and has been covering technology news and reviewing the latest gadgets as a freelance journalist for the past 13 years. His work can be found all across the Internet and in print.

We're only a few weeks away from the opening keynote of Apple's annual Worldwide Developers Conference, where we'll see the company preview upcoming software features and changes for the iPhone and the rest of its hardware lineup. On Tuesday, however, Apple gave us an early look at several new accessibility features the company says will arrive on the iPhone, iPad and Apple Watch later this year, presumably in iOS 16 and iPadOS 16. For users who are blind or have low vision, a new Door Detection feature uses an iPhone or iPad's LiDAR Scanner and rear camera to identify a doorway, telling the user how far away it is and describing anything they need to know about the door, such as whether it's open or closed, whether it opens in or out, and what kind of handle it has.
Global Accessibility Awareness Day is this Thursday (May 19th), and Apple, like many other companies, is announcing assistive updates in honor of the occasion. The company is bringing new features to the iPhone, iPad, Mac and Apple Watch, and the most intriguing of the lot is systemwide Live Captions. Similar to Google's implementation on Android, Apple's Live Captions will transcribe audio playing on your iPhone, iPad or Mac in real time, displaying subtitles onscreen. It will also caption sound around you, so you can use it to follow along with conversations in the real world. You'll be able to adjust the size and position of the caption box, as well as choose different font sizes. The transcription is generated on-device, too.
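Apple hasn't published how Live Captions works internally, but its public Speech framework already supports the same kind of real-time, on-device transcription the feature describes. As a hedged sketch (not Apple's actual Live Captions code), a developer could caption microphone audio like this, assuming a LiDAR-era device with on-device recognition support:

```swift
import Speech
import AVFoundation

// Illustrative sketch only: live, on-device speech transcription using the
// public Speech framework. Live Captions itself is a system feature; this
// shows the underlying capability, not Apple's implementation.
func startCaptioning() throws {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true  // keep audio off the network
    request.shouldReportPartialResults = true   // word-by-word live updates

    // Feed microphone buffers into the recognition request.
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        request.append(buffer)
    }
    try engine.start()

    recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            // This string would be rendered as the onscreen caption.
            print(result.bestTranscription.formattedString)
        }
    }
}
```

Setting `requiresOnDeviceRecognition` mirrors the privacy point in the article: nothing leaves the device, at the cost of slightly lower accuracy than server-side recognition.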
Apple is putting the LiDAR sensor on its iPhone 12 Pro and iPhone 12 Pro Max to good use. In the latest public beta of its mobile platform, iOS 14.2, the company introduced a new Accessibility feature called People Detection. It's part of the iPhone's Magnifier tool, which can be turned on under Settings > Accessibility > Magnifier, after which it shows up as a new app in the phone's App Library. The People Detection feature, which is activated by tapping the rightmost icon in Magnifier's UI, uses LiDAR and the phone's camera to detect people in the camera's field of view. The feature is primarily meant as an aid for the visually impaired, giving them a way to assess whether people are nearby and notifying them when someone moves closer.
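The raw ingredient here, per-pixel distance from the LiDAR scanner, is exposed to developers through ARKit. As an illustrative sketch (not Apple's People Detection implementation), the following reads the depth in meters at the center of each camera frame on a LiDAR-equipped device:

```swift
import ARKit

// Illustrative sketch only: sampling LiDAR depth via ARKit's sceneDepth,
// the same sensor data a feature like People Detection builds on.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth requires a LiDAR-equipped device (e.g. iPhone 12 Pro).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        // The depth map is a buffer of Float32 distances in meters;
        // sample the pixel at the center of the frame.
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }
        let row = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let centerDepth = row[width / 2]
        print("Nearest surface at frame center: \(centerDepth) m")
    }
}
```

A real assistive feature would combine depth like this with person detection on the camera image to report "person, 2 meters away"; the sketch shows only the distance half of that pairing.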
US tech giants Amazon and Apple have announced new accessibility features for their technology aimed at helping people with impaired vision. Amazon's new feature, called Show and Tell, helps blind and partially sighted people identify common household grocery items. The feature, which launches in the UK today, works with Amazon's Echo Show range – devices that combine a camera and a screen with a smart speaker powered by its digital assistant Alexa. Apple, meanwhile, has redesigned its dedicated accessibility site to make it easier for iPhone and iPad owners to find vision, hearing and mobility tools for everyday life. These include People Detection, which uses the iPhone's built-in LiDAR scanner to prevent blind users from colliding with other people or objects.
Apple's products have a wide variety of accessibility features, which help people with disabilities personalize those products to better suit their needs. On Thursday, December 3, the International Day of Persons with Disabilities, the company launched a redesigned Accessibility hub detailing some of these features in products such as the iPhone, Apple Watch, Mac and iPad. The site is divided into four sections: Vision, Mobility, Hearing, and Cognitive, each with several examples of features that may be useful to people with disabilities. The new hub highlights features such as Magnifier, Zoom, Voice Control and more. Each feature page lists all the devices where that feature is available and provides direct links to set it up on each one.