iOS 14 is filled with accessibility improvements.

From sign language in FaceTime calls to sound recognition.

Apple’s upcoming operating systems, such as iOS 14 and tvOS 14, both due later this year, include numerous features that should make them easier for people with disabilities to use. Apple announced the new features as part of its Worldwide Developers Conference this week, and Forbes and CNET have rounded many of them up.

These range from new features like sound recognition to updates to Apple’s existing accessibility tools, such as its VoiceOver screen reader. It’s a substantial list that should make Apple’s products easier to use for people with hearing, vision, or motor disabilities, among others.

Sound recognition in iOS 14, for example, will let you tell your phone to continuously listen for 14 different sounds, including doorbells, sirens, smoke detector alarms, and a crying baby. The feature could help people who are deaf or hard of hearing by alerting them to critical sounds sooner than they might otherwise notice them. (Apple warns against relying on the feature in “high-risk or emergency situations,” however.)
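Apple’s system-level sound recognition isn’t something third-party apps plug into, but the underlying idea — classifying live microphone audio against a set of known sounds — can be sketched with Apple’s SoundAnalysis framework. The sketch below assumes a hypothetical Core ML sound-classification model named MySoundClassifier (for instance, one trained with Create ML) and that microphone permission has already been granted:

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Reports any classification (e.g. "doorbell" or "siren") above a confidence threshold.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard \(top.identifier) (confidence: \(top.confidence))")
    }

    func request(_ request: SNRequest, didFailWithError error: Error) {
        print("Sound analysis failed: \(error)")
    }
}

let engine = AVAudioEngine()
let observer = SoundObserver() // Must stay alive while analysis runs.

func startListening() throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    // MySoundClassifier is a stand-in for a Core ML sound-classification model
    // (for example, one trained with Create ML); it is not an Apple-provided model.
    let model = try MySoundClassifier(configuration: MLModelConfiguration()).model
    let request = try SNClassifySoundRequest(mlModel: model)

    let analyzer = SNAudioStreamAnalyzer(format: format)
    try analyzer.add(request, withObserver: observer)

    // Stream microphone audio into the analyzer (requires microphone permission).
    input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }
    try engine.start()
}
```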

Then there’s iOS 14’s new Back Tap feature. Twitter users were quick to point out that you can use it to launch Google’s voice assistant if you’d rather not talk to Siri. But as Forbes notes, the more important aspect of this feature is that it can replace screen gestures that may be tricky for people with cognitive or motor disabilities to perform. You could tap the back of your phone to open the notification center rather than stretching your thumb to swipe down, for example, or even set up more complex actions using the Shortcuts app, as sketched below.
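Back Tap itself is configured in Settings and has no public API, but the “more complex actions” it can trigger are ordinary Shortcuts. As a rough illustration of how an app exposes one of its actions to the Shortcuts app, here’s a hypothetical NotificationsViewController donating an NSUserActivity; the activity type string is made up and would also need to be listed under NSUserActivityTypes in the app’s Info.plist:

```swift
import UIKit
import Intents

// Donates an "open notifications" action so it appears in the Shortcuts app,
// where a user could bind it to Back Tap under Settings > Accessibility > Touch.
final class NotificationsViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let activity = NSUserActivity(activityType: "com.example.app.openNotifications")
        activity.title = "Open Notifications"
        activity.isEligibleForSearch = true
        activity.isEligibleForPrediction = true
        activity.suggestedInvocationPhrase = "Show my notifications"
        activity.persistentIdentifier = "openNotifications"

        // Attaching the activity to the view controller donates it to the system.
        userActivity = activity
        userActivity?.becomeCurrent()
    }
}
```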

There’s a trend running through many of these features: although they’re designed to make devices easier for people with disabilities to use, they can also benefit everyone else. People with disabilities should always be the focus when designing accessibility features, but the benefits can be much wider-ranging.

Next up is FaceTime, which will now be able to detect when someone is using sign language and automatically bring that person into focus so their signing is easier to see. tvOS, meanwhile, will soon work with Microsoft’s Xbox Adaptive Controller, a controller designed specifically for people with disabilities.
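For developers, tvOS’s controller support flows through the existing GameController framework, so the Xbox Adaptive Controller should behave like any other extended gamepad once it’s paired. A minimal sketch of reacting to whatever controller connects, under that assumption:

```swift
import GameController

// Listen for any controller the GameController framework supports; the
// Xbox Adaptive Controller shows up as a regular extended gamepad.
let connectionToken = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { notification in
    guard let controller = notification.object as? GCController,
          let gamepad = controller.extendedGamepad else { return }

    gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed {
            print("A pressed on \(controller.vendorName ?? "controller")")
        }
    }
}

// Begin scanning for wireless controllers that aren't connected yet.
GCController.startWirelessControllerDiscovery(completionHandler: nil)
```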

There’s also a new “headphone accommodations” feature in iOS 14, which adjusts the sound frequencies streamed through select Apple and Beats headphones to better match your hearing. Apple says the new accessibility feature should make “music, movies, phone calls, and podcasts” listened to using the headphones “sound more crisp and clear.” It also works with the AirPods Pro’s transparency mode to help make quiet voices around you more audible.

As well as big new features like these, Apple is making a host of other updates to its existing accessibility features. Its VoiceOver screen reader, for example, will now be able to recognize and describe more of what’s on-screen, such as reading text found in images and photos. Apple’s Magnifier and Voice Control options have also been updated, and CNET notes that some of Xcode’s coding tools are being made more accessible as well.
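The new on-device recognition is aimed at interfaces that were never labeled, but the standard way to make an app legible to VoiceOver remains supplying accessibility labels yourself. A minimal SwiftUI sketch, with placeholder strings:

```swift
import SwiftUI

// A control VoiceOver can describe even though it only shows an icon.
struct PlayButton: View {
    var body: some View {
        Button {
            // Start playback (placeholder action).
        } label: {
            Image(systemName: "play.fill")
        }
        .accessibilityLabel("Play")              // What VoiceOver reads aloud
        .accessibilityHint("Starts the episode") // Optional extra context
    }
}
```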

Apple isn’t the first company to have introduced features like these, and other companies are making big strides of their own (Android 11, for example, will deliver big upgrades to Android’s voice controls), but its commitment to introducing and then refining its accessibility options should be applauded.
