On Wednesday, Apple announced a bunch of new accessibility features coming to iPhones, iPads, and the Apple Watch. The new features and services will roll out in the coming days, weeks, and months.
The first feature to arrive will be a new service called SignTime, which Apple says will launch tomorrow, May 20. SignTime will let users communicate with Apple’s customer service representatives (either AppleCare or Retail Customer Care) using sign language. The service will launch first in the US, UK, and France with American Sign Language, British Sign Language, and French Sign Language, respectively. Customers at Apple Stores will also be able to use SignTime to reach an interpreter while shopping or getting support, with no advance appointment needed.
While SignTime’s arrival is right around the corner, the bigger batch of changes will come via software updates later this year. Those updates are aimed at making Apple’s software and hardware more accessible for people with cognitive, mobility, hearing, and vision disabilities.
For example, users will be able to navigate the Apple Watch using AssistiveTouch, a feature already available on iPad and iPhone to help users who have difficulty touching the screen. On the Watch, AssistiveTouch will use the motion sensors and heart-rate sensor to “detect subtle differences in muscle movement and tendon activity,” letting users move a cursor on the display with hand gestures such as a pinch or a clench.
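Apple hasn’t published the machinery behind this, but the underlying idea, treating motion-sensor readings as discrete inputs, is something any watchOS app can approximate today with the public CoreMotion framework. Here’s a minimal, illustrative sketch, with an arbitrary threshold standing in for Apple’s on-device machine learning; it is not Apple’s actual AssistiveTouch implementation:

```swift
import CoreMotion

// Illustrative only: AssistiveTouch's gesture detection is a system
// feature, not a public API. This sketch shows the general idea using
// CoreMotion, which apps *can* use on watchOS.
final class GestureSketch {
    private let motion = CMMotionManager()

    // Sample device motion at 50 Hz and treat a sharp spike in
    // user-generated acceleration (gravity already removed) as a
    // discrete "gesture" input. The 1.5 g threshold is arbitrary.
    func start(onGesture: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let a = data?.userAcceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            if magnitude > 1.5 {
                onGesture()
            }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```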
Apple also plans to roll out big improvements to VoiceOver, its screen reader for blind and low-vision users. VoiceOver will be able to read the contents of a photographed receipt in a more useful way, navigating its text like a table, by row and column. It will also describe the position of people and objects within photos. In its newsroom post on the subject, Apple provides the following as an example of what VoiceOver might say to describe part of a photo: “Slight right profile of a person’s face with curly brown hair smiling.”
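Apple hasn’t said how it generates these descriptions. But developers can already build a crude approximation with the public Vision framework, composing a spoken label from detected face positions and attaching it via an accessibility label. A rough sketch, with the wording entirely made up:

```swift
import UIKit
import Vision

// Illustrative only: this is not how VoiceOver's feature works internally,
// just a demonstration that the building blocks exist in public APIs.
func describePeople(in image: UIImage, for view: UIView) {
    guard let cgImage = image.cgImage else { return }
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let faces = request.results as? [VNFaceObservation] ?? []
        // Vision uses normalized (0-1) coordinates; translate each
        // face's horizontal position into plain language.
        let positions = faces.map { face -> String in
            switch face.boundingBox.midX {
            case ..<0.33: return "a face on the left"
            case ..<0.66: return "a face in the center"
            default:      return "a face on the right"
            }
        }
        DispatchQueue.main.async {
            view.isAccessibilityElement = true
            view.accessibilityLabel = positions.isEmpty
                ? "Photo with no faces detected"
                : "Photo with " + positions.joined(separator: ", ")
        }
    }
    try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
}
```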
iPadOS will also soon support third-party MFi eye-tracking devices, which let users with mobility disabilities control an iPad with their eyes. A pointer follows the user’s gaze, and holding the gaze on a UI element for a short period (a “dwell”) performs an action, like a tap.
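For a sense of how dwell selection works in practice, here’s a generic sketch of the logic: if the gaze point stays within a small radius of one spot for long enough, the action fires. Nothing here is an Apple API; the names and thresholds are hypothetical.

```swift
import Foundation
import CoreGraphics

// Illustrative only: a minimal model of "dwell" activation, the
// interaction pattern eye trackers rely on.
final class DwellDetector {
    private let radius: CGFloat = 20          // gaze tolerance, in points
    private let dwellTime: TimeInterval = 1.0 // seconds of steady gaze
    private var anchor: (point: CGPoint, since: Date)?

    // Feed each new gaze sample in; returns true when a dwell completes.
    func update(gaze: CGPoint, now: Date = Date()) -> Bool {
        if let a = anchor, hypot(gaze.x - a.point.x, gaze.y - a.point.y) <= radius {
            if now.timeIntervalSince(a.since) >= dwellTime {
                anchor = nil  // reset so each dwell only fires once
                return true
            }
        } else {
            anchor = (gaze, now)  // gaze moved; restart the clock
        }
        return false
    }
}
```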
And while we’re on the subject of MFi devices, Apple is adding support for new bidirectional hearing aids to assist with hands-free phone and FaceTime conversations.
Because some users can be distracted or overwhelmed by everyday environmental noise, Apple is introducing a new feature for neurodiverse users: calming background sounds (like ocean or rain) that play throughout the OS to mask external noise that might cause discomfort. “The sounds mix into or duck under other audio and system sounds,” Apple says.
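That “mix into or duck under” wording maps onto behavior iOS already exposes to third-party developers. As an illustration (not Apple’s implementation), here’s how an app can loop an ambient sound that ducks other audio using AVFoundation; the “rain” asset name is hypothetical:

```swift
import AVFoundation

// Illustrative only: Apple hasn't said how Background Sounds is built.
// This shows the existing, public way to play a looping ambient sound
// that lowers other audio rather than silencing it.
func playAmbientSound() throws -> AVAudioPlayer {
    let session = AVAudioSession.sharedInstance()
    // .duckOthers reduces other apps' volume while this sound plays,
    // matching the "duck under" behavior Apple describes.
    try session.setCategory(.playback, options: [.duckOthers])
    try session.setActive(true)

    // "rain.caf" is a hypothetical asset bundled with the app.
    let url = Bundle.main.url(forResource: "rain", withExtension: "caf")!
    let player = try AVAudioPlayer(contentsOf: url)
    player.numberOfLoops = -1  // loop indefinitely
    player.play()
    return player
}
```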
Apple hasn’t detailed exactly when these features will materialize, only that they’re coming with software updates later this year.