Apple previewed a suite of new features today to improve cognitive, vision and speech accessibility. These tools are slated to arrive on the iPhone, iPad and Mac later this year. An established leader in mainstream tech accessibility, Apple emphasizes that these tools are built with feedback from disabled communities.
Assistive Access, coming soon to iOS and iPadOS, is designed for people with cognitive disabilities. It streamlines the iPhone and iPad interface, focusing on making it easier to talk to loved ones, share photos and listen to music. The Phone and FaceTime apps are merged into one, for example.
The design is also made more digestible with large icons, increased contrast and clearer text labels that simplify the screen. Users can customize these visual features to their liking, and those preferences carry across any app compatible with Assistive Access.
As part of the existing Magnifier tool, blind and low vision users can already use their phone to locate nearby doors, people or signs. Now Apple is introducing a feature called Point and Speak, which uses the device’s camera and LiDAR Scanner to help visually disabled people interact with physical objects that have several text labels.
So, if a low vision user wanted to heat up food in the microwave, they could use Point and Speak to discern the difference between the “popcorn,” “pizza” and “power level” buttons — when the device identifies this text, it reads it out loud. Point and Speak will be available in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese and Ukrainian.
A particularly interesting feature from the bunch is Personal Voice, which creates an automated voice that sounds like you, rather than Siri. The tool is designed for people who may be at risk of losing their ability to speak due to conditions like ALS. To generate a Personal Voice, the user spends about 15 minutes reading randomly chosen text prompts clearly into the microphone. Then, using machine learning, the audio is processed locally on the iPhone, iPad or Mac to create their Personal Voice. It sounds similar to what Acapela has been doing with its “my own voice” service, which works with other assistive devices.
It’s easy to see how a repository of unique, highly trained text-to-speech models could be dangerous in the wrong hands. But according to Apple, this custom voice data is never shared with anyone, even Apple itself. In fact, Apple says it doesn’t even connect your Personal Voice with your Apple ID, since some households might share a log-in. Instead, users must opt in if they want a Personal Voice they make on their Mac to be accessible on their iPhone, or vice versa.
At launch, Personal Voice will only be available for English speakers, and can only be created on devices with Apple silicon.
Whether you’re speaking as Siri or your AI voice twin, Apple is making it easier for non-verbal people to communicate. Live Speech, available across Apple devices, lets people type what they want to say so that it can be spoken aloud. The tool is available at the ready on the lock screen, but it can also be used in other apps, like FaceTime. Plus, if users find themselves often needing to repeat the same phrases — like a regular coffee order, for example — they can store preset phrases within Live Speech.
Apple’s existing speech-to-text tools are getting an upgrade, too. Now, Voice Control will incorporate phonetic text editing, which makes it easier for people who type with their voice to quickly correct errors. So, if you see your computer transcribe “great,” but you meant to say “grey,” it will be easier to make that correction. This feature, Phonetic Suggestions, will be available in English, Spanish, French and German for now.
These accessibility features are expected to roll out across various Apple products this year. As for its existing offerings, Apple is expanding SignTime to Germany, Italy, Spain and South Korea on Thursday. SignTime connects Apple Store and Apple Support customers with on-demand sign language interpreters.