
5 New Features from Apple, Google to Help People with Disabilities

Accessibility updates target blind users, those with memory or speech problems


[Image: An entrepreneur with a disability using a phone in an office. Getty Images]

More than a billion people on the planet have a disability of one kind or another.

Apple and Google have already incorporated audio enhancements, screen readers, switch controls and other accessibility features and tools into modern smartphones and other technology. Yet the two leading developers of mobile operating systems continue to innovate with features designed to help folks with memory, physical, vision and other challenges.


If you’re an iPhone user who has a disability or cares for someone with one, check out the tools and options on the handset you already own before exploring what’s new. Tap Settings | Accessibility to get started.

And if you have an Android phone such as a Google Pixel, you’ll want to start at Settings | Accessibility on that device.

In advance of Global Accessibility Awareness Day, observed May 18 every year, Apple offered a sneak peek of some accessibility features the company expects to bring out before the end of the year. The day is dedicated to providing inclusive digital access for anyone with disabilities, and Google followed with accessibility updates of its own.

1. Simplification is the goal of Assistive Access

Apple’s Assistive Access feature is aimed at people with concentration and memory challenges, often collectively called cognitive problems.

A simplified visual or grid layout surfaces the most typically used experiences on an iPhone or iPad, replacing the home and other screens on a device. Large onscreen buttons represent Calls, Camera, Messages, Music and Photos. An alternate row-based layout is available for people who are more comfortable scrolling.

Screens are customizable, so a caregiver can place other apps a person might use frequently, not just ones from Apple.

Underlying apps can be tailored, too. Tap Calls and you can display buttons for family members, close friends and other people called most often, whether via FaceTime or a voice call. If the wrong button is inadvertently tapped, the user can hit a Back button to stop the call from going through.

The Messages app also can be customized. And a person can record a video selfie to share that lets a friend know how the day is going. As part of the general Assistive Access experience, high contrast buttons and large text labels will be displayed.

Assistive Access brings to mind Google’s Action Blocks, released in 2020 for Android phones and tablets. Action Blocks lets a user create custom widgets on the home screen that, when tapped, trigger Google Assistant commands.


2. Live Speech can be your voice

If you have difficulty speaking or are at risk of losing your voice, the Live Speech feature coming to iPhones, iPads and Mac computers will let you type to be heard out loud. It can be used for calls or when conversing in person. It promises to work across multiple languages and with any built-in voice on a device, including Siri.

A feature called Personal Voice, developed with the nonprofit Team Gleason, will deliver your message in a voice close to your own. The organization, which helps people with amyotrophic lateral sclerosis (ALS), is named for Steve Gleason, 46, a former New Orleans Saints football player who received his own diagnosis in 2011.

A patient with ALS, also known as Lou Gehrig’s disease, has about a 1 in 3 chance of losing the ability to speak. Before that happens, a person can create a Personal Voice by reading a series of text prompts, a process that Apple says takes about 15 minutes.

Personal Voice uses machine learning, a type of artificial intelligence, but is confined to a single device, protecting user privacy. You will need a device with Apple’s own silicon chips.

Google has been working for a while on improving communication for people whose speech is impaired. In 2021, it launched an Android app called Project Relate to help them be better understood.

The app is still in beta development. To check it out, you can apply to be a beta tester.

3. Point and Speak helps with low vision

You can use the Point and Speak feature on the iPhone and iPad Magnifier app to interact with appliances and other objects around the home that have text labels.


Someone who is blind or has low vision might point a phone at the microwave to read labels aloud: “power level,” “add 30 seconds,” etc. You will need an iPhone or iPad with a lidar scanner, shorthand for light detection and ranging. Lidar is included on Apple’s more expensive phones and tablets.

4. Lookout describes what your phone sees

Lookout for Android has been around for a few years. Like Apple’s Point and Speak feature, it is designed for people who are blind or don’t see well.

By using the sensors and cameras on your phone, Lookout can help people explore their environment and recognize objects and text through audio. Google is adding an image question and answer component to let folks use their voice or type to ask questions and gain a better understanding of what’s in an image.

The feature uses artificial intelligence and an advanced visual language model from Google DeepMind, a research lab owned by Google parent Alphabet. A select group of blind and low-vision people are already using the new Lookout feature, Google says. The company plans to roll it out more broadly later this year.

5. Live Caption expands on Android, Chrome

The Live Caption feature that Google launched in 2019 displays real-time transcriptions of speech detected on audio messages, calls, podcasts and videos. But it’s been limited to Chromebooks, Chrome browsers and Pixel phones, and not all the devices understand all audio sources.

Beginning this summer, Google will expand Live Caption to additional Android devices. The Live Caption for Calls feature, which includes the ability to type responses during calls and have the response read aloud to the other caller, will be available on the latest Pixel devices, expand to Pixel 4 and 5 models, and show up on Android phones from OnePlus, Samsung, Sony and others.
