
Apple goes big on accessibility with new features that give you a voice


This year, Apple is introducing a fifth pillar of accessibility dedicated to speech (Picture: Apple)

Apple’s new range of accessibility features, announced at this year’s WWDC, underscores the company’s continued commitment to inclusivity.

Metro.co.uk caught up with Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives, to talk about the tech giant’s approach to accessibility and the new features coming this autumn.

Until now, Apple’s accessibility efforts have revolved around four main pillars: vision, hearing, physical and motor, and cognitive. This year, the company is introducing a fifth pillar dedicated to speech.

One of the notable features launching this autumn is ‘Personal Voice’, a voice-banking tool developed in collaboration with leading organisations working with ALS (amyotrophic lateral sclerosis) patients.

ALS, also known as Lou Gehrig’s disease, is a progressive condition that results in the deterioration of nerve cells controlling voluntary muscles. Speech loss is a common symptom, affecting around one in three individuals diagnosed with ALS.

The new feature will allow users at risk of losing their ability to speak to create a personalised voice on their device. The tool takes about 15 minutes to set up, and lets users type out sentences and phrases and save frequently used expressions.

Live Speech on iPhone, iPad, and Mac gives users the ability to type what they want to say and have it be spoken out loud during phone and FaceTime calls, as well as in-person conversations (Picture: Apple)

The upcoming release also includes the ‘Live Speech’ feature, enabling individuals who have lost their ability to speak to communicate through text-to-speech functionality.

This integrates seamlessly with Personal Voice, so users can speak with their chosen voice on iPhone, iPad, and Mac by typing what they want to say and having it spoken aloud during phone calls, FaceTime calls, and in-person conversations.

Users can also save commonly used phrases to chime in quickly during conversations.


When it comes to accessibility and privacy, Apple assures users it doesn’t ‘choose one over the other’. Personal Voice set-up is done completely on the device, and nothing is saved to the cloud. Live Speech is exclusively available on devices with passcode locks, ensuring the user retains control over their voice.

Apple said it does not factor uptake into decisions about features like accessibility.

‘We do it because we believe it’s the right thing to do and if we’re doing this for communities that need it, it’s powerful,’ said Herrlinger.

Personal Voice allows users at risk of losing their ability to speak to create a voice that sounds like them (Picture: Apple)

‘If it positively affects one person’s life, it’s probably going to positively affect enough people’s lives that it’s worth it.’

But the tech giant can rest assured that extending accessibility features across its ecosystem of devices is an attractive draw for adoption.

For example, the Apple Watch has undergone significant improvements to cater for individuals with various disabilities. The addition of watch controls based on muscle and tendon movements makes it easy for individuals with limited mobility to operate the device.

Another powerful feature Apple is introducing is ‘Point and Speak’, which makes it easier for users with vision disabilities to read text in their surroundings – for example, text on a microwave.

Point and Speak is built into the Magnifier app on iPhone and iPad, works well with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.

With all these new features, Apple clearly wants to be the go-to for accessibility needs (Picture: Apple)

However, the detection features rely on the LiDAR Scanner built into Apple’s Pro models of the iPhone and iPad, so for now they come at a premium.


The company’s accessibility features will also extend to Apple’s hottest announcement of the year, the Vision Pro.

Working with VoiceOver, Voice Control, and braille input, it’s built to be ‘fluid in its ability to move from one modality to another’.

Apple Vision Pro should be an interesting one for accessibility as it introduces an entirely new input system controlled by a person’s eyes, hands, and voice.

The company is also ‘highly encouraging’ of developers looking to create accessible apps on the new visionOS.

Supporting accessibility features goes beyond simply releasing them, as evidenced by the online communities and message boards filled with individuals and families looking for advice on how to use them. For them, the accessibility workshops at Apple’s retail stores, or the company’s accessibility helpline, should be a good starting point.

With all these new features, Apple clearly wants to be the go-to for accessibility needs.

‘I think in some ways, with the work we’ve done, we already are,’ said Herrlinger. ‘If you look at public statistics, they tell us that 72 per cent of the blind community using mobile devices is using an iOS device.’

She also emphasised the importance of involving individuals with disabilities in the development process: ‘Our commitment to accessibility is deeply rooted in the mantra of “nothing about us without us”.

‘We employ individuals with disabilities who are daily users of the assistive technologies we create to ensure their voices are heard throughout the process.’







