Apple unveils iOS 19 accessibility features: Magnifier for Mac, App Store labels, more

Ahead of WWDC kicking off in June, Apple today has officially unveiled this year’s new accessibility features for iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro.
This year’s features arrive as Apple celebrates 40 years of accessibility work, dating back to the founding of its Office of Disability in 1985.
“At Apple, accessibility is part of our DNA,” Apple CEO Tim Cook said.
“Making technology for everyone is a priority for all of us, and we’re proud of the innovations we’re sharing this year.
That includes tools to help people access crucial information, explore the world around them, and do what they love.”
These features are destined for Apple’s forthcoming iOS 19 and macOS 16 updates, which will be announced at WWDC next month.
Apple itself, however, stops short of referring to ‘iOS 19’ by name ahead of the official unveiling.
This year’s new accessibility features include Accessibility Nutrition Labels on the App Store, a new Magnifier app for Mac, major updates to Apple’s Personal Voice feature, and more.

Accessibility Nutrition Labels on the App Store
Headlining Apple’s announcement of new iOS 19 accessibility features is an upgrade for the App Store.
Later this year, Apple will add a new Accessibility Nutrition Labels section to App Store listings.
This is similar to the Privacy Nutrition Labels feature that Apple launched several years ago, giving users a way to easily see the privacy practices of individual apps.

The new Accessibility Nutrition Labels will show users which accessibility features an app supports prior to downloading it.
This includes VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions, and more.
Eliel Johnson, Vice President of User Experience and Design at CVS Health, praised the addition of Accessibility Nutrition Labels in a statement to 9to5Mac:

“At CVS Health we are passionate about making health care simpler and more accessible.
We want to make it as easy as possible for consumers to find the health care information they need.
By supporting Apple’s new accessibility nutrition labels, we’re providing more transparency and elevating the work our CVS Health teams do to create great experiences for all consumers.”

Apple says it will share more details on Accessibility Nutrition Labels for developers later this year.
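While Apple hasn’t published the label format itself, the capabilities the labels describe map onto accessibility APIs developers already adopt. As a rough sketch using existing, documented SwiftUI APIs (not the new label system), here’s what honoring three of the listed features — VoiceOver, Larger Text, and Reduced Motion — looks like in practice:

```swift
import SwiftUI

// A minimal sketch of the kind of support an Accessibility Nutrition
// Label would report. These are existing, documented SwiftUI APIs;
// the label format itself hasn't been published yet.
struct CheckoutButton: View {
    // Reflects the user's system-wide Reduce Motion setting.
    @Environment(\.accessibilityReduceMotion) private var reduceMotion
    @State private var confirmed = false

    var body: some View {
        Button {
            // Skip the decorative spring animation when Reduce Motion is on.
            withAnimation(reduceMotion ? nil : .spring()) {
                confirmed = true
            }
        } label: {
            Label("Place order", systemImage: "cart")
                .font(.body) // text styles scale automatically with Larger Text
        }
        // Explicit description and hint for VoiceOver users.
        .accessibilityLabel("Place order")
        .accessibilityHint("Submits your cart for checkout")
    }
}
```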
Magnifier for Mac

Alongside iOS 19, Apple is bringing its Magnifier app to the Mac for the first time with macOS 16.
The Magnifier app has been available on iPhone and iPad since 2016 as a way to give users who are blind or have low vision the ability to zoom in, read text, and detect objects around them.
The new Magnifier app for Mac works with an iPhone connected via Continuity Camera or an attached USB camera.
Once connected, users can zoom in on their surroundings right from the Magnifier app on their Mac.
They can also manipulate the video feed to adjust things like perspective, brightness, contrast, colors, and more to make it easier to read.
The Magnifier app can even recognize text.
For example, the Magnifier app on Mac can be used to zoom in on a whiteboard in a meeting or lecture.
Then, the app can intelligently recognize the handwritten text on that whiteboard and render it in a more legible form directly on the user’s Mac.
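Apple hasn’t said what powers Magnifier’s text recognition, but the Vision framework has long offered the same kind of on-device OCR to any Mac app. Purely as an illustrative sketch of the general technique, not Magnifier’s actual implementation:

```swift
import Vision
import AppKit

// A minimal sketch using Apple's public Vision framework, which performs
// the same kind of on-device text recognition Magnifier applies to a
// whiteboard. (Apple hasn't said what Magnifier itself uses internally.)
func recognizeText(in image: NSImage) throws -> [String] {
    guard let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
        return []
    }
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate   // favor accuracy over speed
    let handler = VNImageRequestHandler(cgImage: cgImage)
    try handler.perform([request])
    // Return the best candidate string for each detected text region.
    return request.results?.compactMap { $0.topCandidates(1).first?.string } ?? []
}
```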
Accessibility Reader

Accessibility Reader is a new feature that will be available system-wide to make text easier to read.
Apple explains:

Accessibility Reader is a new systemwide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision.
Available on iPhone, iPad, Mac, and Apple Vision Pro, Accessibility Reader gives users new ways to customize text and focus on content they want to read, with extensive options for font, color, and spacing, as well as support for Spoken Content.
Accessibility Reader can be launched from any app, and is built into the Magnifier app for iOS, iPadOS, and macOS, so users can interact with text in the real world, like in books or on dining menus.

Braille Access

Apple touts that its new Braille Access experience can turn a user’s Apple device “into a full-featured braille note taker that’s deeply integrated into the Apple ecosystem.”
“With a built-in app launcher, users can easily open any app by typing with Braille Screen Input or a connected braille device,” Apple explains.
“With Braille Access, users can quickly take notes in braille format and perform calculations using Nemeth Braille, a braille code often used in classrooms for math and science.”
Additionally, Apple says that users can open Braille Ready Format (BRF) files directly from Braille Access.
This will unlock a “wide range of books and files previously created on a braille note taking device.” The feature also ties into Apple’s Live Captions feature, allowing users to transcribe conversations in real time on braille displays.
Live Captions on Apple Watch

Speaking of Live Captions, watchOS 12 coming later this year will bring Live Listen controls to Apple Watch for the first time.
As a refresher, Live Listen first came to the iPhone with iOS 12 in 2018.
The feature uses an iPhone’s microphone to stream content directly to AirPods and Made for iPhone hearing aids so it’s easier for a user to hear.
With watchOS 12 this year, Live Listen controls will be available on Apple Watch, including the ability to remotely start or stop Live Listen sessions, jump back within a session to catch something that might have been missed, and more.
There’s also support for real-time Live Captions, allowing users to follow along with the conversation via live transcripts directly on their Apple Watch.
Vision Pro upgrades
Apple also touts new accessibility features coming to Apple Vision Pro this year:

For users who are blind or have low vision, visionOS will expand vision accessibility features using the advanced camera system on Apple Vision Pro.
With powerful updates to Zoom, users can magnify everything in view — including their surroundings — using the main camera.
For VoiceOver users, Live Recognition in visionOS uses on-device machine learning to describe surroundings, find objects, read documents, and more.
For accessibility developers, a new API will enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes, giving users more ways to understand their surroundings hands-free.
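Since that camera API is unannounced beyond this description, any concrete code is guesswork. As a purely hypothetical sketch of the shape an entitlement-gated frame feed might take — every type below is a placeholder, not a real visionOS symbol:

```swift
import Foundation

// Hypothetical placeholders only; Apple has not documented the real API.
struct CameraFrame {
    let pixelData: Data
    let timestamp: TimeInterval
}

// Stands in for whatever approved-apps-only provider Apple ships.
final class AssistiveCameraProvider {
    // Approved apps would presumably receive frames as an async stream.
    func frameUpdates() -> AsyncStream<CameraFrame> {
        AsyncStream { _ in /* frames would be delivered here */ }
    }
}

// An app like Be My Eyes would relay each frame to a sighted volunteer.
func relayToAssistant(_ frame: CameraFrame) { /* network send */ }

func startVisualInterpretation() async {
    let camera = AssistiveCameraProvider()
    for await frame in camera.frameUpdates() {
        relayToAssistant(frame)
    }
}
```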

Personal Voice upgrades

Apple debuted its revolutionary Personal Voice feature as part of iOS 17 in 2023.
The feature lets people at risk of losing their ability to speak, such as those with a recent ALS diagnosis, create and save a voice that sounds like them.
The feature then ties into Live Speech, which allows users to type what they want to say and have it be spoken out in their voice.
The setup process for the initial version of Personal Voice required that users say 150 different phrases to train Apple’s machine learning model.
The voice was then processed overnight.
With iOS 19 this year, however, Apple has completely revamped the setup process.
Now, users only need to record 10 phrases, and the voice is processed in under a minute rather than over multiple hours overnight.
The end result is a voice that is “smoother” and “more natural-sounding,” according to Apple.
The Personal Voice feature will also add support for Spanish (Mexico) this year, Apple says.
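Notably, Personal Voice is already exposed to third-party apps: since iOS 17, the public AVSpeechSynthesizer API can speak with a user’s Personal Voice once they grant access. A minimal sketch using those existing, documented APIs:

```swift
import AVFoundation

// Personal Voice has a public hook for developers: AVSpeechSynthesizer
// (iOS 17+) can speak with a user's Personal Voice after authorization.
final class PersonalVoiceSpeaker {
    // Keep the synthesizer alive for the duration of speech.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }
            // Pick a voice the user created with Personal Voice.
            guard let voice = AVSpeechSynthesisVoice.speechVoices()
                .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) else { return }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = voice
            self?.synthesizer.speak(utterance)
        }
    }
}
```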
More accessibility features coming this year

Those features are just the tip of the iceberg.
Apple has a long list of other new capabilities coming to its platforms later this year, including upgrades to Eye Tracking, Background Sounds, Sound Recognition, and much more.

Background Sounds becomes easier to personalize with new EQ settings, the option to stop automatically after a period of time, and new actions for automations in Shortcuts.
Background Sounds can help minimize distractions to increase a sense of focus and relaxation, which some users find can help with symptoms of tinnitus.
For users at risk of losing their ability to speak, Personal Voice becomes faster, easier, and more powerful than ever, leveraging advances in on-device machine learning and artificial intelligence to create a smoother, more natural-sounding voice in less than a minute, using only 10 recorded phrases.
Personal Voice will also add support for Spanish (Mexico).
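Apple hasn’t detailed the new Background Sounds actions coming to Shortcuts (mentioned above), but system and app actions alike are built on the App Intents framework. A hypothetical sketch of the shape such an action could take — the intent and its parameters are placeholders, not Apple’s actual implementation:

```swift
import AppIntents

// Hypothetical sketch only: Apple's real Background Sounds actions are
// undocumented, so this intent and its behavior are placeholders.
struct StartBackgroundSoundIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Background Sound"

    @Parameter(title: "Minutes until auto-stop")
    var autoStopMinutes: Int

    func perform() async throws -> some IntentResult {
        // A real action would start the chosen sound here and schedule
        // the new auto-stop timer described in the update.
        return .result()
    }
}
```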


Vehicle Motion Cues, which can help reduce motion sickness when riding in a moving vehicle, comes to Mac, along with new ways to customize the animated onscreen dots on iPhone, iPad, and Mac.
Eye Tracking users on iPhone and iPad will now have the option to use a switch or dwell to make selections.
Keyboard typing when using Eye Tracking or Switch Control is now easier on iPhone, iPad, and Apple Vision Pro with improvements including a new keyboard dwell timer, reduced steps when typing with switches, and enabling QuickPath for iPhone and Vision Pro. 
With Head Tracking, users will be able to more easily control iPhone and iPad with head movements, similar to Eye Tracking.
For users with severe mobility disabilities, iOS, iPadOS, and visionOS will add a new protocol to support Switch Control for Brain Computer Interfaces (BCIs), an emerging technology that allows users to control their device without physical movement. 
Assistive Access adds a new custom Apple TV app with a simplified media player.
Developers will also get support in creating tailored experiences for users with intellectual and developmental disabilities using the Assistive Access API.
Music Haptics on iPhone becomes more customizable with the option to experience haptics for a whole song or for vocals only, as well as the option to adjust the overall intensity of taps, textures, and vibrations.
Sound Recognition adds Name Recognition, a new way for users who are deaf or hard of hearing to know when their name is being called.
Voice Control introduces a new programming mode in Xcode for software developers with limited mobility.
Voice Control also adds vocabulary syncing across devices, and will expand language support to include Korean, Arabic (Saudi Arabia), Turkish, Italian, Spanish (Latin America), Mandarin Chinese (Taiwan), English (Singapore), and Russian. 
Live Captions adds support for English (India, Australia, UK, Singapore), Mandarin Chinese (Mainland China), Cantonese (Mainland China, Hong Kong), Spanish (Latin America, Spain), French (France, Canada), Japanese, German (Germany), and Korean.
Updates to CarPlay include support for Large Text.
With updates to Sound Recognition in CarPlay, drivers or passengers who are deaf or hard of hearing can now be notified of the sound of a crying baby, in addition to sounds outside the car such as horns and sirens. 
Share Accessibility Settings is a new way for users to quickly and temporarily share their accessibility settings with another iPhone or iPad.
This is great for borrowing a friend’s device or using a public kiosk in a setting like a cafe.

You can find Apple’s full rundown of its new accessibility features coming this year on its website.
Source: https://9to5mac.com/2025/05/13/apple-unveils-ios-19-accessibility-features/