Later this year, Apple will introduce new accessibility enhancements, including live captions, upgraded braille reader support, and accessibility labels in the App Store.
Apple has unveiled a wide array of new iOS accessibility features designed to support users with vision and hearing impairments, emphasising that accessibility should not be considered a premium offering tied to the cost of its hardware.

Ahead of Global Accessibility Awareness Day on Thursday, 15 May, Apple announced on Wednesday a suite of new iOS accessibility features set to roll out later this year. These include live captions, personal voice replication, enhanced reading tools, braille reader improvements, and accessibility “nutrition labels” in the App Store.
These labels will require developers to specify the accessibility features included in their apps, such as VoiceOver, Voice Control, or larger text options.
Apple has enhanced its Magnifier app, extending support to Mac and enabling users to zoom in on screens or whiteboards using the camera or a connected iPhone. The update includes adjustable brightness, contrast, colours, and other settings designed to improve the readability of text and images during presentations and lectures.
The latest braille features introduce note-taking capabilities using a braille screen input or a compatible braille device. They also support calculations with Nemeth braille, the standardised braille code for mathematical and scientific notation.

The new Personal Voice feature can replicate a user’s voice using just 10 phrases, a significant improvement over the previous version, which required 150 phrases and an overnight processing period. Apple has stated that the voice replica will be password-protected and stored on the device; if backed up to iCloud, it will be encrypted to minimise the risk of unauthorised use.
Apple’s senior director of global accessibility policy and initiatives, Sarah Herrlinger, stated that as Apple advances in artificial intelligence, the accessibility team is exploring ways to incorporate these innovations into their work.
“We’ve collaborated closely with our AI team for years on various features. As new opportunities emerge, we remain committed to staying at the forefront of innovation to develop the best possible technology,” she said.
The company’s Live Listen feature enhances audio quality through AirPods, making it easier for users to hear in environments like lecture theatres. Alongside the latest update introducing live captions, Apple recently launched a feature that enables people with hearing loss to use AirPods as hearing aids.

Apple’s devices are often priced at a premium in the smartphone market. Still, Herrlinger dismissed the notion that accessibility features come at an additional cost, emphasising that Apple integrates them directly into its operating system at no charge.
“It comes built into the device at no extra cost,” she said.
“We aim to provide diverse accessibility features, recognising that each individual’s experience is unique. Different users may rely on various accessibility tools, whether to assist with a single disability or multiple needs.”
Apple’s latest accessibility advancements highlight the company’s ongoing commitment to inclusive technology. By integrating features such as live captions, braille support, and AI-driven voice replication, Apple aims to keep its devices accessible to users of all abilities.