Apple Announces New Accessibility Features Like Live Captions and Door Detection

Apple’s devices already include a number of accessibility features that make them easier for people with disabilities to use.

Now, ahead of tomorrow’s Global Accessibility Awareness Day, Apple has revealed a slew of new accessibility features for iPhone, iPad, and Apple Watch users. Here are the details.

New Accessibility Features from Apple

In an official blog post, Apple announced a set of new accessibility features for the iPhone, iPad, and Apple Watch.

The additions include expanded detection options in the Magnifier tool, most notably Door Detection, along with a new Apple Watch Mirroring feature and a Live Captions capability for users who are deaf or hard of hearing.

Live Captions

Users who are deaf or hard of hearing will benefit from the Live Captions feature, which will be available soon.

Google first offered a similar feature with Android 10, and with this update Apple has caught up with the Mountain View giant.

Microsoft is also in the race, having recently added Live Captions to Windows 11.

On the iPhone, iPad, and Mac, Live Captions lets users turn on real-time subtitles for audio content, including FaceTime and other video calls, videos playing on the device, and even in-person conversations.

Additionally, while using Live Captions for calls on a Mac, users can type a response and have it read aloud to the other participants right away.
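
For readers curious about the underlying idea, here is a minimal Swift sketch of live, on-device speech transcription using Apple’s Speech framework. This is not Apple’s Live Captions feature itself, which works at the system level and captions any audio without app code; it only illustrates the kind of on-device recognition that makes rolling captions possible. The class name CaptionSketch is made up for the example.

```swift
import Speech
import AVFoundation

// A minimal sketch of live on-device transcription.
// NOT Apple's system-level Live Captions feature, only an illustration of the idea.
final class CaptionSketch {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start() throws {
        // Keep recognition on the device when the hardware supports it.
        if recognizer?.supportsOnDeviceRecognition == true {
            request.requiresOnDeviceRecognition = true
        }

        // Stream microphone audio into the recognition request.
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Print partial transcriptions as they arrive, like a rolling caption.
        recognizer?.recognitionTask(with: request) { result, _ in
            if let text = result?.bestTranscription.formattedString {
                print("Caption: \(text)")
            }
        }
    }
}
```

In a real app you would also request speech-recognition permission with SFSpeechRecognizer.requestAuthorization before starting, and display the text on screen rather than printing it.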

Door Detection

The Door Detection feature builds on the Magnifier tool on the iPhone and iPad, letting users detect an open or closed door ahead of them on their route.

It locates the door and describes its attributes to the user using the LiDAR sensor on compatible iPhone and iPad models.

The feature determines whether a door is open or closed, and if it is closed, whether it can be opened by pushing, pulling, or turning a knob.

It also reads aloud any words, signs, and symbols on the door, such as room numbers. Door Detection can be combined with the Magnifier’s existing Image Description and People Detection features.

However, since it relies on the LiDAR sensor, the feature is only available on iPhone and iPad models that include one.
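
There is no public developer API for Door Detection itself, but the signage-reading part of the idea can be sketched with Apple’s Vision framework, which performs on-device text recognition. The snippet below is only an illustration under that assumption; readDoorSignage is a made-up helper name, and Apple’s real feature additionally uses LiDAR depth data and runs inside the Magnifier app.

```swift
import Vision
import UIKit

// Illustrative only: reads text (e.g. a room number) from a photo of a door
// using the Vision framework. This is NOT Apple's Door Detection implementation,
// which also relies on LiDAR depth data and on-device machine learning.
func readDoorSignage(from image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        // Keep the most likely reading of each piece of detected text.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Running the request on a background queue keeps the recognition work off the main thread; the completion handler then returns whatever text was found on the door, such as a room number, which could be spoken aloud with a screen reader.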

Apple Watch Mirroring

The next addition is Apple Watch Mirroring, which enables people with physical or motor disabilities to control their Apple Watch from their paired iPhone.

The feature builds on Apple’s AirPlay technology and lets users control the Apple Watch remotely with iPhone accessibility tools such as Voice Control and Switch Control.

Instead of tapping the Apple Watch display, users can rely on voice commands, sound actions, head tracking, and third-party Made for iPhone (MFi) switches as inputs.

Apart from that, Apple’s VoiceOver feature now supports 20 additional languages, including Bengali, Bulgarian, Ukrainian, Catalan, and Vietnamese.

Other accessibility features such as Speak Screen and Speak Selection will also support these additional languages.

Additionally, Apple has indicated that Siri, Sound Recognition, and other accessibility features will also receive improvements.

A new Buddy Controller feature lets users combine two game controllers into a single controller. Its purpose is to let people with disabilities enjoy games on compatible Apple devices with help from friends and family.

So, what are your thoughts on these new accessibility features for the iPhone, iPad, Mac, and Apple Watch? Share your opinions in the comments section below, and stay tuned for more articles like this.