Apple adds new accessibility tools to the iPhone and iPad to help people with disabilities.

Apple is adding new accessibility features to iPhones and iPads to meet a wide variety of user needs. These include the ability to feel music through the haptic engine, create custom voice shortcuts, and control a device with eye tracking. The company announced the features on Thursday, just ahead of Global Accessibility Awareness Day.

Apple previously supported eye tracking in iOS and iPadOS, but only with dedicated eye-tracking hardware; this is the first time an iPad or iPhone can be controlled this way without additional hardware or accessories. With the new built-in Eye Tracking feature, users can navigate through apps using only the front-facing camera. Artificial intelligence determines what the user is looking at and which gesture, such as a tap or swipe, they want to make. There is also Dwell Control, which detects when someone holds their gaze on an element and wants to select it.

Another helpful new feature is “Vocal Shortcuts,” which extends Apple’s voice-based capabilities by letting users assign custom phrases or sounds to trigger shortcuts and complete tasks. For example, even a simple “Ah!” can tell Siri to open an app. The company also created “Listen for Atypical Speech,” a machine learning-based feature designed for people with conditions that affect speech, such as stroke, amyotrophic lateral sclerosis (ALS), and cerebral palsy.


Among the previous speech enhancements from Apple is “Personal Voice,” which debuted last year and offers customers an artificial voice that mimics their own.

“Music Haptics” is a brand-new feature that lets deaf or hard-of-hearing users experience millions of songs on Apple Music through a sequence of taps, textures, and vibrations. Because the feature will also be available as an API, developers of other music apps will soon be able to offer the same accessible listening experience.

Apple also unveiled a new feature to alleviate motion sickness in vehicles, which often occurs when a passenger stares at static on-screen content while the car is moving. With “Vehicle Motion Cues” enabled, animated dots at the edges of the screen sway and move in the direction of the vehicle’s motion, reducing the sensory conflict that causes the sickness.


CarPlay is also receiving an upgrade: it now includes “Voice Control”; “Colour Filters,” which make the interface easier to see for colour-blind users and add options for bolder, bigger text; and “Sound Recognition,” which alerts deaf or hard-of-hearing users to approaching car horns and sirens.

Apple also previewed a visionOS accessibility feature that will allow live captions during FaceTime calls.

Author: Juliet P.

