Latest Software to Include Advancements Empowering Users with Disabilities
Apple announced that upcoming releases of iOS, iPadOS, and watchOS will incorporate powerful AI-driven accessibility features, making Apple devices easier to use for people with vision, mobility, hearing, and cognitive disabilities.
Headlining the slate of enhancements is Door Detection, aimed at iPhone and iPad users who are blind or have low vision. Leveraging the LiDAR scanner and on-device computer vision, it alerts users as they approach a door, tells them how far away it is and whether it is open or closed, and reads signs and symbols around it.
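Apple has not published Door Detection's internals, but ARKit's public scene-reconstruction API hints at the underlying technique: on LiDAR-equipped devices, reconstructed mesh faces can be classified as doors entirely on-device. The sketch below is illustrative only; the DoorDetector class and its print-based announcement are assumptions, not Apple's implementation.

```swift
import ARKit
import Metal

// Illustrative sketch: use ARKit's LiDAR scene reconstruction to flag
// door geometry. DoorDetector and its print "announcement" are stand-ins;
// the real feature layers distance, open/closed state, and sign reading
// on top of detection.
final class DoorDetector: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Mesh classification requires LiDAR hardware.
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else {
            print("LiDAR scene reconstruction is not available on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .meshWithClassification
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            // classification is a per-face buffer of ARMeshClassification raw values.
            guard let classification = meshAnchor.geometry.classification else { continue }
            let pointer = classification.buffer.contents()
            for face in 0..<classification.count {
                let offset = classification.offset + classification.stride * face
                let value = pointer.load(fromByteOffset: offset, as: UInt8.self)
                if ARMeshClassification(rawValue: Int(value)) == .door {
                    print("Door detected near anchor \(meshAnchor.identifier)")
                    break
                }
            }
        }
    }
}
```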
Live Captions Aid Hard of Hearing Users
For users who are deaf or hard of hearing, Apple is introducing Live Captions, which can automatically transcribe conversations in real time and display the text on screen. The captions are generated on-device using machine learning, so no internet connection is needed and the audio never leaves the device.
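Live Captions itself is a system-level feature, but the core idea of fully on-device transcription can be sketched with Apple's public Speech framework, which exposes a requiresOnDeviceRecognition flag. The function below is a minimal illustration, assuming microphone and speech-recognition permissions are already granted; it is not how Apple implements Live Captions.

```swift
import AVFoundation
import Speech

// Minimal sketch of on-device live transcription with the public Speech
// framework. Assumes speech-recognition and microphone permissions have
// already been granted; not Apple's Live Captions implementation.
func startLiveTranscription() throws {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is unavailable for this locale")
        return
    }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true // audio is never sent to a server
    request.shouldReportPartialResults = true  // stream captions as they form

    // Feed microphone audio into the recognition request.
    let audioEngine = AVAudioEngine()
    let input = audioEngine.inputNode
    input.installTap(onBus: 0, bufferSize: 1024, format: input.outputFormat(forBus: 0)) { buffer, _ in
        request.append(buffer)
    }

    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            // In a real UI this running transcript would be drawn on screen.
            print(result.bestTranscription.formattedString)
        }
    }

    audioEngine.prepare()
    try audioEngine.start()
}
```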
Similarly, Apple Watch mirrors iPhone alerts as haptic vibrations, discreetly notifying users of incoming calls and messages through distinct pulse patterns.
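On watchOS, the public WKInterfaceDevice API can play individual haptic taps, which is enough to sketch the idea of pattern-coded alerts. The pulse counts below are invented for illustration; Apple's actual notification patterns are built into the system.

```swift
import WatchKit

// Illustrative watchOS sketch: spell out an alert type as a short run of
// haptic taps. The pulse counts are hypothetical, not Apple's patterns.
func playCodedHaptic(pulses: Int) {
    let device = WKInterfaceDevice.current()
    for i in 0..<pulses {
        // Space taps ~0.3 s apart so each one is felt distinctly.
        DispatchQueue.main.asyncAfter(deadline: .now() + Double(i) * 0.3) {
            device.play(.click)
        }
    }
}

playCodedHaptic(pulses: 2) // e.g. two taps for a call (hypothetical coding)
```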
AssistiveTouch Enables Touch-Free Control
AssistiveTouch aids users with limited mobility by enabling touch-free navigation of Apple Watch using hand gestures alone. On-device machine learning classifies the muscle and tendon movements picked up by the watch's motion and optical heart-rate sensors, letting it recognize pinch and clench gestures.
These gestures can drive actions like moving a cursor and responding to notifications. Apple stated the innovations aim to “push the boundaries of innovation in accessibility.”
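Apple has not documented the AssistiveTouch model itself, but the general shape of the pipeline — stream wrist-sensor data, classify it, map gestures to actions — can be sketched with the public Core Motion API. In the sketch below, HandGesture and the classify closure are hypothetical stand-ins for Apple's on-device model.

```swift
import CoreMotion

// Illustrative pipeline: sample wrist motion and hand it to a gesture
// classifier. HandGesture and the classify closure are hypothetical;
// Apple's model also draws on optical heart-rate sensor signals.
enum HandGesture { case pinch, clench, none }

let motionManager = CMMotionManager()

func startGestureDetection(classify: @escaping (CMDeviceMotion) -> HandGesture) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0 // 50 Hz sampling
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        switch classify(motion) {
        case .pinch:  print("pinch: move focus")      // e.g. step the cursor
        case .clench: print("clench: confirm action") // e.g. answer a call
        case .none:   break
        }
    }
}
```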
Inclusive Design Ethos Drives Apple’s Access Features
Apple's engineers and designers have long embraced a core philosophy of making products usable by everyone. The latest assistive features continue the company's drive to harness AI for more accessible and empowering user experiences.
Looking ahead, Apple plans to build on capabilities like SignTime, which connects American Sign Language users with live interpreters. The goal remains seamlessly integrating accessibility into all devices to benefit every user.