From devices controlled by thought alone to Braille on the screen: this is the revolution Apple announced in recent days, arriving by the end of the year. "Accessibility is part of Apple's DNA," said Tim Cook, Apple's CEO.
"Creating technologies that work for all people is a priority for us," Cook continued, "and the innovations we will make available this year fill us with pride. We will offer tools to help people access essential information, explore the world around them, and do everything they love." It is a commitment of nearly forty years, recalled Apple's Sarah Herrlinger, one that the power and integration of the ecosystem turn into fluid, natural experiences.
The App Store embraces inclusion
The first big piece of news touches the beating heart of the ecosystem: the App Store. Accessibility Nutrition Labels arrive, dedicated sections on app and game pages that will clearly indicate which accessibility features are supported (VoiceOver, Voice Control, Larger Text, and so on). It is a crucial step forward, enthusiastically welcomed by the American Foundation for the Blind: users will know at a glance whether an app works for them, and developers will have a way to showcase their accessibility work.
New ways of seeing and interacting with the world
For people with vision impairments, the long-awaited Magnifier app for Mac arrives, bringing the feature already known on iPhone and iPad to the desktop: magnifying and interacting with the physical world through a connected camera. It can read documents (including via Desk View), offers customizable views, and integrates with the new Accessibility Reader. The latter is a system-wide reading mode, available everywhere (iPhone, iPad, Mac, Vision Pro), that customizes text and supports reading aloud, making reading easier for users with dyslexia or low vision.
Braille also lands on the screen
One of the most significant innovations is Braille Access: a new built-in experience that turns Apple devices (including Vision Pro) into full-featured tools for taking notes, performing calculations (including in Nemeth code), and working directly in braille, with the ability to open BRF files and integrate live transcriptions.
Transcriptions on the wrist and bionic eyes for Vision Pro
For deaf and hard-of-hearing users, Live Listen controls come to Apple Watch, showing real-time transcriptions directly on the wrist and letting the watch double as a remote control. Meanwhile, Vision Pro leverages its camera system to enhance features for blind and low-vision users: Zoom magnifies the surrounding environment, while Live Recognition (powered by on-device AI) describes surroundings, finds objects, and reads documents. A new API will allow apps such as Be My Eyes to offer live, hands-free visual assistance.
From control "by thought" to name recognition: a myriad of targeted aids
The list of updates does not end there, touching an incredible variety of needs. It starts with device control: Eye Tracking and Head Tracking gain new options for steering devices with eye or head movements. Above all, support for Brain Computer Interfaces (BCIs) arrives through Switch Control, allowing people with severe motor disabilities to interact with their device directly by thought. Typing with Eye Tracking and Switch Control also improves.
Background Sounds become more customizable. Personal Voice gets faster and more natural (and adds Mexican Spanish). Vehicle Motion Cues, which reduce motion sickness, now reach the Mac. Music Haptics gains more customization options. Sound Recognition adds Name Recognition for deaf and hard-of-hearing users, alerting them when their name is called. Voice Control improves with a programming mode and more languages (including Italian), and Live Captions expands its language support as well. News arrives for CarPlay too (Larger Text, Sound Recognition). Assistive Access comes to Apple TV and gains a developer API dedicated to users with intellectual disabilities.
Share Accessibility Settings also arrives, letting you temporarily transfer your accessibility settings to another device, which is very useful on borrowed devices or public workstations.
Thanks to the power of Apple silicon and advances in machine learning and artificial intelligence, the Cupertino company once again shows that the most advanced technology is the kind that puts itself at everyone's service, making innovation not a privilege but an accessible right.