The iPhone and the iPad? Soon you will control them with your gaze. And Apple launches a motion sickness feature

It will be a real revolution, announced by Apple for Global Accessibility Awareness Day and presumably available by the end of 2024 with the iOS 18 software update: on iPhone and iPad it will be possible to scroll through pages, the screen, and apps, and to browse photos, using only the eyes. Music will also become accessible to deaf people, and important innovations are coming for those who travel by car. These major improvements are designed as part of the accessibility features for users with physical disabilities.

What is Eye Tracking

The Eye Tracking function will be based on artificial intelligence and will make it possible to control iPad and iPhone using only the eyes: the Cupertino experts explain that this will be possible thanks to the front camera, which will allow setup and calibration in a few seconds and a few simple steps. Privacy is protected because, with on-device machine learning, all the data used to set up and control this function will remain safely on the device and will never be sent to Apple. But how will the eyes flip through pages and so on? The user will look at dots that appear at different points on the screen; then, with “assisted control,” head and eye movements will be combined for each action. Even though all this was designed and built for people with mobility problems, that does not mean it cannot be used by anyone.
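The selection mechanic the article hints at (holding the gaze on a target to "click" it) is commonly known as dwell-time selection. Apple has not published its implementation, so the following is a purely illustrative sketch of the generic idea, with made-up names and a made-up 1-second dwell threshold:

```python
# Illustrative dwell-time selection logic (hypothetical; not Apple's code).
# A target counts as "clicked" when the gaze stays on it for DWELL_SECONDS.

DWELL_SECONDS = 1.0

class DwellSelector:
    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.dwell_start = None

    def update(self, target, timestamp):
        """Feed the target under the gaze (or None) once per frame.

        Returns the target when the dwell time has elapsed, else None.
        """
        if target is None or target != self.current_target:
            # Gaze moved to a new target (or off-screen): restart the timer.
            self.current_target = target
            self.dwell_start = timestamp if target is not None else None
            return None
        if timestamp - self.dwell_start >= self.dwell_seconds:
            self.dwell_start = timestamp  # reset to avoid repeated triggers
            return target
        return None

# Simulate 12 frames (0.1 s apart) of the gaze resting on one icon:
selector = DwellSelector()
clicks = [selector.update("icon_mail", t / 10) for t in range(12)]
```

Here the "click" fires exactly once, on the frame where a full second of steady gaze has accumulated; looking away at any point restarts the timer.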

What is Music Haptics

Music Haptics will give deaf or hard-of-hearing people the opportunity to “hear” music on the iPhone in a completely new way: with the function active, the device will reproduce the beats, sound textures, and subtle vibrations of a song through haptic feedback. It will work with millions of songs on Apple Music and will also be available as an API, so developers can make music more accessible in their own apps. Those with pronunciation difficulties, meanwhile, will have the “Vocal Shortcuts” function, which allows them “to assign custom phrases to commands that Siri can understand to activate shortcuts and complete complex tasks,” the developers explain. Again, behind the scenes there will be artificial intelligence capable of building voice models from the sounds spoken by the person.
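Conceptually, turning music into vibrations means mapping the audio signal's energy over time to actuator intensity. As a simplified, hypothetical sketch (not Apple's implementation, whose details are not public), one could compute a coarse amplitude envelope per time window and use it to drive a haptic motor:

```python
# Hypothetical sketch: derive haptic intensities (0..1) from audio samples
# by taking the RMS amplitude of fixed-size windows.
import math

def haptic_envelope(samples, window_size=4):
    """Return one intensity value per window of `samples` (floats in -1..1)."""
    intensities = []
    for start in range(0, len(samples), window_size):
        window = samples[start:start + window_size]
        rms = math.sqrt(sum(s * s for s in window) / len(window))
        intensities.append(min(1.0, rms))
    return intensities

# A loud beat followed by near-silence yields a strong pulse, then a weak one.
env = haptic_envelope([0.9, -0.9, 0.9, -0.9, 0.05, -0.05, 0.05, -0.05])
```

A real system would refine this considerably (separating bass, rhythm, and texture into different vibration patterns), but the envelope idea is the starting point.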

How the feature against motion sickness works

With Vehicle Motion Cues it will be possible to limit the discomfort caused by the movement of cars, motorbikes, buses, coaches and so on, known as motion sickness and caused by “a sensory conflict between what you see and how you feel; this disorder may prevent some people from using iPhone or iPad while traveling in a vehicle.” With the function active, animated dots will appear on the device screen, reproducing variations in the vehicle's movement and reducing the sensory conflict without interfering with the main content.

More new functions

Then there is CarPlay, which gains Voice Control, Sound Recognition and Color Filters: in the first case, apps can be managed by voice alone; in the second, deaf or hard-of-hearing people who drive or travel by car can activate notifications for sounds such as horns and sirens; Color Filters, finally, will help colorblind people see the CarPlay interface better.

Another new feature is “Listen for Atypical Speech,” “which gives the user the ability to improve speech recognition for a wider range of voices, using on-device machine learning to recognize speech patterns,” Apple explains. This function was designed for people with speech problems, such as those affected by cerebral palsy, amyotrophic lateral sclerosis (ALS) or stroke. Another important innovation is Live Transcriptions, which will help any user, including deaf or hard-of-hearing users, to follow a live conversation or the audio content of apps.

The event in Milan

The innovations were presented on the occasion of Global Accessibility Awareness Day: at Apple Piazza Liberty in Milan, on the evening of Thursday 16 May, the minds who conceived and produced the “Assume That I Can” campaign took part: Martina Fuga of CoorDown, Luca Lorenzini and Luca Pannese of SMALL (with a pre-recorded video), and Karim Bartoletti of Indiana Production.

The association is committed to focusing attention on people with Down syndrome, taking particular care not to speak for them.