Apple will soon let iPad and iPhone users control their devices with just their eyes.

A new AI-powered feature called Eye Tracking, unveiled Wednesday, is designed for people with physical disabilities. Apple said it’s coming later this year.

The feature uses the front-facing camera for setup and calibration, and Apple notes that the data involved stays on the device and isn’t shared.

Users can navigate with Dwell Control, which monitors how long the eyes stay trained on an on-screen element before activating it. With their eyes alone, users can press buttons and perform swipes and other gestures.

“For nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software,” CEO Tim Cook said in a statement.

Eye Tracking will be included in iOS and iPadOS, with no additional hardware required, Apple said.

The tech giant rolled out a suite of changes ahead of Global Accessibility Awareness Day on Thursday.

Other updates include Music Haptics, a new music experience for deaf or hard-of-hearing users in which taps and other vibrations are played along with music audio.

And two new features are designed for people with speech conditions: Vocal Shortcuts enables iPhone and iPad users to launch Siri shortcuts using custom phrases, while a Listen for Atypical Speech option can recognize a wider range of speech patterns.

Another feature may help travelers plagued by motion sickness.

Vehicle Motion Cues places moving black dots along the edges of the screen to indicate which way a vehicle is moving. This can ease motion sickness by reducing the “sensory conflict between what a person sees and what they feel,” Apple said.
