Apple announced new accessibility features coming later this year. These include Eye Tracking, which lets users with physical disabilities control an iPad or iPhone with their eyes.
Navigate and Control Your Apple Devices with Apple's Eye Tracking Feature
The feature uses the front-facing camera and artificial intelligence to track eye movement and enable navigation on these devices.
Designed for users with physical disabilities, it takes only seconds to set up and calibrate with the front-facing camera, and on-device machine learning handles navigation through apps. All data used to set up and control the feature is kept securely on the device and isn’t shared with Apple.
Eye Tracking works across iPadOS and iOS apps and doesn’t require any additional hardware or accessories.
With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
“We believe deeply in the transformative power of innovation to enrich lives. That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users,” said Apple CEO Tim Cook.