As always, some cool stuff happening.
Apple is releasing a bunch of new accessibility features for iPads. My favorite will be coming later this year: eye tracking. The device uses the front-facing camera to tell where you're looking on the screen, and you can then use that to do taps and other gestures. I have this on a laptop, but I hardly ever use a laptop, so having it on a tablet (which is on a holder on my wheelchair) will be super nice.
Eye Tracking Comes to iPad and iPhone
Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.
Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
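Apple hasn't published a developer API for Eye Tracking itself; the wording above suggests it navigates the same accessibility element tree that VoiceOver and Switch Control already use, which would explain why apps work without changes. A minimal UIKit sketch of how an app exposes a control to that tree (standard controls like UIButton do this automatically; the explicit properties here are just for illustration):

```swift
import UIKit

final class ViewController: UIViewController {
    private let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        playButton.setTitle("Play", for: .normal)
        playButton.frame = CGRect(x: 40, y: 120, width: 120, height: 44)

        // Standard accessibility metadata. Assistive features discover
        // on-screen elements through these properties, so a control that
        // sets them is reachable by VoiceOver, Switch Control, and
        // (presumably) Eye Tracking's dwell-to-activate navigation.
        playButton.isAccessibilityElement = true
        playButton.accessibilityLabel = "Play"
        playButton.accessibilityTraits = .button

        view.addSubview(playButton)
    }
}
```

The design choice matters for users like the author: because the feature hooks into existing accessibility plumbing rather than requiring per-app support, every properly labeled app gains eye control on day one.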
More Apple announcements for GAAD