When capacitive touch screens and multi-touch gestures first hit the mobile computing scene, they were a technological revelation. While existing point-and-click solutions work well enough in their own right, manipulating an on-screen cursor adds, by its very nature, a layer of abstraction to human-computer interactions. Direct finger-based manipulation, by contrast, is so intuitive that even a toddler can get the hang of it within just a few minutes.
Of course, now that touch screen controls have become ubiquitous, it’s only natural to look towards the ‘next big thing’ in human interface design. Many are convinced that the logical next step forward is hands-free manipulation of windowed apps and user interface elements, and variants of this paradigm have already made their way into certain virtual reality (VR) and augmented reality (AR) glasses and headsets.
Hands-free interactions between humans and computers encompass everything from eye, head, and finger tracking, all the way to Neuralink with its implantable brain-computer interface (BCI) technologies. Believe it or not, even the iPhone sitting in your pocket has a built-in eye control solution of its own, and it’s called Eye Tracking.
Built-in Eye Tracking is available on iPhone 12 series models and newer (as well as on the third-generation iPhone SE), and it requires iOS 18 or above. On compatible iPad models, the feature requires iPadOS 18 or above. You can also use dedicated Made for iPhone (MFi) eye-tracking hardware on the iPad.
How to use Eye Tracking on your iPhone
The feature is buried inside the Accessibility settings
Assuming you have a compatible iPhone model running iOS 18 or newer, you can toggle on and configure Eye Tracking by following these steps:
- Launch the Settings application.
- Navigate to Accessibility > Physical and Motor > Eye Tracking.
- Flip the Eye Tracking switch to its on position.
Once toggled on, you’ll be walked through a quick calibration wizard. This involves tilting your head up and down, as well as following dots with your eyes as they move around the screen. After about thirty seconds, Eye Tracking will be ready to use.
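That dot-following dance is how gaze calibration generally works: the system shows targets at known positions, records where the raw tracker thinks you’re looking, and fits a correction that maps future raw estimates onto screen coordinates. Apple hasn’t published how iOS implements this, so the sketch below is a hypothetical, single-axis, least-squares version of the general idea; every name and value in it is illustrative.

```swift
import Foundation

// Hypothetical sketch of dot-based gaze calibration (one axis shown).
// Apple does not document iOS 18's actual calibration; this is only the
// least-squares idea behind wizards of this kind.

struct GazeSample {
    let target: Double   // known on-screen dot position
    let measured: Double // where the raw tracker thought you were looking
}

/// Fits a linear correction (measured -> target) by ordinary least squares.
func fitAxisCalibration(_ samples: [GazeSample]) -> (scale: Double, offset: Double) {
    let n = Double(samples.count)
    let sumX  = samples.reduce(0.0) { $0 + $1.measured }
    let sumY  = samples.reduce(0.0) { $0 + $1.target }
    let sumXY = samples.reduce(0.0) { $0 + $1.measured * $1.target }
    let sumXX = samples.reduce(0.0) { $0 + $1.measured * $1.measured }
    let scale = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX)
    let offset = (sumY - scale * sumX) / n
    return (scale, offset)
}

// Samples gathered while following the calibration dots (made-up values):
let samples = [
    GazeSample(target: 50,  measured: 62),
    GazeSample(target: 200, measured: 215),
    GazeSample(target: 350, measured: 368),
]
let cal = fitAxisCalibration(samples)

// Live gaze estimates afterwards get corrected the same way:
let correctedX = cal.scale * 240 + cal.offset
print("corrected gaze x:", correctedX)
```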
There are a few additional settings that can be configured to your liking once tracking itself is fully set up. These include:
- A Smoothing slider for trading pointer responsiveness against jitter (illustrated in the sketch after this list)
- A Snap to Item toggle that lets the cursor latch onto nearby on-screen items and controls
- A Zoom on Keyboard Keys toggle for magnifying the keys of the on-screen keyboard while you’re looking at them
- An Auto-Hide toggle that shows and hides the on-screen cursor based on a configurable duration
- A Dwell Control toggle that triggers a selection when you hold your gaze steady on an item
- A Show Face Guidance toggle to enable or disable tips and cues on how to stay optimally positioned
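To give a sense of what a slider like Smoothing might be adjusting, here’s a minimal, hypothetical sketch: an exponential moving average over raw gaze points, where a single coefficient trades responsiveness against jitter. The filter choice and the names are my assumptions, not Apple’s documented internals.

```swift
import CoreGraphics

// Hypothetical sketch of pointer smoothing as an exponential moving average.
// The filter and `alpha` are illustrative assumptions, not Apple's internals.
struct GazeSmoother {
    /// Closer to 1 = responsive but jittery; closer to 0 = smooth but laggy.
    var alpha: CGFloat
    private var previous: CGPoint? = nil

    mutating func smooth(_ raw: CGPoint) -> CGPoint {
        guard let prev = previous else {
            previous = raw   // first sample: nothing to blend with yet
            return raw
        }
        // Blend each new raw sample with the last pointer position.
        let next = CGPoint(x: prev.x + alpha * (raw.x - prev.x),
                           y: prev.y + alpha * (raw.y - prev.y))
        previous = next
        return next
    }
}

var smoother = GazeSmoother(alpha: 0.25)
let pointer = smoother.smooth(CGPoint(x: 120, y: 300))
```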
Does Eye Tracking actually work?
The feature is promising, but it’s still rough around the edges
In my experience, Eye Tracking on the iPhone is equal parts impressive and frustrating. Once configured, the circular cursor that appears on-screen is fairly stable and precise, though it does sometimes wander off on its own. Apple recommends placing your iPhone about 11.8 inches (30 centimeters) away from your face for optimal performance, and accuracy noticeably improved whenever I kept to that distance.
‘Dwelling’, or maintaining my gaze on a button or UI element to select it, took some time to get used to, and I had to concentrate quite hard to avoid triggering accidental inputs. Once I got the hang of things, my speed and accuracy improved considerably, though I could never shake the feeling that the controls were fiddly.
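For the curious, the dwell mechanic described above boils down to a small state machine: if the gaze stays within a small radius for long enough, fire a selection; if it drifts out, restart the timer. The sketch below is a hypothetical illustration of that logic, and the thresholds are made-up values rather than whatever iOS actually uses.

```swift
import Foundation
import CoreGraphics

// Hypothetical sketch of dwell selection: hold your gaze inside a small
// radius for long enough and a "tap" fires. Thresholds are illustrative.
struct DwellDetector {
    var radius: CGFloat = 24            // allowed gaze drift, in points
    var dwellTime: TimeInterval = 0.8   // how long the gaze must hold
    private var anchor: CGPoint? = nil
    private var anchorTime: Date? = nil

    /// Feed in gaze samples; returns the dwell point when a selection fires.
    mutating func update(gaze: CGPoint, at now: Date = Date()) -> CGPoint? {
        if let a = anchor, hypot(gaze.x - a.x, gaze.y - a.y) <= radius {
            // Still hovering near the anchor: fire once the time is up.
            if let start = anchorTime, now.timeIntervalSince(start) >= dwellTime {
                anchorTime = nil        // fire once, then wait for movement
                return a
            }
        } else {
            anchor = gaze               // gaze moved: restart the dwell timer
            anchorTime = now
        }
        return nil
    }
}

var dwell = DwellDetector()
let start = Date()
_ = dwell.update(gaze: CGPoint(x: 100, y: 100), at: start)
if let hit = dwell.update(gaze: CGPoint(x: 104, y: 98), at: start.addingTimeInterval(0.9)) {
    print("dwell selection at", hit)    // fires: gaze held within the radius
}
```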
Thanks to the virtual AssistiveTouch button that appears whenever Eye Tracking is enabled, it’s relatively easy to access core elements and functions of the system, including Notification Center, Control Center, Siri, Scrolling, Home, and more. Considering how little dedicated tracking hardware the iPhone has to work with, it’s impressive that just about everything can theoretically be done in a hands-free fashion.
In its current state, Eye Tracking is very clearly intended as an accessibility tool for those with motor disabilities, rather than a mainstream navigational option that can be used as a primary input method. As such, the feature has some very real roughness around its edges, though it nonetheless manages to impress me in practice. With a heavy-handed dose of AI magic in the background, as well as perhaps some kind of IR-style tracking contact lenses, I can envision Eye Tracking one day genuinely taking off in certain contexts.