Imagine scrolling through a webpage or flipping through a photo gallery without lifting a finger—just by moving your eyes. According to Bloomberg’s Mark Gurman, Apple is experimenting with exactly that for its Vision Pro headset, a mixed-reality device that’s already pushing the boundaries of how we interact with technology. The feature, still in development, would lean on the headset’s advanced eye tracking to let users navigate apps with their gaze alone, potentially redefining the user experience in spatial computing.
Gurman reported in his Power On newsletter that this feature would work across Apple’s built-in Vision Pro apps, with plans to open it up to third-party developers. While details are scarce, the idea is tantalizing. Picture this: you’re browsing a virtual storefront in the Vision Pro, you let your gaze linger at the bottom of the screen, and the page gently scrolls down. Or maybe you focus on a specific button and flick your eyes upward to move through a menu. It’s a sci-fi dream inching closer to reality.
Apple hasn’t spilled the beans on the mechanics, but we can make some educated guesses. The Vision Pro already boasts sophisticated eye-tracking tech, using cameras and sensors to pinpoint exactly where you’re looking with remarkable precision. This isn’t new territory for Apple—the company has been refining eye-tracking for accessibility features like Dwell Control, which lets users interact with interfaces by resting their gaze on specific elements. With Dwell, you can trigger actions like opening menus or scrolling by staring at an icon for a set time. But it’s clunky, requiring patience and deliberate pauses, and it’s hard to imagine Apple settling for that in a flagship feature.
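The dwell idea is simple to sketch in code. Here’s a hypothetical Swift model of dwell-style activation—the `DwellTracker` type, its string element IDs, and the one-second threshold are all illustrative assumptions, not Apple’s actual API:

```swift
import Foundation

// Hypothetical sketch of dwell activation: an element "fires" once the
// gaze has rested on it continuously for a threshold (1.0 s here, made
// up for illustration). Looking away resets the timer.
struct DwellTracker {
    let threshold: TimeInterval = 1.0
    private var focusedElement: String?
    private var accumulated: TimeInterval = 0

    // Feed gaze samples each frame; returns the element ID when a dwell completes.
    mutating func update(gazeOn element: String?, deltaTime: TimeInterval) -> String? {
        guard element == focusedElement else {
            focusedElement = element   // gaze moved to something new: restart the clock
            accumulated = 0
            return nil
        }
        accumulated += deltaTime
        if let element, accumulated >= threshold {
            accumulated = 0            // fire once, then re-arm
            return element
        }
        return nil
    }
}
```

The clunkiness the paragraph describes is visible right in the sketch: nothing happens until the full threshold elapses, so every action costs a deliberate pause.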
Instead, the new system might involve smoother, more intuitive mechanics. For example, looking at the edge of a page for a moment could initiate scrolling, with the speed adjusting based on how far your gaze shifts. Or perhaps you’d lock onto a UI element—like a slider or button—and then glance above or below it to move the content. The challenge is making it feel natural, not disorienting. If you’ve ever tried to control a cursor with your eyes, you know it can feel like wrestling a hyperactive puppy. Apple’s likely aiming for something seamless, where the tech fades into the background, and you’re just… scrolling.
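That edge-of-page idea could be sketched roughly like this in Swift. To be clear, this is pure speculation dressed up as code: the function, the dead zone, and the speed values are invented for illustration, not anything Apple has confirmed:

```swift
import Foundation

// Hypothetical sketch: map a normalized vertical gaze position
// (0 = top of the view, 1 = bottom) to a scroll velocity. While the
// gaze stays in a central "dead zone" nothing moves; past it, speed
// ramps up the closer the gaze gets to the edge.
func scrollVelocity(forGazeY gazeY: Double,
                    deadZone: Double = 0.35,
                    maxSpeed: Double = 600.0) -> Double {
    let offset = gazeY - 0.5                      // signed distance from center
    let magnitude = abs(offset)
    guard magnitude > deadZone else { return 0 }  // eyes near center: no scroll
    // Normalize the overshoot into 0...1, then scale to points per second.
    let ramp = (magnitude - deadZone) / (0.5 - deadZone)
    return (offset < 0 ? -1 : 1) * min(ramp, 1.0) * maxSpeed
}
```

The dead zone is the part that would make or break the feel: too small and the page twitches every time your eyes wander, too large and scrolling feels unresponsive—exactly the hyperactive-puppy problem.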
Right now, Vision Pro users have a few ways to scroll through content. The default method is a pinch gesture—pinching your thumb and index finger to “grab” the interface, then dragging your hand up or down. It’s intuitive for anyone familiar with touchscreens, but it can feel awkward during long sessions, especially if you’re holding your arms up in a virtual workspace. You can also pair a Bluetooth mouse or keyboard for a more traditional setup, or even use a game controller’s analog stick for navigation. Each method has its place, but none feel as futuristic as eye-based control.
The eye-tracking feature could be a game-changer, especially for hands-free scenarios. Imagine cooking in the kitchen, following a recipe in the Vision Pro, and scrolling through steps without touching anything. Or working in a virtual office, flipping through documents while your hands stay on a physical keyboard. It’s the kind of innovation that could make the Vision Pro feel less like a gadget and more like an extension of your body.
Eye-tracking isn’t just a cool trick—it’s a step toward making mixed reality more accessible and intuitive. The Vision Pro, priced at $3,499, is a premium device aimed at early adopters, developers, and professionals. But for it to go mainstream, Apple needs to make the experience effortless. Current input methods, while functional, can feel like a compromise between traditional computing and the promise of spatial interfaces. Eye-based scrolling could bridge that gap, offering a control scheme that’s uniquely suited to a headset.
It’s also a nod to accessibility. Features like Dwell Control were designed for users with motor impairments, allowing them to navigate devices without physical gestures. By building on this foundation, Apple could make the Vision Pro more inclusive, letting a wider range of users interact with its virtual worlds. And since Gurman notes that third-party developers might get access, we could see creative implementations in apps like games, productivity tools, or even social platforms.
This eye-tracking feature is reportedly part of a broader visionOS 3 update, which Gurman describes as a “pretty feature-packed release.” Apple’s Worldwide Developers Conference (WWDC) in June 2025 is the likely stage for its unveiling, alongside other software updates for iOS, macOS, and more. While we don’t have specifics on what else visionOS 3 might bring, the Vision Pro’s first year has already seen steady improvements. Since its launch in February 2024, Apple has rolled out updates to enhance 3D content, improve hand-tracking, and integrate the headset more tightly with Mac ecosystems.
There’s also buzz about hardware. Gurman and others have speculated about a lower-cost Vision Pro model, possibly arriving in 2026, which could broaden the device’s appeal. Eye-tracking features like this one might play a key role in making a cheaper model feel premium, even if it cuts corners on displays or processing power. For now, though, the focus is on software—and making the current Vision Pro as polished as possible.
Apple isn’t alone in exploring eye-tracking. Companies like Meta, with its Quest headsets, and Sony, with the PlayStation VR2, have experimented with similar tech, though mostly for gaming or accessibility. What sets Apple apart is its knack for taking niche features and making them feel indispensable. Think of the iPhone’s multitouch gestures or the Apple Watch’s Digital Crown—both were novel at the time but quickly became second nature. Eye-based scrolling could follow that path, especially if Apple nails the execution.
There’s a broader trend here, too. As mixed reality evolves, we’re moving away from traditional inputs like mice and keyboards toward interfaces that leverage our bodies—hands, voice, and now eyes. It’s a shift that could redefine computing, not just for headsets but for any device that blends digital and physical worlds. Apple, with its massive ecosystem and obsessive focus on user experience, is well-positioned to lead that charge.
Plenty of questions remain. How precise will the scrolling be? Will it work in fast-paced apps like games, or is it better suited for leisurely browsing? And what about eye strain—will staring at virtual pages for hours leave users bleary-eyed? Apple’s engineers have their work cut out for them, balancing innovation with comfort.
There’s also the question of developer adoption. If third-party apps can tap into this feature, we could see a wave of creative uses—but only if Apple makes it easy to implement. The Vision Pro’s app ecosystem is still growing, and while heavyweights like Microsoft and Adobe have jumped on board, Apple needs to keep the momentum going.
For now, the eye-tracking scroll feature is just a rumor—one backed by a credible source, but a rumor nonetheless. If it pans out, it could be a small but significant step toward making the Vision Pro feel magical. At WWDC 2025, we’ll likely get a clearer picture of Apple’s plans, not just for this feature but for the future of spatial computing.
Until then, the idea of scrolling with your eyes is a reminder of why the Vision Pro exists: to push the boundaries of what’s possible. It’s not perfect, and at its current price, it’s not for everyone. But with each update, Apple is inching closer to a future where mixed reality feels as natural as swiping on a phone. And if they pull off eye-based scrolling, that future might be closer than we think.