When Apple whispers “next big thing,” the tech world leans in. And, according to people familiar with Apple’s secretive hardware labs, that whisper is about a pair of smart glasses that could hit shelves as early as late 2026. While we’ve been dazzled by Apple Vision Pro’s immersive (if pricey) spatial computing, these spectacles promise a subtler, more everyday fusion of digital smarts with your real world—sans bulky headset.
Bloomberg first reported that Apple engineers have begun ramping up prototype production, targeting the end of this year for large-scale builds with overseas suppliers. Those prototypes will likely morph into a retail-ready device by the close of 2026.
These glasses are said to pack cameras, microphones, and tiny speakers, enabling them to “see” your surroundings and respond to queries via Siri. Imagine asking, “What’s that landmark?” or “Translate this sign,” and hearing an answer right in your ear—no phone lift required. Early reports even mention an in-house Apple chip under the hood to handle on-device processing, though full-fledged augmented reality overlays remain “years away,” suggesting that initial models will focus on audio-visual smarts rather than virtual objects floating in your field of view.
Beyond just looking cool, Apple’s smart glasses are rumored to:
- Handle phone calls: Make and receive calls without digging out your iPhone.
- Control music playback: Skip, pause, or crank up your favorite tracks via voice.
- Perform live translations: Converse in foreign languages with real-time interpretation.
- Provide turn-by-turn directions: Navigate city streets using Siri’s voice prompts.
These features echo those in Meta’s Ray-Ban smart glasses and upcoming Android XR eyewear but with Apple’s emphasis on polish and seamless ecosystem integration.
Apple’s chip ambitions for these glasses mirror its broader silicon strategy: bake in proprietary processors to offload tasks from the iPhone and keep data more private (and speedy). But unlike Vision Pro’s dual-OLED displays, early smart-glasses iterations won’t overlay digital objects on real surroundings. Instead, they’ll rely on audio and camera feeds to contextualize your environment—a stepping stone toward future AR glasses that could project virtual elements into your world, perhaps later this decade.
Apple won’t be alone in the smart-eyewear arena:
- Meta Ray-Ban: Launched in late 2023, these glasses sold over 1 million pairs last year, proving there’s consumer appetite for discreet audio-visual wearables.
- Google’s Android XR: Partnering with Xreal, Warby Parker, Samsung, and Gentle Monster to roll out a range of AI-enabled glasses early next year.
- Startups and legacy brands: From niche AI glasses startups to luxury labels experimenting with “smart pendants” and other form factors, the market is buzzing with prototypes.
Meta isn’t resting on its Ray-Ban laurels; it’s developing versions with small displays and better AR integration, dubbed Orion, targeting a higher-end segment. Meanwhile, Google’s Android XR collaborators promise a spectrum of stylish and functional options. Apple’s edge, though? It’s a tight hardware-software marriage and Siri’s deep ties across iOS, iPadOS, macOS, and visionOS ecosystems.
This smart-glasses push coincides with a broad recalibration of Apple’s AI strategy. While the tech giant has been slower than rivals to embrace generative AI, it has quietly bolstered on-device machine learning features—such as the iPhone 16’s Camera Control and Visual Intelligence—across iOS, iPadOS, and visionOS. The glasses could serve as a showcase for these smarts, especially around contextual awareness, while keeping personal data on the device.
Interestingly, Bloomberg also reports that Apple quietly scrapped plans for a camera-equipped Apple Watch, originally slated for a 2027 release. That project, which would have let your watch “see” and analyze your environment (think real-time workout form correction or personal safety alerts), has been shelved—perhaps in favor of integrating similar capabilities into the more versatile glasses form factor. And yes, camera-equipped AirPods still appear to be in the pipeline, hinting at a future where Apple’s headphones, glasses, and watch work in concert to sense and react to your world.
Just a day before the Bloomberg scoop, OpenAI’s Sam Altman announced the acquisition of io, a hardware design startup led by former Apple design chief Jony Ive. Sources say Altman and Ive aim to ship a screen-free AI device—packed with cameras and mics—sometime next year, offering a glimpse of how Apple’s own design guru might have approached such glasses. The io device, however, is said to be pocket-sized and non-eyewear, hinting at a diverse future for AI wearables beyond frames.
This flurry of activity—from Google’s Android XR partnerships to Meta’s Orion prototypes and OpenAI’s screenless gizmo—signals that the smart eyewear market is on the cusp of an explosion. Apple’s upcoming glasses, if they live up to the hype, could set the bar for hardware quality, user experience, and privacy standards.
Until Apple flips the switch on these glasses, we’ll be left deciphering supply-chain signals and occasional patent filings. But make no mistake: smart eyewear is Apple’s next frontier, and by this time next year, you may be asking Siri for directions through a pair of sleek, subtly glowing frames perched on your nose.