iFixit tore open Meta’s new Ray-Ban Display glasses and what stands out isn’t a mini computer or a clever camera — it’s the actual lenses. The glass is doing a lot of the heavy lifting, and that design choice has some neat user benefits and some gnarly downsides.
Walk around a coffee shop wearing a pair of smart glasses and most people expect to see a tiny camera hump, a microphone array or maybe some LEDs. The Ray-Ban Display hides its headline feature where sunglasses traditionally do the least work: in the lens. iFixit’s teardown shows the right lens contains a reflective, geometric waveguide — basically a lattice of micro-mirrors embedded in the glass — that routes light from a tiny projector in the temple straight to your eye while keeping that light largely invisible to anyone looking at you. The benefit is simple: the person across from you won’t see your notifications floating over your face.
How it works
The projector lives inside the right arm — iFixit identifies it as a liquid-crystal-on-silicon (LCoS) micro-projector that mixes light from three LEDs to form a 600×600 image. That light is injected into the edge of the lens and hits the geometric waveguide, which uses partially reflective mirrors to bounce the beam out only at specific angles so the wearer sees the image. It’s a neat marriage of old projector tech and new lens engineering.
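The partial-mirror idea can be sketched numerically. A common design constraint for mirror-array waveguides (an assumption here for illustration — iFixit didn't publish the mirror count or coatings in the Ray-Ban Display) is that every mirror should send an equal slice of the injected light to the eye, so the image looks evenly bright across the lens. That forces each successive mirror to reflect a larger fraction of whatever light is left:

```python
# Toy model of a geometric (mirror-array) waveguide: light enters at the
# lens edge and a chain of partially reflective mirrors each redirects a
# slice of it toward the eye. Illustrative numbers only — the real mirror
# count and reflectivities in the Ray-Ban Display are not public.

def mirror_reflectivities(n_mirrors: int) -> list[float]:
    """Reflectivity each mirror needs so all mirrors outcouple equal power."""
    # Mirror k only sees what earlier mirrors left behind; to emit 1/n of
    # the original beam it must reflect 1/(n - k) of the remainder.
    return [1.0 / (n_mirrors - k) for k in range(n_mirrors)]

def outcoupled_fractions(reflectivities: list[float]) -> list[float]:
    """Fraction of the injected light each mirror sends toward the eye."""
    remaining = 1.0
    out = []
    for r in reflectivities:
        out.append(remaining * r)   # slice redirected to the wearer's eye
        remaining *= (1.0 - r)      # the rest continues down the lens
    return out

if __name__ == "__main__":
    rs = mirror_reflectivities(5)
    print([round(r, 3) for r in rs])  # reflectivities rise toward the last mirror
    print([round(f, 3) for f in outcoupled_fractions(rs)])  # each slice equal
```

Note how the last mirror in the chain must be fully reflective: it gets whatever light survived the trip, and dumping all of it keeps that final slice as bright as the first.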
This reflective, geometric approach contrasts with the diffractive waveguides used in earlier AR glasses. Diffractive systems bend and split light using etched gratings and can create annoying rainbow artifacts or “eye glow” for onlookers — that iridescent shimmer you sometimes notice in photos. The Ray-Ban Display’s method reduces those artifacts by controlling reflections instead of relying on diffraction. That’s why the glass itself — not the chipset or the camera — is the real headline in this teardown.
The tradeoffs: gorgeous optics, expensive glass
There’s always a cost. Geometric, mirror-based waveguides are more expensive to produce, in both materials and manufacturing process, than simple plastic optics. iFixit and several outlets note that the specialized glass likely drives much of the product’s bill of materials — enough that iFixit speculates Meta might be subsidizing the hardware or selling at a loss to get the tech into people’s hands. That’s plausible: optics like these require precision manufacturing and often come from a small set of suppliers with niche tooling.
For users, the upside is obvious: a clearer, less artifact-prone image and better privacy for onlookers. For Meta, the downside is an expensive part that may not scale easily or cheaply if demand spikes.
The rest of the hardware (and how ordinary it looks after the lens reveal)
Under that elegant glass is familiar wearables plumbing: Meta’s Ray-Ban Display runs a Snapdragon AR1-class chip with modest RAM and local storage, a 12MP camera, open-ear speakers, a mic array, and the usual Bluetooth-plus-companion-app setup. Battery life is about what you’d expect from a full-feature wearable: iFixit measured roughly six hours of mixed use and found a 960 mWh cell inside. So yes, it behaves like a modern smart wearable; the lens is what tries to make it feel like “AR” rather than “phone on your face.”
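Those two teardown figures — a 960 mWh cell and roughly six hours of mixed use — imply an average power draw in the low hundreds of milliwatts, which a quick back-of-the-envelope check confirms:

```python
# Back-of-the-envelope power budget from iFixit's teardown figures:
# a 960 mWh battery lasting about six hours of mixed use.
battery_mwh = 960.0   # measured cell capacity
runtime_h = 6.0       # approximate mixed-use runtime

avg_draw_mw = battery_mwh / runtime_h
print(f"average draw ≈ {avg_draw_mw:.0f} mW")  # prints "average draw ≈ 160 mW"
```

Roughly 160 mW averaged over a session — modest next to a smartphone, which is what you'd hope for from hardware squeezed into glasses temples.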
The repairability problem (and why iFixit is unimpressed)
Here’s where the teardown gets uglier. To get at the guts, iFixit had to literally split the arms and frame — Meta didn’t design the glasses with repair or easy part swaps in mind. Internal ribbon cables, sticky adhesives and components glued into thin plastic shells mean that a battery swap, lens replacement or other routine repair isn’t a simple “pop-off” job. iFixit’s teardown tech Shahram Mokhtari summed it up bluntly: these first iterations are effectively unrepairable without specialized tools and skills. That raises the usual sustainability flags: expensive device, disposable by design, and no replacement parts on day one.
Why this matters beyond one pair of shades
We’re at a recurring crossroads for consumer AR: do you prioritize optical quality and the small set of companies that can produce it, or do you prioritize repairability, modularity and cheaper scale? Meta’s choice — to bet on better glass today — makes sense if your immediate product goal is a pleasant, artifact-free AR experience that actually looks like something people will wear in public. But it also locks the hardware into a narrow supply chain and a repair model that channels users toward replacement rather than recycling.
There’s also a privacy flip side. On the one hand, lenses that prevent onlookers from seeing your display protect you from curious strangers. On the other, embedding high-quality optics and cameras into everyday eyewear raises familiar questions about covert recording and social norms: when does “invisible” content become creepy? The optics don’t answer that; design and policy do.
The takeaway
If you’re excited about consumer AR that looks and acts more like normal glasses than a glowing HUD, the Ray-Ban Display shows a practical route forward: put the cleverness into the lens. But the polish comes with tough tradeoffs — expensive glass, limited repairability, and a product that seems designed to be replaced rather than repaired. For early adopters who want the cleanest, least distracting AR experience available right now, that’s a fair trade. For anyone thinking about longevity, sustainability, or a sensible repair ecosystem, it’s a reminder that engineering beauty sometimes arrives with a brittle hinge.