If you have ever tried to send a quick text from a crowded subway or glance at your speaking notes without obviously looking down at your phone, Meta’s latest update to its Ray-Ban Display glasses is aimed squarely at you. The company is rolling out two new tricks — a teleprompter mode and “virtual” handwriting via its Neural Band — that push these glasses a little further into the future that smart wearables have been promising for a decade.
At a basic level, Meta is solving a very human problem: talking to technology in public is awkward. Until now, if you wanted to reply to a WhatsApp or Messenger message on Ray-Ban Display, you were mostly stuck with voice dictation or canned responses, which is fine at home but feels weird in a café or on a quiet train. The new EMG handwriting feature flips that dynamic by letting you keep your mouth shut and simply "write" with your hand on any surface while wearing the Neural Band on your wrist. The band picks up tiny electrical signals in your muscles, interprets them as letters, and translates them into text that appears on the in-lens display, ready to send. Meta is pitching this as the first wrist device that can do neural handwriting on any surface, and for early adopters it gives glasses a surprisingly old-school, pen-and-paper vibe, just without the pen or the paper.
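Meta has not published how its decoder actually works, but the general shape of an EMG-to-text pipeline is well understood in the research literature: sample the electrical signals, summarize them over short windows, and hand those features to a trained classifier. The sketch below is purely illustrative; the sampling rate, channel count, window size, and stub classifier are all assumptions, not anything Meta has confirmed.

```python
import numpy as np

SAMPLE_RATE_HZ = 2000   # assumed surface-EMG sampling rate
WINDOW_MS = 50          # assumed feature-window length
CHANNELS = 8            # assumed electrode count on the band

def extract_features(window: np.ndarray) -> np.ndarray:
    """Classic surface-EMG features per channel: mean absolute value,
    root mean square, and zero-crossing count."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    zc = np.count_nonzero(
        np.diff(np.signbit(window).astype(np.int8), axis=0), axis=0)
    return np.concatenate([mav, rms, zc])

def decode_stroke(emg: np.ndarray, classify) -> str:
    """Slice one stroke's worth of EMG into windows, featurize each,
    and let a trained classifier map the sequence to a character."""
    samples = SAMPLE_RATE_HZ * WINDOW_MS // 1000
    n_windows = emg.shape[0] // samples
    feats = np.stack([extract_features(emg[i * samples:(i + 1) * samples])
                      for i in range(n_windows)])
    return classify(feats)  # in reality, a sequence model trained offline

# Fake one second of 8-channel data and a stub classifier to show the flow.
stroke = np.random.randn(SAMPLE_RATE_HZ, CHANNELS)
print(decode_stroke(stroke, classify=lambda feats: "h"))  # appended to the in-lens draft
```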
Right now, handwriting is limited: it is in early access, English-only, US-only, and it works with WhatsApp and Messenger rather than every app on your phone. But the core idea is powerful. You could be sitting in a meeting, tracing letters with a finger on the table, and quietly sending a full, custom reply without anyone realizing you are texting. For people who hate voice dictation, or for creators trying to jot down a line while filming hands-free, this is a genuinely new input method, not just another gesture shortcut.
The other big addition, the teleprompter, is where the “Display” part of Ray-Ban Display really starts to matter. Meta now lets you copy text from your phone — from Google Docs, a notes app, or even an AI-written script — and beam it into your glasses as a series of text cards that float in front of your eye. You navigate these cards using the Neural Band, so you can scroll through your script with subtle gestures instead of fumbling with a phone screen mid-presentation.
For anyone who has ever tried to deliver a talk, record a YouTube video, or host a live stream while keeping steady eye contact, this is instantly appealing. The system supports up to roughly 16,000 characters of text, about half an hour of spoken material, which makes it useful not just for one-liners but for full presentations or longer creator monologues. In practice, it turns your glasses into an almost invisible confidence monitor, showing a timer, the current time, and your lines of text while letting you look straight at your audience or camera.
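The half-hour figure checks out with simple arithmetic: at roughly five characters per word, 16,000 characters is about 3,200 words, and at a conversational 110 words per minute that comes to around 29 minutes. The pagination itself is also straightforward in principle. Here is a hypothetical sketch of the kind of chunking a teleprompter might do; the per-card size is a made-up value, not Meta's actual layout.

```python
import textwrap

MAX_SCRIPT_CHARS = 16_000   # Meta's stated cap
CARD_CHARS = 280            # assumed size of one in-lens card

def paginate(script: str) -> list[str]:
    """Split a script into display cards, breaking at word boundaries."""
    if len(script) > MAX_SCRIPT_CHARS:
        raise ValueError("script exceeds the 16,000-character limit")
    return textwrap.wrap(script, width=CARD_CHARS)

def estimated_minutes(script: str, wpm: int = 110) -> float:
    """Rough speaking time: ~5 characters per word at a given pace."""
    return len(script) / 5 / wpm

cards = paginate("Good evening everyone, and welcome to the show. " * 40)
print(len(cards), "cards;", f"{estimated_minutes('x' * 16_000):.0f} min at 110 wpm")
```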
It is also a clever way for Meta to make the Neural Band feel less like an optional extra and more like part of the core experience. With both handwriting and the teleprompter, the band becomes your primary control surface — a quiet, wrist-based remote that lets you move through notes, pause, resume, or answer messages with small gestures. For creators, that combination could be compelling: imagine shooting vertical video with your glasses’ camera, reading your script in-lens, and tweaking phrasing on the fly by handwriting new lines with a finger on the table.
Beyond communication and scripting, Meta is also nudging Ray-Ban Display toward being a more practical everyday navigation tool. The Pedestrian Navigation feature — which overlays walking directions in your field of view — is still in beta but is now expanding to four more US cities: Denver, Las Vegas, Portland, and Salt Lake City, bringing the total to a few dozen supported locations. On paper, it is one of the most obvious use cases for AR-style glasses: look around a new city with arrows and turn-by-turn cues hovering in front of you, instead of staring down at your phone like everyone else.
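The geometry behind those turn-by-turn cues is simple in principle: compute the bearing from your current position to the next waypoint, compare it with the direction you are facing, and render an arrow for the difference. A hypothetical sketch, with made-up Denver coordinates:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def turn_cue(heading_deg, target_bearing_deg):
    """Signed angle to the next waypoint: negative means turn left."""
    delta = (target_bearing_deg - heading_deg + 180) % 360 - 180
    if abs(delta) < 15:
        return "straight ahead"
    return f"turn {'right' if delta > 0 else 'left'} {abs(delta):.0f} deg"

# Made-up example: walking due north in downtown Denver toward a waypoint.
b = bearing_deg(39.7392, -104.9903, 39.7402, -104.9880)
print(turn_cue(heading_deg=0.0, target_bearing_deg=b))  # e.g. "turn right 60 deg"
```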
All of this comes with a bit of tension around availability. Meta has quietly paused its international rollout of Ray-Ban Display, delaying launches in markets like the UK, France, Italy, and Canada even as it debuts these new features at CES. So the people who might benefit most from discreet, on-the-go tools — journalists, on-camera hosts, business travelers — largely have to be in the US to even try this out. It is a very Meta move: push the software envelope quickly, but keep the hardware on a tighter leash while the company figures out regulation, support, and demand country by country.
Still, taken together, the teleprompter, EMG handwriting, and expanded navigation paint a clearer picture of what Meta wants Ray-Ban Display to be. Instead of just “smart sunglasses with a camera,” these start to feel like a low-key productivity and creativity tool — something you might reach for before a big pitch, a livestream, or a day exploring a new city. They are not full-blown mixed reality headsets, and they are still bound by battery life, display constraints, and a limited app model, but they are inching into territory where a regular person can see the appeal beyond novelty.
For now, these features will live in that early adopter zone: people willing to wear logoed frames with a camera and explain, again, that yes, they are recording. But if Meta can make handwriting feel natural, keep the teleprompter readable outdoors, and continue to add genuinely useful, not gimmicky, features, Ray-Ban Display edges closer to the dream pitch for smart glasses: they help you stay present, informed, and on-script, while everyone else just thinks you are wearing a regular pair of shades.
