Amazon has quietly moved one step closer to a future where delivery drivers wear heads-up displays instead of staring at phones, unveiling a prototype system for its couriers that the company says puts real-time navigation, package scanning, and hands-free proof-of-delivery into the driver’s line of sight.
The glasses, shown in a short post on Amazon's About Amazon blog, combine a small display in the wearer's field of view with an always-on camera and tight integration with a controller mounted on a delivery vest. In Amazon's demo footage, the device overlays walking directions, shows which package to take from the van, and lets a driver capture proof-of-delivery photos without digging out a phone. Amazon says the system is designed to reduce time spent looking down at handheld devices and to make work in tricky environments — gated complexes, multiunit apartment buildings, poorly lit yards — safer and faster for drivers.
Amazon frames these features as practical fixes: identifying the right package inside a crowded van, turn-by-turn guidance from van to doorstep, and a vest button that triggers hands-free photo capture at the door — a workflow meant to eliminate the awkward "please don't take your package yet, I have to take a pic" exchanges that customers sometimes experience.
Amazon's images suggest a two-camera layout — one camera centered above the nose bridge and another near the temple — and the glasses pair with a vest that houses a swappable battery and a tactile controller with a dial and an emergency button. The frames appear slimmer than many early AR prototypes, and Amazon says the lenses support both transitions-style tinting and prescription inserts, a signal that the company is thinking about comfort across long shifts and varied lighting.
Reporters who have examined Amazon's reveal note that the delivery glasses, reportedly codenamed Amelia, are distinct from a separate Amazon consumer project codenamed Jayhawk, which is expected to sport a slimmer profile and a consumer-oriented feature set down the line.
Testing and how soon drivers might see them
Amazon says hundreds of delivery drivers have already tested early versions of the glasses, but the rollout timeline remains vague: the company says only that it intends to keep developing the hardware and to expand AI-driven features over time. Outside reporting earlier this year described pilots and internal codenames for both the delivery and consumer efforts, and industry coverage has pegged Amazon's broader consumer ambitions in the AR glasses market to the 2026–27 window.
For now, the glasses are explicitly a task-oriented tool for Delivery Service Partner drivers rather than something Amazon expects consumers to buy, though the same technologies often migrate from enterprise to consumer markets over time.
Features Amazon wants to add next
Amazon’s write-up hints at future capabilities that lean heavily on computer vision and on-device AI: real-time defect detection to spot wrong-drop deliveries, alerts for hazards such as low light, automatic lens adjustments, and even pet detection to warn drivers before they approach a porch. The company frames those as safety and accuracy improvements, and it emphasizes the role of driver feedback in shaping the devices during testing.
Independent coverage underscores that this is the expected trajectory for industrial AR wearables — start with narrowly scoped, productivity-boosting tools and incrementally layer on context-aware intelligence.
Ethical and privacy questions that weren’t addressed
Amazon’s blog post does not grapple with the thorny questions that naturally follow whenever constant cameras and automated monitoring are introduced to frontline work. Critics and labor advocates will likely ask how video and sensor data are stored, who can access footage, whether the glasses create new surveillance pressure on drivers’ schedules and behavior, and how mistakes made by on-device AI will be handled in disputes with customers or among workers.
The broader debate is familiar: employers can legitimately argue that body-mounted cameras and vision systems reduce errors and improve safety, while workers and privacy advocates worry that continuous monitoring can erode autonomy, shift the burden of proof onto employees, or be used in ways that affect compensation and discipline. Amazon’s public announcement leaves those trade-offs largely unaddressed.
What this means for consumers and the wearables market
In the short term, customers are most likely to notice incremental improvements in delivery speed and fewer fumbling phone checks from drivers once pilots scale. The more consequential effects will appear in how data from these systems is governed and how workers experience the shift toward AI-assisted labor.
In the longer run, Amazon's delivery glasses also function as a public testbed for optical, sensing, and AI stacks that could feed a consumer device down the line — the aforementioned Jayhawk project — which industry observers expect to be a slimmer, feature-rich product aimed at general users in the 2026–27 timeframe. If Amazon successfully migrates those core capabilities from a workplace tool into a convincing consumer package, it will ratchet up competition with companies already shipping AR eyewear and put Alexa and Amazon services squarely into a new wearable interface.