Smart Glasses Are Everywhere. The Use Case Isn't.
After a year testing dozens of AI-powered eyewear products, The Verge finds the category's main appeal is discretion — and that's not enough to sustain a market.
The most revealing thing about today’s AI-powered eyewear isn’t what it does — it’s how it does it without anyone noticing. The best products in this category earn their appeal through invisibility: interacting with an AI layer while looking, to everyone else, like a person simply wearing glasses. That’s a genuinely novel experience. It’s also nearly the entire value proposition for this hardware generation, and a category can’t sustain itself on covert novelty alone.
According to The Verge, which has spent a year evaluating a strikingly large collection of competing hardware — including the Even Realities G2, Meta Ray-Ban Display and Neural Band, Rokid, Xreal, RayNeo, Lucyd, and Oakley Meta HSTN — the smart glasses market has arrived at a peculiar stalemate: rapid product proliferation without meaningful differentiation, and sweeping ambition without follow-through.
A Crowded Shelf of Indistinguishable Hardware
Manufacturers across the segment make nearly identical promises. AI eyewear will monitor dietary habits to improve health, passively transcribe ambient conversation into notes, and reshape environmental inputs into music queues or spontaneous social activity recommendations. The Verge reports that after extensive real-world use, none of these features have materialized into reliably useful, day-to-day experiences. Compounding the problem: the hardware itself has converged on a near-identical design language — chunky frames, discreet cameras, open-ear audio — making differentiation on aesthetics nearly impossible.
Where These Devices Work, and Why That’s Insufficient
The genuinely impressive moments are instructive. The Even Realities G2 can be operated via a companion smart ring, surfacing a discreet display visible only to the wearer — useful for private prompting in a social setting, with no external tell. The Meta Ray-Ban Display offers comparable eyes-only interaction. These are real, novel capabilities.
But they also reveal the category’s structural limitation: the standout feature is going undetected. That’s a discretion story, not a productivity breakthrough. The Verge also flags the uncomfortable inverse — testing the Meta Ray-Ban Display led to the unintentional capture of a bystander on video, a reminder that covert capability is indiscriminate: what’s hidden from an intended audience is equally hidden from an accidental one.
Why This Matters
Smart glasses are the most tangible frontier of ambient computing — the long-anticipated shift away from screen-mediated interaction toward always-on, embodied AI that recedes into daily life. If this hardware generation fails to articulate a durable use case beyond the quiet-spy sensation of hidden tech, it risks foreclosing the broader ambient computing narrative before it reaches mainstream credibility. The devices exist. The story — the “why wear this all day” story — still doesn’t. In consumer hardware, that gap is often fatal.
Frequently Asked Questions
Do any smart glasses on the market in 2026 actually deliver on their AI promises?
According to The Verge’s year-long review, none of the current crop — including Meta Ray-Ban, Even Realities G2, Rokid, and others — consistently delivers on marketed capabilities like health tracking, ambient note capture, or AI-driven personalization.
What distinguishes the Even Realities G2 from other smart glasses?
The G2 pairs with a companion smart ring for gesture-based control, letting wearers interact with a private display that remains invisible to anyone else in the room — a notable differentiator in an otherwise homogeneous hardware category.