Apple is building glasses without a display. No augmented reality, no Vision Pro successor. Microphone, camera, Siri – done. Anyone reading this as a retreat hasn't understood what's happening here.
The Vision Pro was technically impressive. And yet it sits on the shelf for most buyers. Apple charged over $3,000 for a device you don't wear at a café, in meetings, or on the street. A product nobody puts on doesn't solve a problem – no matter how good the displays are.
These new glasses are the correction. And corrections require courage.
# What Mark Gurman reports – and what it means
Bloomberg reporter Mark Gurman writes that Apple is testing four designs: a large rectangular frame, a slimmer variant similar to Tim Cook's glasses, a larger oval shape, and a smaller oval variant. Colors include black, ocean blue, and light brown. Launch planned for 2027, possible announcement this year.
No displays. Instead: take photos and videos, answer calls, listen to music, talk to Siri.
That sounds like less. But it's more focused. Apple has decided what these glasses won't do – and that's the real design decision.
# Meta delivered the proof
The Ray-Ban collaboration with Meta isn't a hype product. It's an everyday object. People wear them jogging, shopping, at the office. No sci-fi aesthetic, no worrying about how they look to others.
The market responded positively. Meta demonstrated that audio-first wearables work when they look like normal glasses. Apple observed this. Now Apple is following – with its own ecosystem, its own chip, its own Siri upgrade.
This isn't imitation. Apple is jumping on a concept that's already proven itself and building it out with its own resources. That's product strategy, not weakness.
# Why the display was the problem
A display on glasses changes everything: the shape, the weight, battery drain, heat generation, the price. And above all: the social dynamic. Anyone wearing AR glasses sends a signal. Not everyone wants to send that signal.
Google Glass failed because of this. Not because of the technology – because of social rejection. People didn't want to be seen as "Glassholes." The device was too visibly foreign.
Glasses that look like glasses overcome this barrier. They're invisible in the best sense: they don't stand out. That's precisely the advantage.
# What this means for UX and product design
If you design or build for wearables, it's time to rethink. The entry point is shifting: visual interfaces are no longer at the center; audio and passive sensors are.
This concretely changes how you think about UX:
- No screen, no tap. Interactions happen via voice or gesture. Feedback comes through sound, not pixels.
- Context becomes more important than content. The device must understand when to chime in – not just what it says. A poorly timed Siri notification is worse than none at all.
- Passive beats active. The best wearable features are the ones the user never consciously triggers: automatic call answering, ambient noise reduction, location context. It all runs in the background.
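To make the "context beats content" point concrete, here's a minimal sketch of context-gated notification timing. Everything in it is hypothetical: `shouldChimeIn`, the context labels, and the urgency ranking are illustrative names under assumed rules, not any real Apple or Meta API.

```typescript
// Hypothetical sketch: deciding when an audio-first wearable may
// interrupt the user. The rule encodes "context beats content" –
// what the user is doing matters more than what the message says.

type UserContext = "inConversation" | "commuting" | "idle";
type Urgency = "low" | "normal" | "high";

// Assumed ordering of urgency levels, lowest to highest.
const urgencyRank: Record<Urgency, number> = { low: 0, normal: 1, high: 2 };

// Returns true if an audio notification should play now,
// false if it should be deferred until the context changes.
function shouldChimeIn(context: UserContext, urgency: Urgency): boolean {
  switch (context) {
    case "inConversation":
      // Mid-conversation, only high-urgency items get through.
      return urgency === "high";
    case "commuting":
      // On the move, normal and above is acceptable.
      return urgencyRank[urgency] >= urgencyRank["normal"];
    case "idle":
      // No strong activity signal: anything may play.
      return true;
  }
}
```

The same threshold logic generalizes: each new sensor signal (motion, ambient noise, calendar state) tightens or loosens the gate, rather than adding another screen for the user to manage.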
If you're building apps or products today and ignoring wearables, you're building for a shrinking market.
# The most honest product in years
With the Vision Pro, Apple showed where excessive ambition leads: a product nobody buys because it wants too much. The new glasses show the opposite. Apple isn't asking "What's technically possible?" – but "What will someone actually wear?"
That's the question driving good product design. Not features, not specs, not roadmaps. Just this: will someone pick this thing up tomorrow morning and put it on?
If the answer is yes, the product has won. Everything else is an exhibition piece.
Apple is learning to ride a bike before it builds rockets again. And that's exactly the right move.