Meta showed off its new smart glasses with a built-in display, and while everyone is rushing to say these beat Apple to the punch or compete with visionOS, neither assertion is accurate.
Apple Vision Pro debuted during WWDC 2023 as a mixed reality device, something Apple dubbed "spatial computing." It places software in your environment by compositing it over a live passthrough view reconstructed from the headset's external cameras and shown on two tiny displays.
Eventually, Apple Vision Pro and visionOS will give way to Apple Glasses with transparent lenses — true AR. What Meta revealed on Wednesday isn't that.
And it's not even close.
Something bugged me about the Meta Ray-Ban Display announcement, and I couldn't put my finger on it. It wasn't Zuckerberg's continued awkwardness, failed demos, or the $800 price tag — it was the device itself.
The AI glasses market
Meta is describing Meta Ray-Ban Display as "AI glasses," which are notably not AR, spatial, or whatever moniker you'd like to apply. They're an extension of the Ray-Ban Meta glasses, which already feature speakers and a camera.
Apple doesn't sell a product like Meta's glasses, and it doesn't seem particularly interested in one, at least in the short term. Rumors point to a potential 2026 set of Apple "AI glasses" that would compete with Meta's, but those rumors don't have much weight behind them yet.
Remember, we've supposedly been one year away from an iPhone Fold since about 2019. Not to mention, Meta's glasses business isn't exactly gangbusters at about 2 million units from October 2023 to February 2025.
It just doesn't seem like a market Apple cares about when its closest competitor to Ray-Ban Meta is AirPods.
Adding a small monocular display to the glasses that already have speakers and cameras is a smart next step. The utility isn't lost on me, but as a happy Apple Watch and AirPods user, it feels redundant.
After combing through commentary on these new glasses and the portions of Meta's presentation I watched, it seems really important to Meta that people feel like the glasses are unobtrusive and discreet.
Allow me to burst your bubble — they're invasive and ugly in a way that'll ensure I'll ask you not to wear them around me.
If Meta had stopped at making the Meta Ray-Ban Display a simple glanceable interface with mild voice interactions, that would have been enough. However, Meta has no interest in making this a simple accessory to your smartphone.
Instead, Meta wants the glasses to be filled with "AI" apps from third-party developers, with the smartphone tether treated as an afterthought. I'm not sure if Zuckerberg has heard, but every attempt by a non-smartphone maker to introduce an AI-first device has either already failed, leaving a gravestone along the road, or is destined to.
At least this one has a display.
Meta Ray-Ban Display is what it says on the tin
I'm glad that Meta isn't calling these glasses AR, because they aren't. They aren't even unique, as glasses that can project a user interface have been a thing since at least 2022.
What does surprise me is all of the punditry calling Meta's latest "AI glasses" some kind of thrown gauntlet at Apple — a company that has shown little interest beyond patents in making such a device. No, Meta Ray-Ban Display isn't a precursor to Apple Glasses nor is it even an Apple Vision Pro competitor.
At best, they're competing with the iPhone. Meta hopes that instead of looking at a crystal-clear smartphone or smartwatch display, users will want to see a semi-transparent widget floating fixed in place two feet in front of their right eye.
Apple's Vision platform isn't about acting as a simple notification repeater, nor is it about replacing traditional computing platforms — at least not yet. At its launch and in its current iteration, it is about augmenting user interfaces.
Apple Vision Pro is a computer in its own right, capable of most of what an iPad Pro can do, minus the touchscreen tablet part. However, when used with a Mac, it becomes an incredible utility for not only displaying what's on the Mac, but augmenting the user space with other apps and widgets.
Meta's Display attempts to replicate smartphone interface elements on a tiny display operated via hand gestures picked up by a neural wristband. It's all very possible with today's technology, and Meta has done it, but it seems they've forgotten to ask why they should do it.
The few million people who already own Meta's glasses are unlikely to run out and spend $800 for a worse display and experience than the one they already get with their smartphone. On top of that, the new glasses are an eyesore and a privacy nightmare.
Instead, they'll stick with the actual utility found in the Ray-Ban Meta glasses they already have. A wearable camera and a smart assistant connected to your smartphone isn't a bad combo, even if it serves a niche audience.
But once you've introduced an entire user interface that hopes to replicate or replace your smartphone, suddenly, the previous utility is lost. The accessory that tries to become a standalone product is one that flies too close to the sun.
Where visionOS is going
As a tech person, I will say that Meta has made something interesting. But interesting and novel don't mean useful.
And if Apple released the same product today, I'd be asking the same questions. However, I do not believe Apple would release this product, nor will it in the future.
Apple Glasses are a completely different paradigm and won't be ready for some years. The rumored glasses that may arrive in 2026 aren't even part of the visionOS platform — they're just AirPods with cameras.
If we follow Apple's Vision platform to its logical conclusion, it's a set of transparent-lens glasses that can project visionOS into the real world. The result wouldn't be a crude, fixed 2D window, but a fully realized AR platform.
Imagine an all-day wearable pair of glasses that lets users view widgets, windows, and objects placed around a space. Notifications and apps are a part of this, like they are in visionOS, but physical objects like iPhones, iPads, Home Hubs, and more are still there and necessary.
Instead of replacing your iPhone, Apple Glasses would work with your entire Apple ecosystem to augment and expand the capabilities and interactions of every device in your world. And they'd do it in glasses you'd actually want to wear.
It'll be interesting to see how Meta iterates on its "AI glasses" concept in the intervening years before true Apple Glasses arrive. Though, there's always a chance Meta quietly abandons the whole thing and finds some other investor-friendly target to chase.
In the meantime, we Apple users can go ahead and enjoy augmenting reality with AirPods Pro and Apple Watch. And once Apple's true AR product arrives, we can bet it'll be private, secure, and actually useful to its users.