

Apple Glass


A future wearable from Apple may be a set of augmented reality glasses called "Apple Glass." One day, users may interact with information and digital objects through transparent glass worn over their eyes. Apple Vision Pro and visionOS are the first step on a long journey to this product.

● Far future launch
● Plastic or metal frames
● Will display information on both lenses
● iPhone-dependent for data
● No cameras, just LiDAR
● Gesture controls expected
● UI called Starboard
● Prescription lenses available


If science fiction is to be believed, the future of wearables has always been augmented reality. Apple could very well be close to achieving that future with a wearable heads-up display called "Apple Glass."

Apple has a long way to go before it can release a sophisticated pair of glasses that display information on the lenses. The glasses need to be lightweight and somewhat stylish while still working as a battery-powered computing device.

While it may be some time before Apple can achieve such a product, the Apple Vision Pro is the first step in that direction. The visionOS platform and headset are part of Apple's push into spatial computing.

Rumors have focused on Apple's mixed reality headset in recent months, so much of what follows is based on information shared between 2020 and 2022. Now that Apple has revealed the Vision Pro, it is easier to see what an eventual AR product will look like.

There have been rumors about wearable AR glasses for years, but few useful tidbits leaked before 2020, and even those may not have been about glasses at all. Patent applications have been the best source of information so far, and they show some promising features for an Apple headset.

Apple Vision Pro is the first step towards full AR glasses

The following features are a combination of rumors, patents, and leaks that represent our best look at what "Apple Glass" might be. Many of Apple's competitors have begun making products that will compete with Apple's eventual heads-up display, but none have reached the sophistication of the rumored Apple product.

Facebook, now Meta, has pushed into VR with the Meta Quest alongside other competitors from Valve and PlayStation. Apple's headset and visionOS aren't quite VR or even AR, but a mix of both that competes in its own space. However, it is expected to lead to an AR platform.

As promising as Apple's initial implementation of these technologies in Apple Vision Pro is, the expense, size, and weight indicate a long road ahead. While nothing is stopping Apple from releasing a simple set of glasses, rumors suggest the company has its sights set higher.

Getting to glasses

A lot of technological problems will need to be solved for Apple Glass to exist. Apple Vision Pro launched in 2024 as the best that could be accomplished in its price range at that time.

Glass lenses could eventually be the entire product instead of an accessory

Competitors offer simple wearable glasses with AR interfaces, but these are basically tiny projectors and not the futuristic displays expected for true AR. Google Glass was a hint of what may be possible, and Apple could build something like that today, but it would be pointless with Apple Watch on our wrists.

Transparent OLED televisions have recently entered the technology space. This technology is part of what will make true AR glasses possible. It will take many years to miniaturize the displays to a feasible wearable state.

The current solution for AR is simply passthrough: cameras capture the world and show it to the user on tiny displays inches from their eyes. That's why Apple Vision Pro is a headset rather than a pair of glasses; the approach only works in an enclosed, isolated design.

True AR won't be isolating. Information and displays will be visible as if they are part of the world. visionOS is built with this in mind.

Whenever Apple reaches this technological threshold, it will likely be a revolution for wearable tech. And there will already be a platform and App Store ready when it launches.

Meta Orion are a pair of AR glasses that hope to pass as regular specs

After five years of secret development, Meta revealed its prototype Orion glasses that work on projector technology. The UI is similar to the floating windows for Apple Vision Pro and Meta Quest, but seen in true augmented reality.

While projectors are a good way to get the job done in 2024, it seems unlikely that Apple would go that route. Plus, Orion requires a wristband and external compute unit that could easily be swapped for an Apple Watch and iPhone.

Apple Intelligence could play a crucial role in making Apple Glass a viable product. Natural voice control, environmental understanding, speech recognition, and translation would all benefit from generative AI models rather than conventional machine learning alone.

Inaccurate early rumors

Jon Prosser claimed to have seen early prototypes of the glasses and called them "sleek" in a June 2020 leak. He estimated a 2021 release, though no such product was ever announced that year.

Prosser had also claimed on May 21 of that year that there would be a "heritage edition" set of glasses designed to look like the ones worn by Steve Jobs. Bloomberg's Mark Gurman felt the need to step in and say that all rumors up to that point were false.

Gurman asserted that there were two distinct devices, as AppleInsider had reported over the years: the purported glasses and a VR headset, the latter of which turned out to be Apple Vision Pro. Prosser agreed there were two devices but disputed Gurman's lengthy release timeline of 2023 for the glasses.

Time has proven that Prosser was wrong on nearly every count in the glasses rumors. Whatever information was leaked to him was likely about early Apple Vision Pro prototypes, or was fabricated.

Design

No photos of the actual design have leaked, but Apple is rumored to want these glasses to look fashionable and approachable. Apple Watch is a good guide to how Apple handles wearable design: subtle, but still obviously a piece of tech.

Much of what has been shown in patents looks like safety glasses, though these are prototype drawings meant to illustrate the patent, not the product. Ultimately, "Apple Glass" could look like an average pair of glasses, but there is no way of knowing until something more official leaks out.

Apple's AR glasses may have modular parts

Designing a tech product that users will want to wear on their faces is no simple task. Style, color, and even lens shape will make or break most purchasing decisions, and Apple is a company known for its one-size-fits-all approach to many of its products.

Sony was rumored to be supplying half-inch Micro OLED displays for an Apple headset at 1280x960 resolution, with the order expected to be fulfilled by the first half of 2022. Those displays were more likely destined for the VR headset, however.

A patent revealed late in 2020 points to an Apple VR or AR headset automatically adjusting the lenses placed in front of the user's eyes by using fluids to deform the shape of the lens to improve the user's eyesight. The patent suggests a series of lens components around a central fluid chamber that can be inflated and emptied by a connected pump and reservoir.

Later rumors pointed to delays in manufacturing the Apple glasses. At one point, analyst Ming-Chi Kuo said "Apple Glass" may not be ready until 2025, with a set of AR contact lenses slated for a 2030 launch.

Those timelines may need to be adjusted again because Vision Pro is large and heavy, requires cooling, and only lasts two hours on an external battery pack. It seems we're many years out from stylized AR glasses.

Processing Capabilities and Battery Life

Wireless signals, smart displays, microphones, powerful processors, and LiDAR add up to a device in need of a big battery. If Apple wants a device that everyone wants to wear, it not only has to look good, it has to perform. A massive battery and hot processor just won't cut it, so Apple will have to find a balance.

One aspect Apple can cut back on is processing power. As with the first-generation Apple Watch, the smart glasses could rely upon the iPhone for all processing needs and only display that information.

Pairing Apple's smart glasses should be as simple as AirPods

By relaying information from the phone to the glasses, Apple will drastically cut down on local processing and need only worry about powering the display and sensors. This could accelerate a release of such a product, but 2025 is still too soon, and it would end up with the same problems as the original Apple Watch.
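Purely as an illustration of that split, here is a minimal Swift sketch in which the glasses send raw sensor readings to the phone and receive a draw-ready overlay back. Every type and function name here is hypothetical; Apple has not described any such protocol.

```swift
import Foundation

// Hypothetical relay between "Apple Glass" and an iPhone.
// The glasses only gather sensor data and draw what they are told;
// all heavy processing stays on the phone.

struct SensorSnapshot: Codable {
    let headingDegrees: Double    // compass direction of the wearer's gaze
    let headPitchRadians: Double  // head tilt
    let lidarDistances: [Float]   // coarse depth samples from LiDAR
}

struct OverlayAnchor: Codable {
    let x: Float                  // normalized lens coordinates (0...1)
    let y: Float
}

struct OverlayFrame: Codable {
    let labels: [String]          // text the glasses should render
    let anchors: [OverlayAnchor]  // where to render it
}

// Glasses side: serialize the latest readings for transmission.
func encodeSnapshot(_ snapshot: SensorSnapshot) throws -> Data {
    try JSONEncoder().encode(snapshot)
}

// iPhone side: do the heavy lifting and return a frame the glasses
// can display with almost no local computation.
func composeOverlay(from data: Data) throws -> OverlayFrame {
    let snapshot = try JSONDecoder().decode(SensorSnapshot.self, from: data)
    let label = String(format: "Heading %.0f°", snapshot.headingDegrees)
    return OverlayFrame(labels: [label], anchors: [OverlayAnchor(x: 0.5, y: 0.1)])
}
```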

One patent filed by Apple shows a series of base stations and IR tracking devices that could be used to process data and transfer information to "Apple Glass" or a VR headset. Offloading tracking to a dedicated base station would allow better accuracy and lower battery usage on the wearable. Think of a museum being able to follow a user around and show relevant data in the glasses, with the base stations doing all the work.

Jony Ive once stated that a product could be in development for years, waiting for the technology to catch up with the idea. Apple is likely to take the same approach here, developing the AR glasses into different iterations internally while the technology is allowed to mature.

Apple's AirPods are a good example of a super compact device with good battery life. Even as small as the AirPods Pro, they last for several hours with ANC enabled.

The glasses will need to serve the customer as a piece of tech and fashion, but for many they will also need to serve a third function: working as actual corrective glasses. Apple is expected to offer prescription lenses as an option, but a patent describes another possibility: the lenses themselves could adjust to the wearer.

Privacy and "Apple Glass"

Apple could avoid socially awkward cameras and use LiDAR to display content overlays for users. AR glasses would need to understand the environment to overlay information, but unlike Apple Vision Pro, wouldn't need full video capture.

Apple likely to rely upon LiDAR and other sensors instead of cameras

Apple tends to sell products with a lot of overlap, but the iPhone has always been the Apple camera, which isn't likely to change. Unlike Google Glass, which seemed to want to replace the smartphone entirely, Apple's product will augment the iPhone experience.

Another expectation is that only the wearer will be able to view the content on the glasses so that a random passerby cannot peek into your business.

Apple has also looked into using "Apple Glass" for authentication. Rather than utilizing the built-in biometrics on your iPhone, the headset could detect if the wearer is looking at the device and unlock it immediately. The feature would only work after authenticating the wearer when putting on the glasses for the first time, much like Apple Watch.

The Vision Pro headset uses Optic ID for authentication, which would likely carry over to Apple's glasses. The headset needs many cameras for passthrough, but passthrough shouldn't be required with transparent glass lenses.
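visionOS exposes Optic ID through the same LocalAuthentication flow used for Face ID and Touch ID, and a hypothetical glasses product could plausibly reuse that pattern. The sketch below is the standard LAContext biometric check, not anything specific to "Apple Glass."

```swift
import LocalAuthentication

// Standard LocalAuthentication biometric check. On visionOS the biometry
// type reports as Optic ID; a future glasses product could plausibly
// plug into the same API.
func authenticateWearer(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false) // biometrics unavailable or not enrolled
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock while wearing the device") { success, _ in
        completion(success)
    }
}
```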

Starboard User Interface

The iPhone has SpringBoard, the set of icons that acts as your home screen. Apple's glasses are said to have "Starboard." No interface elements have leaked or even been described, but it is assumed that Apple will adapt its iconography and UI for an AR interface.

The visionOS interface

Now that visionOS has been revealed, it seems this may be the operating system and interface destined for Apple Glass. The transparent objects with 2.5D effects are quite striking.

Code related to testing such a UI was found in the iOS 13 Release Candidate: STARTester code and references to a device that could be "worn or held" appeared in a readme file. Not much came of this, but it does corroborate the rumored "Starboard" UI name.

The LiDAR sensor will allow for gesture control without the need for a controller or marker. However, some patents have suggested that Apple might be making a controller for more interactive experiences, like games.

Google Glass (left) has a tech-focused style

As the first generation, expect most experiences to be passive. Look to Google Glass for this one; expect to see incoming iMessages, directions overlaid in real life, and highlighted points of interest.

While there won't be a camera to guide these experiences, LiDAR plus geolocation, compass direction, head tilt, eye tracking, and other sensors would go a long way in ensuring accuracy when displaying AR objects.
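As a rough idea of what that camera-free sensor fusion could involve, here is a short sketch using today's Core Location and Core Motion APIs on iPhone; any dedicated glasses hardware would obviously differ.

```swift
import CoreLocation
import CoreMotion

// Combining compass heading (Core Location) with device attitude
// (Core Motion): the kind of camera-free data that could help a pair
// of glasses keep AR overlays aligned with the real world.
final class HeadPoseEstimator: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()
    private(set) var headingDegrees: CLLocationDirection = 0

    func start() {
        locationManager.delegate = self
        locationManager.startUpdatingHeading() // compass direction

        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                               to: .main) { motion, _ in
            guard let attitude = motion?.attitude else { return }
            // Pitch approximates head tilt; yaw is rotation about gravity.
            print("pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateHeading newHeading: CLHeading) {
        headingDegrees = newHeading.magneticHeading
    }
}
```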

The iPhone and Apple Watch could also act as anchors for AR interactions. Users looking through their AR glasses would be the only ones able to see what is being displayed, which would give them greater privacy.

AR features in iOS show potential for "Apple Glass"

Apple has been pushing augmented reality for years and new features are added to ARKit with each version of iOS. These building blocks all add up to what could be an operating system and software features built for "Apple Glass."

ARKit

Location anchors let developers and users attach augmented reality objects to specific places in the real world. Wearing an AR headset or a set of AR glasses while walking around town would let you see these objects without further interaction. On-device machine learning can precisely anchor objects using known map data and building shapes.
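ARKit already supports this on iPhone and iPad through geo anchors. A minimal sketch, with the coordinates chosen arbitrarily for illustration:

```swift
import ARKit
import CoreLocation

// Minimal ARKit location-anchor setup (iOS 14+, supported regions only).
func startGeoTracking(on session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else { return }
    session.run(ARGeoTrackingConfiguration())

    // Example coordinates only; a real app would use meaningful locations.
    let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
    let anchor = ARGeoAnchor(coordinate: coordinate, altitude: 10)
    session.add(anchor: anchor)
}
```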

LiDAR makes positioning objects without physical markers fast and easy

The Depth API was created to take advantage of the LiDAR system introduced on the fourth-generation iPad Pro, hardware that could also be included on "Apple Glass." Using the captured depth data and specific anchor points, a user can create a 3D map of a location in seconds. This allows more immersive AR experiences and gives the application a better understanding of the environment for object placement and occlusion.
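On current devices this is exposed as ARKit's scene depth frame semantic. A minimal sketch of enabling it and reading the per-frame depth buffer:

```swift
import ARKit
import CoreVideo

// Enabling ARKit scene depth on LiDAR-equipped devices.
func runWithSceneDepth(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    session.run(configuration)
}

// Each ARFrame then carries a depth map used for occlusion and meshing.
final class DepthReceiver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        print("Depth buffer: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}
```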

Face and hand tracking have also been added in ARKit, which will allow for more advanced AR games and filters that utilize body tracking. This could be used to translate sign language live or attach AR objects to a person for a game like laser tag.
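Body tracking is handled by ARKit directly, while hand pose detection typically goes through the Vision framework. A short sketch of the ARKit body-tracking side:

```swift
import ARKit

// ARKit body tracking: an ARBodyAnchor with a full skeleton is reported
// whenever a person is detected (A12 Bionic or later).
final class BodyTracker: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // The skeleton exposes joint transforms, such as the right hand,
            // which is what a laser-tag style game would attach objects to.
            let rightHand = bodyAnchor.skeleton.modelTransform(for: .rightHand)
            print("Right hand transform: \(String(describing: rightHand))")
        }
    }
}
```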

App Clips

Apple announced the rumored App Clips feature, which aims to ease the friction of using commerce apps out in the world.

App Clips use images like QR codes to reveal content, which could be useful for a wearable AR device

With iOS 14, users can tap an NFC sticker, click a link, or scan a special QR code to access a "Clip." These App Clips are lightweight portions of an app, required to be less than 10MB, and show up as a floating card on your device. From there, you can use Sign in with Apple and Apple Pay to complete a transaction in moments, all without downloading the full app.

The specialized QR codes are the most notable element here. Apple could scatter them around the world for use with App Clips on the iPhone now, only to have them work with "Apple Glass" in the future. The premise would be the same in AR: walk up to a code and see an AR object you can interact with and purchase, all without downloading an entire app.
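For reference, this is how an App Clip receives its invocation URL today: the scanned code, NFC tag, or link resolves to a URL that arrives via NSUserActivity. The "product" query parameter below is an invented example, not part of Apple's API.

```swift
import SwiftUI

@main
struct StoreAppClip: App {
    @State private var productID: String?

    var body: some Scene {
        WindowGroup {
            Text(productID.map { "Product: \($0)" } ?? "Scanning...")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // The invocation URL comes from the scanned App Clip Code,
                    // NFC tag, or link that launched the clip.
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url, resolvingAgainstBaseURL: true)
                    else { return }
                    // "product" is a hypothetical query parameter for this sketch.
                    productID = components.queryItems?
                        .first(where: { $0.name == "product" })?.value
                }
        }
    }
}
```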

HomeKit Secure Video Face Recognition

HomeKit Secure Video, which was added to iOS in 2019, offers some smart features for users, such as recognizing objects that appear in videos for easier searching through footage. In iOS 14, the feature gained a face classification function, allowing it to identify individual people when they approach the camera.

HomeKit Secure Video face recognition using saved faces in Photos could be a precursor to AR glasses technology

Facial recognition data in the Photos app is used to make this function work, and Apple is applying it to people being recorded in real-time. While "Apple Glass" is not expected to have a camera, it will have LiDAR and other sensors that could create enough data to determine who a person is.

One of the earliest features users have wanted from an AR headset is instant face recognition to remind them of names or important information. This raises wide-ranging privacy issues on its own, but if the data set is limited to what a user already has on their iPhone, it is much safer and more useful.

Apple Glass and iPhone

Companies have been in search of the next big thing since the iPhone launched in 2007. Apple has been one of the companies trying the hardest to surpass its own success with products like Apple Watch and Apple Vision Pro.

iPhone 15 Pro Max

Ultimately, there's no way to know if Apple Glass could actually surpass the iPhone in ubiquity and functionality, but that seems to be a possible end goal. Either way, the iPhone will play a significant role in making any successor viable.

At the very start of visionOS and Apple Vision Pro's life, Apple promoted the iPhone as a tool to record Spatial Video, which can only be viewed in full on the headset. The relationship between the products will likely deepen with future updates.

As previously described, Apple Glass may initially rely on the iPhone for off-device processing. And with the advent of Apple Intelligence, there's a good chance that initial versions of the product could pass complex operations to iPhone.

"Apple Glass" Price and Release

Since Apple revealed its Spatial Computing platform, Ming-Chi Kuo has shifted his release timelines. The focus is on the headset, with the glasses pushed out by years.

The Vision Pro will pave the way for developers, while a second-generation headset is expected in 2026 at the earliest. This second model would be lighter and have more powerful processors.

Ultimately this would lead to a lightweight set of "Apple Glass" that could be worn for AR or MR experiences. The release window could be 2030 or later, however, given the constraints of Apple Vision Pro. As for price, it could be astronomical given Vision Pro's $3,499 starting price, though technology maturation and supply chain efficiency could reduce that in a few years.

