Rumors of an Apple-branded virtual reality (VR) or augmented reality (AR) solution have floated for years, with evidence of Apple's interest in the market continuing to surface and evolve over time. Numerous patents have been published, either created by Apple or absorbed from firms the company acquires, with ideas ranging from AR devices with transparent displays to VR headsets, with the additional use of an iPhone to power the technology in many cases. On the software side, Apple has launched ARKit, to help developers make AR applications for iOS devices.
● ARKit developer tools released
● Growing staff in VR/AR area
● Continues filing VR/AR patents
● CEO touts AR as "core" to Apple's future
● Regularly sends employees to Stanford's VR lab
● Continues VR/AR acquisitions: Emotient, Faceshift, PrimeSense
● Partners with EPGL for contact lens iOS apps
● AR mapping
● iPhone powered virtual displays
● VR headset
Apple is believed to be working on advanced virtual reality/augmented reality systems that could be incorporated into future iOS devices and/or hardware products. There aren’t any concrete details about the products or when they might launch, but the company’s focus in the area has increased over the past several months.
More recently, Apple has confirmed it is interested in augmented reality, by releasing ARKit to developers during WWDC 2017. Detailed below, ARKit is a platform that allows developers to more easily incorporate augmented reality into their apps.
Tim Cook said in a 2016 interview that Apple is “doing a lot of things” in the AR space and has called it a “core technology." He even hints at favoring it over VR. "There's virtual reality and there's augmented reality — both of these are incredibly interesting," Cook said in the interview. "But my own view is that augmented reality is the larger of the two, probably by far." In August, Cook said that "I think AR is extremely interesting and sort of a core technology," adding that "it's something we're doing a lot of things on behind that curtain that we talked about." Then in October, he again touted the benefits of AR over VR: "There's no substitute for human contact," Cook said. "And so you want the technology to encourage that."
In a February 2017 interview, Cook elaborated on his thoughts on AR, calling it a "big idea" concept, like a smartphone. Cook also suggested AR holds more promise than VR, as the latter "closes the world out" while AR keeps it all visible.
"The smartphone is for everyone, we don't have to think the iPhone is about a certain demographic, or country, or vertical market; it's for everyone. I think AR is that big, it's huge," said Cook. "I get excited because of the things that could be done that could improve a lot of lives. And be entertaining."
"I view AR like I view the silicon here in my iPhone, it's not a product per se, it's a core technology. But there are things to discover before that technology is good enough for the mainstream."
A report from the Financial Times in late March claims Apple is "stepping up its efforts" in augmented reality, allocating more resources to the project. It is said that AR has overtaken "Project Titan" — Apple's self-driving car initiative — as Apple's next big priority, behind the iPhone.
In an October 2017 interview with Cook by The Independent, the CEO said he continues to view AR as an important, but still gestating, technology that could influence many people and areas of life. Comparing AR to multitouch and other Apple technologies, Cook suggested AR could have an "exponential" growth similar to the creation of mobile apps.
When asked about rumored AR glasses or goggles, Cook stuck to the company line of not talking about anything currently being worked on, but did say: "I can tell you the technology itself doesn't exist to do that in a quality way. The display technology required, as well as putting enough stuff around your face — there's huge challenges with that."
"We don't give a rat's about being first, we want to be the best, and give people a great experience," said Cook. "But now anything you would see on the market any time soon would not be something any of us would be satisfied with. Nor do I think the vast majority of people would be satisfied."
What is virtual reality/augmented reality?
Simply put, virtual reality is a way for users to see a computer-generated world, freely able to look around by moving their head, usually with stereoscopic vision and a way to interact with the digital environment. Typically this technology uses a head-mounted display with a motion tracking mechanism to monitor the head's movement, and a similarly-tracked controller.
Virtual reality, as a concept, has been around for years, but has yet to see much commercial success. Companies tried, and failed, to make VR a socially-acceptable experience in the '90s, but the devices of that era were bulky, heavy, and difficult to use. For example, the Nintendo Virtual Boy, the Japanese game company's attempt at creating such a device for public consumption, was discontinued just one year after its release.
In the last few years, VR has undergone a renaissance, thanks in part to the miniaturization of technology and the processing power of modern computing systems. Hardware such as the Oculus Rift and the HTC Vive make VR relatively accessible, though the cost is still prohibitive for most people unwilling to spend around $1,000 on an early-generation technology.
Improvements in mobile technology have also led to a cheaper way to try out VR, using a smartphone. Google created Cardboard, a VR head-mounted display that uses cardboard and lenses with a smartphone's display to create a VR experience, while Samsung has pushed forward with its own Gear VR system that follows a similar path.
Augmented Reality is a markedly different concept to VR, in that instead of a completely virtual environment, the user instead sees the real world, albeit with computer-generated imagery partly obscuring their vision. This can be used to overlay data on a person's viewpoint, providing extra information about things they are looking at.
For example, this could include labels and supporting materials, cut-away side views of an object, or even translated signs overlaying those of a different language. The key here is that AR converges the real and virtual worlds into one singular view.
There are two main ways AR has developed, again in varying levels of cost and practicality. Some companies, such as Microsoft, have come up with headsets that users can see through, rather than looking at a tiny screen. While still in development, these are relatively inaccessible for most people due to the need for expensive hardware, and typically tethering the user to a specific room while it is being used.
Much like VR, there is a smartphone-based alternative. Using a combination of the rear camera of the smartphone and the screen, users can see “through” their mobile device to look at generated images. This concept has been available for a number of years, and is already employed in a number of different ways.
In Pokemon Go, players can see creatures they want to capture on their display, “existing” on a spot on the real-world landscape, with other games and apps performing similar tricks. Some apps can provide visitors to a museum more information about exhibits, or travelers can see road signs translated into their native language.
For this AR technique, all a user requires is a relevant app that offers AR functionality and a smartphone, something that the majority of likely users will already own, giving it an extremely low barrier for entry compared to the hardware-encumbered version of AR, and VR as a whole.
Revealed during WWDC, ARKit is Apple's developer toolset to help facilitate the creation of AR apps, which was then released as part of iOS 11.
The framework solves a number of problems surrounding AR in a number of different ways, and works with existing iOS devices, including iPhones and iPads, without any extra hardware.
ARKit uses Visual Inertial Odometry (VIO) to accurately track the environment and the device's position, combining camera sensor data with CoreMotion data. These two inputs alone allow an iPhone to report its motion accurately without additional calibration.
It is also possible for ARKit to analyze a scene in a camera's view to determine horizontal planes, such as floors and tables, which can be used to place virtual objects. Once detected, ARKit can also retain the location of the detected planes, even if they fall out of the camera's field of view temporarily.
The lighting situation of an environment is also monitored by ARKit, with the data able to be used to accurately light virtual objects, making the illusion of them appearing in the user's view more believable.
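The plane detection and light estimation described above are exposed to developers through ARKit's session configuration. Below is a minimal sketch of such a setup; the view controller structure is illustrative rather than Apple's own sample code:

```swift
import UIKit
import ARKit

class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // World tracking drives VIO (camera plus CoreMotion) under the hood.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal    // detect floors and tables
        configuration.isLightEstimationEnabled = true // monitor scene lighting
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new horizontal plane in the camera's view.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Plane detected with extent \(planeAnchor.extent)")
    }
}
```

Because ARKit retains detected plane anchors, the delegate callback fires once per plane rather than once per frame, even if the plane temporarily leaves the camera's field of view.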
Developers were quick to take to ARKit, creating apps before its public availability to test out the framework. Notable examples include an app for measuring the dimensions of a room using the camera, and a remake of '80s band A-Ha's "Take on Me" music video.
More information can be found on AppleInsider's ARKit page.
External enclosure support helping VR development
During WWDC 2017, Apple revealed to developers that macOS High Sierra would include support for external Thunderbolt 3 enclosures, allowing users to use a secondary graphics card to boost the graphical capabilities of the attached Mac.
Alongside a live off-stage demonstration, Apple confirmed it had reached out to developers of popular VR platforms and engines, including Steam VR, Unreal, and Unity, to support VR content creation for Mac desktops, as well as support for Apple's next-generation graphics API, Metal 2.
Rather than providing a developer toolset for VR in a similar vein to ARKit, Apple has instead opted to provide hardware assistance to developers, in the form of its own external GPU developer's kit. Consisting of a Sonnet eGFX Breakaway Box 350, a Radeon RX 580 GPU, and a USB-C to USB-A hub, the kit costs developers $599 and requires the use of the beta release of High Sierra.
Notably, full support for external GPUs in High Sierra will arrive in the Spring of 2018. Aside from Apple's own implementation intended for developers, there are a number of other Thunderbolt 3 enclosures on the market, with some capable of working with supported Mac systems, albeit with effort required to get them operational.
It is also possible for users to add an Nvidia graphics card to their desktop instead of a Radeon card, with Nvidia releasing new drivers for Pascal-class cards in April 2017, ahead of Apple's announcement.
Of course, in order to test and consume VR content, developers and users will still have to acquire their own VR headset as a separate item. Since launch, hardware such as the Oculus Rift and HTC Vive have been expensive to purchase, though efforts such as Oculus' discounts throughout 2017 have helped to bring this cost down to a more manageable level, as low as $400.
Analysts say AR perfect partnership for Apple
KGI analyst Ming-Chi Kuo told investors in October 2016 he believed Apple’s aptitude for delivering innovative user experiences through human-machine interfaces will help the company move naturally into the AR space.
Just as iPod helped pave the way for the iPhone, the iPhone may be able to provide the necessary building blocks for a full-blown AR solution. Kuo didn’t provide details on what this might look like. One example might be Apple testing the waters with a system like the iOS game Pokemon Go, which uses the iPhone’s camera and display to provide users with a seamless AR experience.
In general, Kuo sees Apple integrating AR to redefine key product lines, perhaps leapfrogging competitors by three to five years. For example, augmented user interfaces could drastically change the way users interact with Apple Watch and Apple TV, eliminating obstacles like small screens and clunky controls.
At the same time, Apple might leverage AR tech to break into other fields, Kuo said. One such area of interest is automotive technology, or more specifically autonomous driving systems. Apple had been widely rumored to be working on a self-driving car, dubbed "Project Titan," since March 2015, but recent reports claim the company has abandoned those plans. Instead of a full-fledged car, Apple is scaling back its ambitious project to focus on underlying technology.
The notion that AR is one of the next big technologies to be embraced in Apple's products is also held by Steven Milunovich, an analyst for UBS. A note to investors in late February 2017 hyped the potential for AR, citing an interview with a developer suggesting it could make the current smartphone experience seem like "the dark ages."
"Thanks to advanced cameras, consumers will hold their phones up with images superimposed onto the screen in cars, rooms, or walking down the street," wrote Milunovich. "3D mapping through Simultaneous Localization and Mapping (SLAM) will be key."
Citing sources claiming Apple has over 1,000 engineers in Israel working on AR-related technology, the analyst expects Apple to offer advanced AR applications in new ways, with supporting technology arriving in future iPhone iterations.
Milunovich believes Apple will slowly roll out the technology over the coming years, though it could bypass the competition with a "superior user experience," possibly drawing more Android users over to iOS in the process. In the near future, the rumored upcoming "iPhone 8" is said to include "moderate 3D mapping using stereoscopic vision," while developers could be provided an AR software development kit by Apple as soon as this year.
AR glasses to debut in 2017, 2018
Rumors of Apple's intent to enter the augmented reality hardware space gained traction in January, as a report from AR/VR evangelist Robert Scoble claimed the company is partnering with optics manufacturer Carl Zeiss on a pair of lightweight augmented reality/mixed reality glasses.
Further, Scoble says the partnership explains why the Zeiss booth at CES 2017, located in the middle of the AR section, had no AR, VR or mixed reality optics to demonstrate. The theory is that Apple muzzled the company until the supposed tie-up is announced — or falls through.
Sources of the Financial Times suggest a release could be later than Scoble's claim, with a retail launch for an Apple-produced AR product more likely to take place in 2018 than this year, with an announcement not expected anytime soon.
Aside from ARKit for iOS devices, Apple has yet to publicly announce any AR-specific hardware intentions.
As of August, Apple is apparently still working on creating AR hardware, according to sources of the Financial Times. It is claimed multiple competing projects in the field are being developed internally, with concepts ranging from devices with integrated displays to setups similar to the Samsung Gear VR, which uses a smartphone as a VR display.
Refresh rates and the iPhone X
In March, developer Steven Troughton-Smith discovered code in a pre-release beta for iOS 10.3 that would allow an app to specify the device's screen refresh rate, potentially signifying a display with a higher refresh rate may be on the way in a future launch. While this can help improve the experience of using the Apple Pencil in a refreshed iPad Pro, it also has applications in VR.
During WWDC 2017, it was revealed the code was used to power the ProMotion feature of the 2017 iPad Pro displays. The function allows the iPad Pro to change its refresh rate up to 120Hz, double the refresh rate of the previous generation, allowing for more fluid drawing with the Apple Pencil and reducing input latency to 20ms.
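On the software side, apps request higher frame rates through the long-standing CADisplayLink API, which the discovered code built upon. A brief sketch of opting into a 120Hz callback; the RenderLoop wrapper is illustrative:

```swift
import UIKit

class RenderLoop {
    private var displayLink: CADisplayLink?

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step))
        // Request the panel's maximum rate. A ProMotion display can deliver
        // up to 120Hz; other iOS devices will cap the callback at 60Hz.
        link.preferredFramesPerSecond = 120
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func step(link: CADisplayLink) {
        // Per-frame work goes here. targetTimestamp is the expected
        // presentation time of the next frame, useful for pacing VR rendering.
        let frameBudget = link.targetTimestamp - link.timestamp
        _ = frameBudget
    }
}
```

The system treats the requested rate as a ceiling, dropping to lower rates when the content is static, which is how ProMotion also saves battery.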
While the feature was presented at the event mainly for drawing and battery-saving applications, the same technology could also be used to help Apple's VR and AR projects.
Virtual reality relies on high refresh rate displays to make motion as fluid as possible. Lower refresh rates mean fewer updates to the image the user sees, making movements seem choppier and destroying the "illusion" for the user.
Current iPhones use an LCD panel with a refresh rate of 60Hz, which is acceptable for the majority of smartphone uses. The iPhone X, introduced this fall, has an OLED display, a technology that has a far lower response time than LCD, and has the potential to be run at far higher refresh rates.
This switch to OLED theoretically makes the iPhone X an extremely good candidate for use with VR or AR, when used in a Cardboard or Gear VR-style system.
One other item that the iPhone X has that may help with VR and AR is the front-mounted 3D scanning system used for facial recognition. The technology is able to map the user's face and identify landmarks, which can then be used by an app for various functions. For example, it could be used in a Snapchat-style filter to more accurately place masks, makeup, and other elements on a user's face in real time.
Apple virtual reality/augmented reality hiring
Two Apple hires in late 2016 suggested the company was getting serious about building out its own virtual and augmented reality technologies, though it had some catching up to do as Google, Facebook and Microsoft forge ahead with mature projects.
Zeyu Li, a former Magic Leap employee, joined Apple as a Senior Computer Vision Algorithm Engineer, reported Business Insider. According to Li's LinkedIn profile, he worked first as Lead 3D Engineer, then as Principal Engineer.
Apple's second hire came from Facebook's Oculus. Yury Petrov, who worked as a research scientist at the VR firm since 2013, took an identical position at Apple in June. Petrov's LinkedIn profile said his job at Oculus entailed “psychophysical and physiological studies of visual and multisensory experience of virtual reality (VR) including user experience factors in head-mounted displays (HMD).”
A March 2017 report from sources of Bloomberg suggests Apple has filled out the rest of the team with high-profile individuals from a number of other major companies.
The person said to be heading up the team is Mike Rockwell, a 2015 hire who previously led the hardware and new technologies groups at Dolby. Rockwell is believed to be reporting to Dan Riccio, senior VP of Hardware Engineering.
Fletcher Rothkopf, one of the designers of the original Apple Watch, was allegedly assigned to work for Rockwell in spring 2016, alongside THX audio standard creator Tomlinson Holman.
Former lead engineer of Amazon's Lumberyard game engine Cody White is also said to be on the team, as well as Duncan McRoberts. McRoberts was previously a director of software development at Meta, a firm that produces high-end AR glasses.
Other members of the team are said to include iPhone, camera, and optical lens engineers, along with 3D animation veterans who previously worked on special effects for movies. Apple has reportedly managed to hire some employees away from Weta Digital, known for work on the "Lord of the Rings" films, with the new hires thought to be working from a new office in Wellington, New Zealand.
In April 2017, it was reported Apple had hired on Tim Dashwood, the creator of a number of plugins for video editing software that helped content creators produce media for VR headsets. It is likely that Apple brought Dashwood onboard to work on its AR and VR teams, possibly to help produce software in the field.
Dashwood is known for creating the 360VR Toolbox package, a $1,000 kit for making 360-degree videos specifically for Oculus Rift VR headsets, as well as other Final Cut Pro X plugins. Since his hiring, the plugins have been made available for free.
A report in April also claims Apple has tapped NASA for one member of staff for its AR team. Dr. Jeff Norris, an AR and VR specialist who founded the Mission Operations Innovation Office at NASA's JPL, is said to have been hired by Apple earlier in 2017, and is believed to have taken the role of senior manager on the AR team.
Norris has worked on a number of AR/VR and robotics solutions while at the JPL, with some featuring Microsoft's HoloLens goggles. Project Sidekick allowed astronauts on the International Space Station to communicate with home base, while OnSight allowed the virtual exploration of Mars, with the project featured in NASA's "Destination: Mars" museum exhibition.
The engineer also founded JPL Ops Lab, a part of the space agency that worked on developing human-system interfaces. The lab helped create ways to control robots with tablet interfaces, used motion tracking to manipulate robot arms, and the ability to interact with holograms.
Apple has made a number of key acquisitions in the virtual reality/augmented reality field that would further hint of developments in this realm.
Most recently, Apple purchased both Emotient — a company that builds tools for facial expression analysis — and Flyby Media in 2016. Emotient's tools are used to capture direct emotional responses from customers, and have been used in marketing and advertising. The technology has also been tested in medical settings to measure pain levels. Flyby focuses on augmented reality projects.
Apple was mum about the Faceshift purchase for some time, eventually admitting to being behind the mysterious acquisition but declining to offer details. The buyout helped Apple continue to build out its portfolio of facial recognition technologies. It's been similarly quiet about the Metaio acquisition.
The PrimeSense acquisition sparked rumors that motion-based capabilities may be in store for Apple TV, as well as an iPad app for 3D printing. PrimeSense's 3D depth technology and motion sensing capabilities were used in Microsoft's Kinect platform.
In June 2017, reports surfaced claiming Apple had acquired German eye tracking hardware producer SensoMotoric Instruments. Technology from the company is used in an array of applications, ranging from augmented reality to medical, such as early autism detection in children, brain mapping, and neurology.
SensoMotoric's main eye tracking technology can monitor the wearer's gaze in real time, at up to 120 times per second. This has the potential to reduce input lag, an issue that causes motion sickness in AR and VR users through a mismatch between the user's shift in perspective and their perception of movement.
In addition to ongoing in-house research and development, Apple holds a variety of patents covering a gamut of augmented reality applications, including transparent displays, mapping solutions and iPhone-powered virtual displays.
In November 2016, Apple obtained a patent detailing an augmented reality mapping system that harnesses iPhone hardware to overlay visual enhancements onto live video, lending credence to recent rumors suggesting the company plans to implement an iOS-based AR strategy in the near future.
Apple's U.S. Patent No. 9,488,488 for "Augmented reality maps" describes a mapping app capable of tapping into iPhone's advanced sensor suite to present users with real-time augmented views of their surrounding environment.
Apple was also granted a patent detailing a method of device localization — mapping — using computer vision and inertial measurement sensors, one of the first inventions to be reassigned from the acquisition of AR reality startup Flyby Media.
Two patents that surfaced in January 2017 stem from Apple's acquisition of German AR firm Metaio. One relates to the hardware framework for an AR device with enhanced computer vision capabilities, with power-efficient object recognition being a main focus for the patent.
The second, a method for "representing virtual information in a real environment," details a way to label points of interest in an AR environment, taking occlusion perception into account. Using a combination of depth sensing and positioning data, Apple's system would be able to show only labels for points of interest that can be seen by the user, hiding those for places out of view behind a wall or a building, for example.
In April 2017, the U.S. Patent and Trademark Office published a patent application from Apple revealing one potential use for AR. Describing a "Method and device for illustrating a virtual object in a real environment," the application effectively covers how to faithfully represent a digital object in an image or video feed.
The technique uses the camera to capture a two-dimensional image of the environment, then works out the camera's position relative to at least one in-image component. Working from that, the system collects three-dimensional image and spatial information, such as the position of wall and floor planes, using a combination of depth mapping, radar, stereo cameras, and other techniques.
With this data, the system is then able to superimpose virtual objects into the scene within a defined area, to show how they would fit in with other objects. The same system can also be used to "remove" real-world objects from the image, potentially giving more space for the virtual image to sit.
This sort of object removal and insertion lends itself handily to furniture sales, allowing homeowners to see what a new sofa or table would look like in a room, while "taking out" the old items. There is also the possibility of using it with semi-transparent displays, with other applications including within head-mounted displays and for AR-based navigation systems within cars.
Unearthed in late July, the patent application for a "Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor" shows Apple has considered ways to offer valuable data to users via augmented reality.
The patent application mentions the potential use of AR for showing points of interest (POI) information, overlaid on a real-world view from the mobile device's camera. Apple's first-party apps already include POI data for some of their features, such as Maps searches for nearby stores and restaurants, so this would most likely extend the use of that information in a new way.
According to the document, the device would first capture images or a video feed of the environment with the rear camera, then use geolocation data from the onboard GPS as a starting point for finding POIs. Using cached data saved on the device or by querying servers, the app can then display indicators in the real-world view for POIs that are relevant or can be seen from the camera's viewpoint.
This data would be put in the correct place by image-processing algorithms or by other sensors, including depth-sensing cameras, with the overlaid information anchored to their designated "real-world" placement regardless of the user's movement of the camera.
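At its core, deciding which POI indicators to show comes down to comparing each POI's bearing against the camera's heading. A simplified sketch of that visibility test follows; the function name and the 60-degree field of view are assumptions for illustration, not details from the patent:

```swift
import Foundation

// Hypothetical helper: decide whether a point of interest (POI) falls inside
// the camera's horizontal field of view. Coordinates are in degrees;
// heading is degrees clockwise from true north.
func isPOIVisible(userLat: Double, userLon: Double,
                  heading: Double,
                  poiLat: Double, poiLon: Double,
                  fieldOfView: Double = 60) -> Bool {
    // Initial bearing from the user to the POI (great-circle formula).
    let dLon = (poiLon - userLon) * .pi / 180
    let lat1 = userLat * .pi / 180
    let lat2 = poiLat * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    var bearing = atan2(y, x) * 180 / .pi
    if bearing < 0 { bearing += 360 }

    // Visible if the bearing sits within half the field of view of the heading.
    var delta = abs(bearing - heading)
    if delta > 180 { delta = 360 - delta }
    return delta <= fieldOfView / 2
}
```

A production implementation would refine this with the image-processing and depth-sensing anchoring the patent describes, but the bearing test captures the basic culling step.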
The application also mentions the possibility of providing different data depending on the orientation of the device, such as providing more navigation functionality when held vertically, or photographic or video recording-based features when held horizontally.
There are also other descriptions of the system's use, such as in a head-mounted display with a semi-transparent screen, with users interacting with the data by pointing at the relevant spot in front of them.
EPGL & Apple partnership for AR iOS apps
Medical supply company EPGL is working with Apple, utilizing its intellectual property to develop iOS apps that project an image oriented around the perimeter of a contact lens for use in AR applications.
The app requires low power, can be adjusted quickly, and can be incorporated into the elastic material of a contact lens. The lenses may utilize a prism to redirect the image onto the retina, potentially aiding those with vision cuts, where part of the user's vision is absent or restricted due to stroke or another malady.
This AR tech avoids the stigma of bulkier apparatuses like Google Glass, which was banned in some places because of the possibility of covert surveillance by a wearer.
A report from an Apple Environment Health and Safety contractor, leaked to Apple staff in April, may be proof that the company is working on AR glasses or a VR headset. The report is said to have detailed a number of eye-related injuries sustained while testing a new prototype.
An incident on February 21 involved "medical treatment beyond first aid" for a person at Apple's De Anza office in Cupertino, Calif., after the user told the study lead she experienced discomfort in her eye, and "was able to see a laser flash at several points during the study." The study lead then secured the prototype for analysis, while the user was referred to an optometrist.
A second issue on March 2 at the company's Vallco Parkway office in Cupertino also involved an employee's complaints about eye pain after working on a prototype, with the pain possibly "associated with use" of the hardware, the report states. The employee "noticed that the security seal on the magenta (outer) case had been broken and had thought the unit may have been tampered with."
While the injuries could have been suffered by a number of different technologies that interact with a user's eyes, such as iris scanning or 3D facial recognition, one source of Gizmodo within Apple suggested it could be linked to Apple's AR work.
Why an Apple virtual headset is unlikely to arrive soon
Although patents may hint otherwise, an Apple virtual headset is unlikely to arrive in the near future. Instead, Apple will more likely create a platform where developers can tap into its hardware and software to create VR experiences. This could mean anything from simple apps to connected headsets.
Piper Jaffray analyst Gene Munster believes iOS ecosystem support might be ripe for launch as soon as 2018, due to the aforementioned acquisitions, hires and serious assets earmarked specifically for AR/VR research and development. There's a natural progression from current cutting-edge personal technology — smartphones — to AR/VR devices, which could see mass adoption as wearable devices priced in line with modern handsets. Munster believes Apple is currently looking at VR like it does the Apple Watch, which is to say a peripheral for iPhone. However, he doesn't see the company releasing its own hardware, at least not in the near term.
Sending employees to Stanford's virtual reality lab
It's been revealed on a couple of occasions that Apple has taken interest in Stanford University's Virtual Human Interaction Lab, sending representatives to visit the facility at least three times in three months earlier in 2016. Employees were put through immersive VR experiences, including a project that aims to teach empathy through forced perspective virtual reality interventions. For example, a male subject entering the VR world might be given a female persona and exposed to prejudice.
Modern virtual reality and augmented reality technologies aren't perfect. The major source of physical illness with the technology is motion sickness induced by input lag. Apple's tight integration of software and hardware, down to the iPhone's casing size, can do a great deal to eliminate problems inherent to both AR and VR technology. Input lag can be minimized by leveraging Apple's strict control over the sensors used in a device, as well as managing the communication between the sensors and SDKs — much like Xcode does now for iOS.
Much of the work that Apple needs to do is simply refinement of existing technologies. If Apple were to utilize the open source nature of the HTC Vive for positional tracking in a future full-VR implementation, both the Apple VR and Windows-based VR ecosystems could flourish.
While Apple was an early mover in the PC market, it didn't set the standard — IBM did that in 1981. Apple wasn't the first to release an MP3 player, but it did it better, and won the market in the end. Samsung released its smartwatch a year before the Apple Watch came out, and in every regard the Apple Watch is the superior product, with Samsung floundering with multiple models and operating systems.
Mac Pro Trademark Update
In late April 2017, Apple updated its trademark in Hong Kong for the Mac Pro, adding a number of extra terms the mark applies to. Terms including "wireless communication devices" and "home theatre systems" were included on the list, terms which already exist for the Mac mini and iMac trademarks in the country.
Notably, the list also includes "augmented reality displays, goggles, controllers, and headsets; 3D spectacles." This may be a sign Apple is anticipating some application of AR technology with the new device, though it is not a guarantee that Apple will do so in the future.
Lumus, Quanta, and Catcher
A licensing agreement between Apple manufacturing partner Quanta and AR parts maker Lumus was revealed on December 4. The deal paves the way for the potential manufacturing of an Apple-produced AR headset.
Under the deal, Quanta will make lenses for Lumus, though there is also the option for it to produce components for other technology companies, meaning it may not necessarily involve Apple. Even so, Quanta's involvement does strongly suggest Apple will benefit from the licensing at some point, though Lumus CEO Ari Grobman refused to confirm if it had any connection to Apple.
Grobman advised the deal will help make the "most expensive key enabling technology" for AR glasses more affordable for manufacturers, in turn bringing down the cost of the hardware for consumers. "Quanta has suggested that full AR headsets would be priced for less than the cost of a high-end cell phone," advised Grobman.
Two days later, long-time Apple chassis supplier Catcher Technology confirmed it is looking at products outside its normal range. Though chairman Allen Horng did reveal in December that the company would supply chassis for a new product category, he stopped short of saying what that could be.
Market observers believe this new product could be in the AR and VR field, as an obvious progression from its smartphone and notebook chassis production. Apple's apparent interest in producing an AR headset or a similar device has led analysts to suggest Catcher could be working with the company on its components.