
Apple is believed to be working on virtual reality/augmented reality that could be incorporated into future iOS devices and/or hardware products. There aren’t any concrete details about the products or when they might launch, but the company’s focus in the area has increased over the past several months.

More recently, Apple has confirmed it is interested in augmented reality, by releasing ARKit to developers during WWDC 2017. 

Tim Cook said in a 2016 interview that Apple is “doing a lot of things” in the AR space and has called it a “core technology.” He has even hinted at favoring it over VR. "There's virtual reality and there's augmented reality — both of these are incredibly interesting," Cook said in the interview. "But my own view is that augmented reality is the larger of the two, probably by far." In August, Cook said that "I think AR is extremely interesting and sort of a core technology," adding that "it's something we're doing a lot of things on behind that curtain that we talked about." Then in October, he again touted the benefits of AR over VR: "There's no substitute for human contact," Cook said. "And so you want the technology to encourage that."

In a February 2017 interview, Cook elaborated on his thoughts on AR, calling it a "big idea" concept, like a smartphone. Cook also suggested AR holds more promise than VR, as the latter "closes the world out" while AR keeps it all visible. 

"The smartphone is for everyone, we don't have to think the iPhone is about a certain demographic, or country, or vertical market; it's for everyone. I think AR is that big, it's huge," said Cook. "I get excited because of the things that could be done that could improve a lot of lives. And be entertaining." 

"I view AR like I view the silicon here in my iPhone, it's not a product per se, it's a core technology. But there are things to discover before that technology is good enough for the mainstream." 

A report from the Financial Times in late March claims Apple is "stepping up its efforts" in augmented reality, allocating more resources to the project. It is said that AR has overtaken "Project Titan" -- Apple's self-driving car initiative -- as Apple's next big priority, behind the iPhone.  

 

Apple's "Visual-based inertial navigation" patent

What is virtual reality/augmented reality?

Simply put, virtual reality is a way for users to see a computer-generated world, freely able to look around by moving their head, usually with stereoscopic vision and a way to interact with the digital environment. Typically this technology uses a head-mounted display with a motion tracking mechanism to monitor the head's movement, and a similarly-tracked controller.

Virtual reality, as a concept, has been around for decades, but has yet to see much commercial success. Companies tried, and failed, to make VR a socially acceptable experience in the '90s, but the devices of that era were bulky, heavy, and difficult to use. For example, the Nintendo Virtual Boy, the Japanese game company's attempt at creating such a device for public consumption, was discontinued just one year after its release.

In the last few years, VR has undergone a renaissance, thanks in part to the miniaturization of technology and the processing power of modern computing systems. Hardware such as the Oculus Rift and the HTC Vive make VR relatively accessible, though the cost remains prohibitive for most people unwilling to spend around $1,000 on an early-generation technology.

The improvements in mobile technology have also led to a cheaper way to try out VR, using a smartphone. Google created Cardboard, a head-mounted display that uses cardboard and lenses with a smartphone's display to create a VR experience, while Samsung has pushed forward with its own Gear VR system that follows a similar path.

 

 

Augmented reality is a markedly different concept from VR: instead of a completely virtual environment, the user sees the real world, with computer-generated imagery overlaid on part of their view. This can be used to superimpose data on a person's viewpoint, providing extra information about the things they are looking at.

For example, this could include labels and supporting materials, cut-away side views of an object, or even translated signs overlaying those of a different language. The key here is that AR converges the real and virtual worlds into one singular view.

There are two main ways AR has developed, again at varying levels of cost and practicality. Some companies, such as Microsoft, have come up with headsets that users can see through, rather than looking at a tiny screen. While still in development, these are relatively inaccessible for most people due to the need for expensive hardware, and they typically tether the user to a specific room during use.

Much like VR, there is a smartphone-based alternative. Using a combination of the rear camera of the smartphone and the screen, users can see “through” their mobile device to look at generated images. This concept has been available for a number of years, and is already employed in a number of different ways.

In Pokemon Go, players can see creatures they want to capture on their display, “existing” on a spot on the real-world landscape, with other games and apps performing similar tricks. Some apps can provide visitors to a museum more information about exhibits, or travelers can see road signs translated into their native language.

For this AR technique, all a user requires is a relevant app that offers AR functionality and a smartphone, something that the majority of likely users will already own, giving it an extremely low barrier for entry compared to the hardware-encumbered version of AR, and VR as a whole.

 

ARKit

Revealed at WWDC 2017, ARKit is a developer framework that allows developers to easily add augmented reality features to their apps using existing Apple hardware. The framework works with iOS devices, with iPads used to show off its capabilities on stage during the event. 

A scene from Wingnut AR, a Peter Jackson company, was projected onto a table from the viewpoint of a handheld iPad. As the scene developed, the diorama stayed in the same place on the tabletop while the iPad moved around. Live gameplay powered by Unreal Engine 4 was also featured. 

It appears that ARKit will be limited to newer iOS devices at first, with Apple's developer information specifying compatibility with the A9 and A10 families of processors. This effectively makes ARKit compatible with the iPhone 6s, iPhone 7, and iPhone SE ranges, as well as the iPad Pro lines and the 2017 iPad. 

iOS 11 as announced supposedly supports the iPhone 5s and newer, the iPad Air and newer, and the latest iPod touch, suggesting ARKit may expand its compatibility in the future. 

According to Apple's developer pages, ARKit will use Visual Inertial Odometry (VIO) to accurately track the world, combining camera sensor data with CoreMotion data. It is claimed these two inputs will allow the iOS device to sense how it moves within a room with a high degree of accuracy, without additional calibration. 

ARKit can also analyze a scene presented by the camera view and find horizontal planes in a room. Capable of detecting table surfaces and floor space, it can be used to track and place objects in areas, with its light estimation facility able to apply the correct lighting to virtual objects, to make them fit the real-world scene more closely. 
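As a rough sketch of how those pieces fit together in practice, the snippet below configures an ARKit session for horizontal plane detection and reads the per-frame light estimate. The class and property names follow Apple's published ARKit API; the delegate wiring and print statements are illustrative assumptions, not code from Apple's demos.

```swift
import ARKit

// Minimal sketch: start world tracking with horizontal plane detection
// and light estimation enabled, then observe what ARKit reports.
class PlaneObserver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal      // find tables and floor space
        config.isLightEstimationEnabled = true   // per-frame ambient light data
        session.delegate = self
        session.run(config)
    }

    // Called when ARKit anchors a newly detected surface.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Detected horizontal plane, extent: \(plane.extent)")
        }
    }

    // Each frame carries a light estimate for shading virtual objects
    // so they match the real-world scene more closely.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let light = frame.lightEstimate {
            print("Ambient intensity: \(light.ambientIntensity)")
        }
    }
}
```

Running this requires a physical A9-or-later iOS device, per the compatibility limits described above.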

The developer response to ARKit's tools is "unbelievable," according to Apple VP of worldwide iPod, iPhone, and iOS product marketing Greg Joswiak in a late June interview. Noting quickly developed projects ranging from virtual measuring tapes to an Ikea shopping app, Joswiak said "It's absolutely incredible what people are doing in so little time." 

"I think there is a gigantic runway that we have here with the iPhone and the iPad. The fact we have a billion of these devices out there is quite an opportunity for developers," said Joswiak. "Who knows the kind of things coming down the road, but whatever those things are, we're going to start at zero." 

External enclosure support helping VR development

During WWDC 2017, Apple revealed to developers that macOS High Sierra would include support for external Thunderbolt 3 enclosures, allowing users to use a secondary graphics card to boost the graphical capabilities of the attached Mac.

Alongside a live off-stage demonstration, Apple confirmed it had reached out to developers of popular VR platforms and engines, including Steam VR, Unreal, and Unity, to support VR content creation for Mac desktops, as well as support for Apple's next-generation graphics API, Metal 2. 

Rather than providing a developer toolset for VR in a similar vein to ARKit, Apple has instead opted to provide hardware assistance to developers, in the form of its own external GPU developer's kit. Consisting of a Sonnet eGFX Breakaway Box 350, a Radeon RX 580 GPU, and a USB-C to USB-A hub, the kit costs developers $599 and requires the use of the beta release of High Sierra. 

Notably, full support for external GPUs in High Sierra will arrive in the Spring of 2018. Aside from Apple's own implementation intended for developers, there are a number of other Thunderbolt 3 enclosures on the market, with some capable of working with supported Mac systems, albeit with effort required to get them operational. 

It is also possible for users to add an Nvidia graphics card to their desktop instead of a Radeon card, with Nvidia releasing new drivers for Pascal-class cards in April 2017, ahead of Apple's announcement. 

Of course, in order to test and consume VR content, developers and users will still have to acquire their own VR headset as a separate item. Since launch, hardware such as the Oculus Rift and HTC Vive have been expensive to purchase, though efforts such as Oculus' discounts throughout 2017 have helped to bring this cost down to a more manageable level, as low as $400. 

 

Analysts say AR perfect partnership for Apple

KGI analyst Ming-Chi Kuo told investors in October 2016 he believed Apple’s aptitude for delivering innovative user experiences through human-machine interfaces will help the company move naturally into the AR space. 

Just as iPod helped pave the way for the iPhone, the iPhone may be able to provide the necessary building blocks for a full-blown AR solution. Kuo didn’t provide details on what this might look like. One example might be Apple testing the waters with a system like the iOS game Pokemon Go, which uses the iPhone’s camera and display to provide users with a seamless AR experience. 

In general, Kuo sees Apple integrating AR to redefine key product lines, perhaps leapfrogging competitors by three to five years. For example, augmented user interfaces could drastically change the way users interact with Apple Watch and Apple TV, eliminating obstacles like small screens and clunky controls.

At the same time, Apple might leverage AR tech to break into other fields, Kuo said. One such area of interest is automotive technology, or more specifically autonomous driving systems. Apple was widely rumored to be working on a self-driving car, dubbed "Project Titan," since March of 2015, but recent reports claim the company has abandoned those plans. Instead of a full-fledged car, Apple is scaling back its ambitious project to focus on underlying technology. 

The notion that AR is one of the next big technologies to be embraced in Apple's products is also held by Steven Milunovich, an analyst for UBS. A note to investors in late February 2017 hyped the potential for AR, citing an interview with a developer suggesting it could make the current smartphone experience seem like "the dark ages."

"Thanks to advanced cameras, consumers will hold their phones up with images superimposed onto the screen in cars, rooms, or walking down the street," wrote Milunovich. "3D mapping through Simultaneous Localization and Mapping (SLAM) will be key."

Citing sources claiming Apple has over 1,000 engineers in Israel working on AR-related technology, the analyst expects Apple to offer advanced AR applications in new ways, with supporting technology arriving in future iPhone iterations. 

Milunovich believes Apple will slowly roll out the technology over the coming years, though it could bypass the competition with a "superior user experience," possibly drawing more Android users over to iOS in the process. In the near future, the rumored "iPhone 8" is said to include "moderate 3D mapping using stereoscopic vision," while developers could be provided an AR software development kit by Apple as soon as this year. 

 

AR glasses rumored to debut in 2017 or 2018

Rumors of Apple's intent to enter the augmented reality hardware space gained traction in January, as a report from AR/VR evangelist Robert Scoble claimed the company is partnering with optics manufacturer Carl Zeiss on a pair of lightweight augmented reality/mixed reality glasses. 

Further, Scoble says the partnership explains why the Zeiss booth at CES 2017, located in the middle of the AR section, had no AR, VR or mixed reality optics to demonstrate. The theory is that Apple muzzled the company until the supposed tie-up is announced —or falls through. 

Sources of the Financial Times suggest a release could come later than Scoble's claim, with a retail launch of an Apple-produced AR product more likely to take place in 2018 than this year, and an announcement not expected anytime soon. 

Aside from ARKit for iOS devices, Apple has yet to publicly announce any AR-specific hardware intentions. 

 

Refresh rates and the "iPhone 8"

In March, developer Steven Troughton-Smith discovered code in a pre-release beta for iOS 10.3 that would allow an app to specify the device's screen refresh rate, potentially signifying a display with a higher refresh rate may be on the way in a future launch. While this can help improve the experience of using the Apple Pencil in a refreshed iPad Pro, it also has applications in VR. 

During WWDC 2017, it was revealed the code powers the ProMotion feature of the 2017 iPad Pro displays. The function allows the iPad Pro to raise its refresh rate to 120Hz, double that of the previous generation, allowing for more fluid drawing with the Apple Pencil and reducing input latency to 20ms. 
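The public face of that code is `CADisplayLink.preferredFramesPerSecond`, introduced in iOS 10, which lets an app request a callback rate that a 120Hz ProMotion panel can honor. A minimal sketch, with the render loop body left as an assumption:

```swift
import UIKit

// Sketch of requesting a higher refresh rate via CADisplayLink.
// On non-ProMotion hardware the system falls back to the nearest
// supported rate, so requesting 120 is safe on 60Hz displays.
class RenderLoop {
    private var link: CADisplayLink?

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step))
        link.preferredFramesPerSecond = 120  // ask for the panel's maximum
        link.add(to: .main, forMode: .common)
        self.link = link
    }

    @objc private func step(_ link: CADisplayLink) {
        // Draw one frame here; link.targetTimestamp indicates when
        // the frame is expected to reach the display.
    }
}
```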

While the feature was presented at the event primarily for drawing and battery-saving applications, the same technology could also be used to help Apple's VR and AR projects. 

Virtual reality relies on high refresh rate displays to make motion as fluid as possible. Lower refresh rates mean fewer updates to the image the user sees, making movements seem choppier and destroying the "illusion" for the user. 

Current iPhones use an LCD panel with a refresh rate of 60Hz, which is acceptable for the majority of smartphone uses. The rumored "iPhone 8," expected to be introduced this fall, is currently thought to have an OLED display, a technology that has a far lower response time than LCD, and has the potential to be run at far higher refresh rates. 

This switch to OLED theoretically makes the "iPhone 8" an extremely good candidate for use with VR or AR, when used in a Cardboard or Gear VR-style system. 

One other rumored "iPhone 8" feature that may help with VR and AR is the laser-based 3D scanning system supposedly being used for facial recognition. The technology, effectively a miniaturized LIDAR mapper or rangefinder, could also be used to scan the environment if it is also mounted on the back of the device. 

For VR, scanning the local area potentially allows the user to "see" potential obstructions while their view is obscured by a head-mounted display accessory. More likely would be its use in AR, which could help give apps direct positioning data for items, allowing for labels and other interactive elements to be correctly placed "on" an object of interest. 

 

Apple virtual reality/augmented reality hiring 

Two Apple hires in late 2016 suggested the company was getting serious about building out its own virtual and augmented reality technologies, though it had some catching up to do as Google, Facebook and Microsoft forge ahead with mature projects.

Zeyu Li, a former Magic Leap employee, joined Apple as a Senior Computer Vision Algorithm Engineer, reported Business Insider. According to Li's LinkedIn profile, he worked first as Lead 3D Engineer, then as Principal Engineer.

Apple's second hire came from Facebook's Oculus. Yury Petrov, who worked as a research scientist at the VR firm since 2013, took an identical position at Apple in June. Petrov's LinkedIn profile said his job at Oculus entailed “psychophysical and physiological studies of visual and multisensory experience of virtual reality (VR) including user experience factors in head-mounted displays (HMD).”

A March 2017 report from sources of Bloomberg suggests Apple has filled out the rest of the team with high-profile individuals from a number of other major companies. 

The person said to be heading up the team is Mike Rockwell, a 2015 hire who previously led the hardware and new technologies groups at Dolby. Rockwell is believed to be reporting to Dan Riccio, senior VP of Hardware Engineering. 

Fletcher Rothkopf, one of the designers of the original Apple Watch, was allegedly assigned to work for Rockwell in spring 2016, alongside THX audio standard creator Tomlinson Holman. 

Former lead engineer of Amazon's Lumberyard game engine Cody White is also said to be on the team, as well as Duncan McRoberts. McRoberts was previously a director of software development at Meta, a firm that produces high-end AR glasses. 

Other members of the team are said to include iPhone, camera, and optical lens engineers, along with 3D animation veterans who previously worked on special effects for movies. Apple has reportedly managed to hire some employees away from Weta Digital, known for work on the "Lord of the Rings" films, with the new hires thought to be working from a new office in Wellington, New Zealand. 

In April 2017, it was reported Apple had hired Tim Dashwood, the creator of a number of plugins for video editing software that help content creators produce media for VR headsets. It is likely that Apple brought Dashwood onboard to work on its AR and VR teams, possibly to help produce software in the field.

Dashwood is known for creating the 360VR Toolbox package, a $1,000 kit for making 360-degree videos specifically for Oculus Rift VR headsets, as well as other Final Cut Pro X plugins. Since his hiring, the plugins have been made available for free.  

A report in April also claims Apple has tapped NASA for one member of staff for its AR team. Dr. Jeff Norris, an AR and VR specialist who founded the Mission Operations Innovation Office at NASA's JPL, is said to have been hired by Apple earlier in 2017, and is believed to have taken the role of senior manager on the AR team. 

Norris has worked on a number of AR/VR and robotics solutions while at the JPL, with some featuring Microsoft's HoloLens goggles. Project Sidekick allowed astronauts on the International Space Station to communicate with home base, while OnSight allowed the virtual exploration of Mars, with the project featured in NASA's "Destination: Mars" museum exhibition. 

The engineer also founded JPL Ops Lab, a part of the space agency that worked on developing human-system interfaces. The lab helped create ways to control robots with tablet interfaces, used motion tracking to manipulate robot arms, and developed ways to interact with holograms. 

 

Acquisitions 

Apple has made a number of key acquisitions in the virtual reality/augmented reality field that further hint at developments in this realm.

Most recently, Apple purchased both Emotient -- a company that builds tools for facial expression analysis -- and Flyby Media in 2016. Emotient's tools are used to capture direct emotional responses from customers, and have been used in marketing and advertising. They have also been tested in medical settings to measure pain levels. Flyby focuses on augmented reality projects. 

Apple also acquired motion capture specialist Faceshift and German AR firm Metaio in 2015, as well as PrimeSense in 2013.

Apple was initially mum about the Faceshift purchase, eventually admitting to being behind the mysterious acquisition but declining to offer details. The buyout helped Apple continue to build out its portfolio of facial recognition technologies. It has been similarly quiet about the Metaio acquisition. 

The PrimeSense acquisition sparked rumors that motion-based capabilities may be in store for the Apple TV, as well as an iPad app for 3D printing. PrimeSense's 3D depth technology and motion sensing capabilities were used in Microsoft's Kinect platform. 

In June 2017, reports surfaced claiming Apple had acquired German eye tracking hardware producer SensoMotoric Instruments. Technology from the company is used in an array of applications, ranging from augmented reality to medical, such as early autism detection in children, brain mapping, and neurology. 

SensoMotoric's main eye tracking technology can monitor the wearer's gaze in real time, at up to 120 times per second. This technology has the potential to reduce input lag, an issue that causes AR and VR users to suffer motion sickness through a mismatch between the user's shift in perspective and their perception of movement. 

 

Patents 

In addition to ongoing in-house research and development, Apple holds a variety of patents covering a gamut of augmented reality applications, including transparent displays, mapping solutions and iPhone-powered virtual displays.

In November 2016, Apple obtained a patent detailing an augmented reality mapping system that harnesses iPhone hardware to overlay visual enhancements onto live video, lending credence to recent rumors suggesting the company plans to implement an iOS-based AR strategy in the near future. 

Apple's U.S. Patent No. 9,488,488 for "Augmented reality maps" describes a mapping app capable of tapping into iPhone's advanced sensor suite to present users with real-time augmented views of their surrounding environment. 

Apple was also granted a patent detailing a method of device localization -- mapping -- using computer vision and inertial measurement sensors, one of the first inventions to be reassigned from the acquisition of AR startup Flyby Media.

Two patents that surfaced in January 2017 stem from Apple's acquisition of German AR firm Metaio. One relates to the hardware framework for an AR device with enhanced computer vision capabilities, with power-efficient object recognition being a main focus for the patent. 

The second, a method for "representing virtual information in a real environment," details a way to label points of interest in an AR environment, taking occlusion perception into account. Using a combination of depth sensing and positioning data, Apple's system would be able to show only labels for points of interest that can be seen by the user, hiding those for places out of view behind a wall or a building, for example. 

In April 2017, the U.S. Patent and Trademark Office published a patent application from Apple revealing one potential use for AR. Describing a "Method and device for illustrating a virtual object in a real environment," the application effectively covers how to faithfully represent a digital object in an image or video feed. 

The technique uses the camera to capture a two-dimensional image of the environment, and then works out the camera's position relative to at least one in-image component. Working from that, the system then collects three-dimensional image and spatial information, such as the position of wall and floor planes, using a combination of depth mapping, radar, stereo cameras, and other techniques. 

With this data, the system is then able, within a defined area, to superimpose virtual objects into the scene, showing how they would fit in with other objects in the area. The same system can also be used to "remove" real-world objects from the image, potentially giving more space for the virtual image to sit. 

This sort of object removal and insertion lends itself handily to furniture sales, allowing homeowners to see what a new sofa or table would look like in a room, while "taking out" the old items. There is also the possibility of using it with semi-transparent displays, with other applications including within head-mounted displays and for AR-based navigation systems within cars. 

 

EPGL & Apple partnership for AR iOS apps

Medical supply company EPGL, in conjunction with Apple, is utilizing its intellectual property to develop iOS apps that project an image around the perimeter of a contact lens for use in AR applications. 

The system requires low power, can be adjusted quickly, and can be incorporated into the elastic material of a contact lens. The lenses may utilize a prism to redirect the image onto the retina, potentially aiding those with vision cuts, where part of the user's vision is absent or restricted due to stroke or another malady. 

This AR tech avoids the stigma of bulkier apparatuses like Google Glass, which was banned in some places because of the possibility of covert surveillance by a wearer.  

 

Testing injuries

A report from an Apple Environment Health and Safety contractor, leaked to Apple staff in April, may be proof that the company is working on AR glasses or a VR headset. The report is said to have detailed a number of eye-related injuries suffered while testing a new prototype. 

An incident on February 21 involved "medical treatment beyond first aid" for a person at Apple's De Anza office in Cupertino, Calif., after the user advised the study lead that she experienced discomfort in her eye, and "was able to see a laser flash at several points during the study." The study lead then secured the prototype for analysis, while the user was referred to an optometrist. 

A second issue on March 2 at the company's Vallco Parkway office in Cupertino also involved an employee's complaints about eye pain after working on a prototype, with the pain possibly "associated with use" of the hardware, the report states. The employee "noticed that the security seal on the magenta (outer) case had been broken and had thought the unit may have been tampered with." 

While the injuries could have been suffered by a number of different technologies that interact with a user's eyes, such as iris scanning or 3D facial recognition, one source of Gizmodo within Apple suggested it could be linked to Apple's AR work. 

 

Why an Apple virtual headset is unlikely 

Although patents may hint otherwise, an Apple virtual reality headset is unlikely: such devices are too niche. Instead, Apple will more likely create a platform where developers can tap into its hardware and software to create VR experiences, whether through simple apps or connected headsets.

Piper Jaffray analyst Gene Munster believes iOS ecosystem support for AR/VR might be ripe for launch as soon as 2018, due to the aforementioned acquisitions, hires, and serious assets earmarked specifically for AR/VR research and development. There's a natural progression from current cutting-edge personal technology — smartphones — to AR/VR devices, which could see mass adoption as wearable devices priced in line with modern handsets. Munster believes Apple is currently looking at VR like it does the Apple Watch, which is to say as a peripheral for the iPhone. However, he doesn't see the company releasing its own hardware, at least not in the near term.

 

Sending employees to Stanford's virtual reality lab

It's been revealed on a couple of occasions that Apple has taken interest in Stanford University's Virtual Human Interaction Lab, sending representatives to visit the facility at least three times in three months earlier in 2016. Employees were put through immersive VR experiences, including a project that aims to teach empathy through forced perspective virtual reality interventions. For example, a male subject entering the VR world might be given a female persona and exposed to prejudice. 

 

Future possibilities 

Modern virtual reality and augmented reality technologies aren't perfect. The major source of physical discomfort for users is motion sickness induced by input lag. Apple's tight integration of software and hardware, down to the iPhone's casing size, could do a great deal to eliminate problems inherent in both AR and VR technology. Input lag can be minimized by leveraging Apple's strict control over the sensors used in a device, as well as managing the communication between the sensors and SDKs — much like Xcode does now for iOS.

Much of the work that Apple needs to do is simply refinement of existing technologies. Should Apple utilize the open-source nature of the HTC Vive's positional tracking in a future full-VR implementation, both the Apple and Windows-based VR ecosystems could flourish.

While Apple was among the first to market with a PC, it didn't set the standard — IBM did that in 1981. Apple wasn't the first to release an MP3 player, but it did it better, and won the market in the end. Samsung released its smartwatch a year before the Apple Watch came out, and in every regard the Apple Watch is the superior product, with Samsung floundering with multiple models and operating systems.

 

Mac Pro Trademark Update

In late April 2017, Apple updated its trademark in Hong Kong for the Mac Pro, adding a number of extra terms the mark applies to. Terms including "wireless communication devices" and "home theatre systems" were added to the list, terms which already exist for the Mac mini and iMac trademarks in the country. 

Notably, the list also includes "augmented reality displays, goggles, controllers, and headsets; 3D spectacles." This may be a sign Apple is anticipating some application of AR technology with the new device, though it is not a guarantee that Apple will pursue one. 
