Simply put, augmented reality (AR) superimposes a digital image on top of a view of the "real world," giving the viewer the illusion that an object is in front of them when in reality it doesn't exist. For example, an iPad's rear camera can show a feed of a room while the screen shows a non-existent chair placed within the environment, giving the illusion that it exists, albeit on-screen. 

While apps have experimented with AR in the past, Apple's ARKit is meant to facilitate the development of AR apps, with Apple handling some of the issues that would be most challenging for a developer, such as light matching, surface detection, and motion tracking.  

Initial Reveal

Revealed at WWDC 2017, ARKit is a framework that allows developers to easily create augmented reality features for their apps using existing Apple hardware. Demonstrated on stage, the framework works with most iOS devices, with the larger screens of iPads, rather than iPhones, used to show off its capabilities during the event. 

A scene from Wingnut AR, a Peter Jackson company, was shown on a tabletop from the viewpoint of a handheld iPad, with the scene including buildings, characters, and various other objects. As the scene developed, the diorama stayed in the same place on the tabletop while the iPad moved around, demonstrating the tracking capabilities of ARKit. 

Live gameplay powered by Unreal Engine 4 that used ARKit was also featured. 

What ARKit does

According to Apple's developer pages, ARKit uses Visual Inertial Odometry (VIO) to accurately track the world, combining camera sensor data with CoreMotion data. It is claimed these two inputs will allow the iOS device to sense how it moves within a room with a high degree of accuracy, without additional calibration. 

The rear camera is used to capture a live feed of the surroundings, detecting changes in the image as the angle of the iOS device changes. This is combined with the movements detected within the CoreMotion data to recognize the overall movement and viewing angle of the iOS device in relation to the environment. 
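In practice, starting that combined camera-and-motion tracking takes very little code. The following is a minimal sketch (the class and variable names are illustrative, not Apple's sample code) of running a world-tracking session and reading the device pose ARKit reports each frame:

```swift
import UIKit
import ARKit

// Minimal world-tracking sketch: ARKit fuses the camera feed with
// CoreMotion data and reports the device's pose on every frame.
class TrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called once per frame; the camera transform holds the device's
    // position and orientation relative to where the session began.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let position = frame.camera.transform.columns.3
        print("Device position: \(position.x), \(position.y), \(position.z)")
    }
}
```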

ARKit can also analyze a scene presented by the camera view and find horizontal planes in a room, such as a table or a floor, which can then be used to place digital objects. Again, using the camera data in tandem with movement information, it can see potential horizontal planes within the camera's view, and can keep track of their positions even when a plane temporarily moves out of frame. 
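A sketch of how that plane detection is exposed to developers (again with assumed names, using a plain session delegate rather than a rendering view): the app opts in to horizontal plane detection, and ARKit reports each surface it finds as an ARPlaneAnchor.

```swift
import ARKit

// Sketch of horizontal plane detection: detected surfaces arrive as
// ARPlaneAnchor objects and are refined as the device keeps looking.
class PlaneFinder: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // tables, floors, etc.
        session.delegate = self
        session.run(configuration)
    }

    // Fired when ARKit adds new anchors, including any detected planes.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Plane at \(plane.center), roughly \(plane.extent.x) x \(plane.extent.z) metres")
        }
    }
}
```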

The system also uses the camera sensor to estimate the light of the environment. This data is useful in applying lighting effects to virtual objects so they match the real-world scene closely, furthering the illusion that the virtual item is actually in the real world. 
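A short sketch of how that estimate might be applied (names assumed; an ARSCNView can also handle this automatically): the per-frame ARLightEstimate is read and copied onto a SceneKit light.

```swift
import ARKit
import SceneKit

// Sketch: copy ARKit's per-frame light estimate onto a SceneKit light so
// virtual objects roughly match the brightness and warmth of the room.
class LightingUpdater: NSObject, ARSessionDelegate {
    let ambientLight = SCNLight()

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        ambientLight.intensity = estimate.ambientIntensity          // lumens; ~1000 in a well-lit room
        ambientLight.temperature = estimate.ambientColorTemperature // Kelvin
    }
}
```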

ARKit is also capable of performing face tracking, using the iPhone X's front-facing TrueDepth camera. By creating a face mesh based on data from the TrueDepth camera, it is possible to add effects to the user's face in real time, such as applying virtual makeup or other elements for a selfie. 

The same system can also be used for animating a 3D model or avatar, such as Apple's own Animoji feature. 
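Both uses hang off the same data, as in this sketch (names assumed): an ARFaceTrackingConfiguration session delivers an ARFaceAnchor whose mesh can carry overlay effects and whose blend-shape values can drive an avatar.

```swift
import ARKit

// Sketch of TrueDepth face tracking: the face anchor exposes a live mesh
// (for overlaying effects) and blend-shape coefficients (for driving an
// Animoji-style avatar), each updated every frame.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }  // TrueDepth camera required
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("Mesh vertices: \(face.geometry.vertices.count), smile: \(smile)")
        }
    }
}
```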

Developer Reaction and Initial Response

The developer response to ARKit's tools is "unbelievable," according to Apple VP of worldwide iPod, iPhone, and iOS product marketing Greg Joswiak in a late June interview. Noting quickly developed projects ranging from virtual measuring tapes to an Ikea shopping app, Joswiak said, "It's absolutely incredible what people are doing in so little time." 

"I think there is a gigantic runway that we have here with the iPhone and the iPad. The fact we have a billion of these devices out there is quite an opportunity for developers," said Joswiak. "Who knows the kind of things coming down the road, but whatever those things are, we're going to start at zero." 

The Apple developer community started to show off their creations very shortly after the launch of ARKit itself, revealing the relative ease of learning and using the AR platform. 

An early example was an app from SmartPicture used to measure a kitchen or other room with the iPhone's rear camera. While impressive, the developer advised that measuring accuracy would improve over time. 

Steve Lukas created a video to show off the promotional possibilities of ARKit at the Disney D23 Expo. The brief video shows the front of the conference center, with a digital Mickey Mouse placed outside its entrance. 

Other examples include applications in urban planning, tabletop gaming, and furniture placement. On the more frivolous side, Trixi Studios developed an interactive app based on '80s band A-Ha's "Take on Me" music video, which allowed the user to "step through" the frame into the drawn world. 

At the end of August, more demonstrations of what ARKit can do were revealed, showing how quickly developers have gotten to grips with the tools. Alongside a sculpting app that used the Apple Pencil, a more impressive proof of concept came from Kabaq, which showed near-photographic quality food in AR on a plate, something that could help restaurant diners see not only what a potential order would look like, but also its portion size. 

After seeing developers embrace ARKit in such a short space of time, Apple took the initiative to promote developers' work by inviting them to meet the press at its Cupertino headquarters at the end of August. The event, meant to promote the utility of ARKit, featured demonstrations of a "Walking Dead" game called Our World as well as an AR experience based on the book The Very Hungry Caterpillar. 

Announced at the end of November, the first public art exhibition using ARKit will be at the Perez Art Museum Miami starting from December 5. The exhibition, powered by Apple hardware and development tools, will feature works by Miami-based artist Felice Grodin, and is funded through a grant from the John S. and James L. Knight Foundation. 

More digital artworks will debut in early 2018, with one arriving in January and another in the spring. The show itself will run at PAMM until April 2018. 

Months after it became available to developers, a survey by app market intelligence firm Apptopia suggests interest in ARKit is waning, with the number of new apps using ARKit falling sharply after its September debut, hitting a new low in November, and rebounding slightly in December. Its figures also claim there are still fewer than 1,000 apps with ARKit capabilities in the App Store, collectively achieving more than 3 million downloads. 

Of the ARKit apps, games make up 30 percent of the total collection, with entertainment and utilities making up 13.2 percent and 11.9 percent respectively. 


Development Resources

On August 29, Apple updated its developer portal with more information about ARKit, adding more AR-specific entries to its Human Interface Guidelines for iOS. Apple explains in the developer document that AR apps should use the entire display and minimize onscreen clutter, that developers should consider physical constraints and user comfort when creating interactive AR apps, and other important best practices to follow. 

New assets in the portal included sample code for creating AR experiences, including examples on using audio and interactive content. 

Apple's ARKit documentation includes guides on building a developer's first AR experience, handling 3D interaction and UI controls in AR, handling audio in AR, and creating face-based AR experiences. 

Apple has also added lessons in its Swift Playgrounds app, giving younger and inexperienced developers an introduction to the framework. The "Augmented Reality" Challenge teaches users commands to enable the iPad's camera, to handle when planes are detected, to place a character on a plane, and to make the scene interactive. 

In October, Apple updated its developer webpage with a brief notice, urging app producers to use its ARKit framework and to create and promote augmented reality apps. The message outlined key resources for developers to use for creation and marketing, including interface design guidance, using the iPhone X's TrueDepth camera, and a reference guide to creating app previews for AR software for the App Store. 

While ARKit has optimizations for Metal and SceneKit, it can also be incorporated into third-party tools. 
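The SceneKit route is the simplest: ARSCNView pairs an ARSession with a SceneKit scene, so nodes added to the scene are drawn over the camera feed and stay locked to the real world. A sketch (the object and its placement here are arbitrary examples):

```swift
import UIKit
import ARKit
import SceneKit

// Sketch: ARSCNView renders SceneKit content over the live camera feed
// and keeps it anchored in the real world as the device moves.
let sceneView = ARSCNView(frame: UIScreen.main.bounds)
sceneView.scene = SCNScene()

// Place a 10 cm virtual cube half a metre in front of the session origin.
let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
cube.position = SCNVector3(0, 0, -0.5)
sceneView.scene.rootNode.addChildNode(cube)

sceneView.session.run(ARWorldTrackingConfiguration())
```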

ARKit has received a boost from Epic, the creators of the Unreal Engine, with the game engine including basic support for ARKit as part of an early August update. Alongside other VR-related improvements, the update includes an "early, experimental" implementation of ARKit. 

The Unreal Engine is well known for being used by developers to create games, and in recent years it has become easier for developers to get started with the platform. By adding ARKit support, Epic has opened up Apple's AR platform to a wider collection of developers, who can quickly use the features in an engine they already know how to use. 

Amazon unveiled a new app platform for VR, AR, and 3D content called Sumerian on November 27. The platform, which does not require specialized 3D or programming knowledge, allows users to build and publish apps entirely on the web, with support for ARKit included as part of the framework. 

Apple will be hosting an "Introduction to ARKit" session at this year's Game Developers Conference in March. The session aims to introduce attendees to the core concepts of the ARKit framework, including the basic principles and associated API, how to start using the framework's tracking and scene-understanding capabilities, and how to integrate it with game engines. 

In mid-February 2018, Apple published a new webpage dedicated to augmented reality for iOS. Featuring sections for Productivity, Learning, and Play, the site highlights notable AR apps for each area, along with an explanation of what iOS 11 and ARKit can do using the improved hardware found in the iPhone 8 and X. 

Compatibility

ARKit is limited to newer iOS devices, with Apple's developer information specifying compatibility with the A9, A10, and A11 families of processors. This effectively makes ARKit compatible with the iPhone 6s, iPhone 7, and iPhone SE ranges, as well as the iPad Pro lines and the 2017 iPad. 

The iOS 11 release, as announced, supports the iPhone 5s and newer, the iPad Air and newer, and the latest iPod touch, suggesting ARKit could expand its compatibility in the future. 
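Because iOS 11 runs on hardware that ARKit does not support, apps are expected to check for support at runtime before offering AR features. A minimal sketch:

```swift
import ARKit

// World tracking needs an A9 chip or newer, so check support at runtime
// before exposing AR features to the user.
if ARWorldTrackingConfiguration.isSupported {
    // A9 or later: offer the full AR experience.
} else {
    // Older iOS 11 devices: hide AR features or fall back to a 2D view.
}
```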

Version 1.5

As part of its preview for iOS 11.3, due to release in the spring of 2018, Apple revealed ARKit would be updated to version 1.5, among other features arriving in the operating system update. 

As part of the enhancement to ARKit, apps would be able to recognize walls and doors and place virtual objects on them, extending horizontal plane discovery to vertical surfaces. ARKit would also benefit from improvements to its surface mapping to tackle irregularly shaped surfaces. 
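Based on the iOS 11.3 beta API, opting in to vertical surfaces is expected to be a one-line change to the session configuration, as in this sketch:

```swift
import ARKit

// ARKit 1.5 sketch (iOS 11.3 beta API): plane detection can now look for
// vertical surfaces such as walls in addition to horizontal ones.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]

let session = ARSession()
session.run(configuration)
```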

The real-world view from the rear camera used in the AR view would also be updated to have a 50 percent greater resolution. 

Apple included ARKit 1.5 in the first developer beta of iOS 11.3, and developers have already started to take advantage of its advances. Image recognition of book covers, examples of future movie posters, and other uses were created and shared by developers within days of its release. 
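Those recognition demos rely on ARKit 1.5's new image detection. A sketch based on the iOS 11.3 beta API (the asset catalog group name here is an assumption):

```swift
import ARKit

// ARKit 1.5 sketch (iOS 11.3 beta API): reference images bundled in an
// asset catalog group, such as a book cover or poster, are reported as
// ARImageAnchor objects when the camera sees them.
let configuration = ARWorldTrackingConfiguration()
if let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",  // hypothetical group name
                                                 bundle: nil) {
    configuration.detectionImages = images
}

let session = ARSession()
session.run(configuration)
```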

On February 21, Apple updated its Human Interface Guidelines for iOS with new entries addressing AR, including best practices for creating apps using ARKit 1.5.