
ARKit is Apple's framework and software development kit that developers use to create augmented reality games and tools. ARKit was announced to developers at the 2017 WWDC, and demos showed developers how they could create AR experiences for their apps. 

ARKit is currently on its third iteration, and is referred to by Apple as ARKit 3.

What is augmented reality?

Augmented reality -- often abbreviated as AR -- is the practice of superimposing computer-generated imagery onto the "real world," usually via a smartphone's camera and screen. People often compare AR to VR -- or virtual reality. Virtual reality, however, is the creation of a fully simulated environment that a user can "step into."

Current examples of augmented reality are games like Pokemon Go, which features an AR mode that allows users to hunt, photograph, and catch Pokemon via their smartphone's camera. Social media platforms such as Instagram and Snapchat use augmented reality filters to encourage users to share photographs and videos.

ARKit is also the framework that powers the iOS app Measure.

Some retailers allow users to preview items in their homes via augmented reality as well.

While AR is primarily confined to smartphones and tablets right now, there has been a surprising amount of evidence pointing to Apple developing an AR headset and AR glasses.

What ARKit does

According to Apple's developer pages, ARKit uses Visual Inertial Odometry (VIO) to accurately track the world, combining camera sensor data with CoreMotion data. These inputs allow the iOS device to sense how it moves within a room with a high degree of accuracy, eliminating the need for additional calibration. 
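On the code side, starting a world-tracking session takes only a few lines. The following is a minimal sketch, assuming an `ARSCNView` property named `sceneView` that has been wired up in a view controller:

```swift
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking combines camera imagery with CoreMotion
        // data (ARKit's VIO) to follow the device's position and
        // orientation in the room.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
```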

The device's camera is used to capture a live feed of the surroundings, tracking differences in the picture as the angle of the iOS device changes. Combined with the movements detected within the CoreMotion data, ARKit recognizes the motion and viewing angle of the iOS device relative to the environment. 

ARKit is capable of finding horizontal planes in a room, such as a table or a floor, which can then be used to place digital objects. It also keeps track of their positions when they move out of frame temporarily. 
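Plane detection is opt-in on the session configuration, and detected planes are delivered as anchors. A sketch, assuming the view controller is the `ARSCNView`'s delegate:

```swift
import ARKit

// Enable horizontal plane detection before running the session.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)

// ARSCNViewDelegate callback: fired when ARKit adds an anchor
// for a newly detected surface.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    print("Found a plane with extent \(planeAnchor.extent)")
    // Digital objects can now be positioned on this plane.
}
```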

The system also uses the camera sensor to estimate the light of the environment. This data is useful in applying lighting effects to virtual objects, helping to match the real-world scene closely, furthering the illusion that the virtual item is actually in the real world. 
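The light estimate is exposed on every `ARFrame`, so an app can keep its virtual lighting in sync frame by frame. A sketch, assuming the class is the session's delegate and holds a `sceneView` property:

```swift
import ARKit

// ARSessionDelegate callback: apply ARKit's light estimate to the
// scene so virtual objects match the real-world lighting.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let lightEstimate = frame.lightEstimate else { return }
    // ambientIntensity is in lumens; ~1000 corresponds to a
    // well-lit scene, so normalize against that baseline.
    sceneView.scene.lightingEnvironment.intensity =
        lightEstimate.ambientIntensity / 1000
}
```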

ARKit can also track a user's face via the iPhone's TrueDepth camera. By creating a face mesh based on data from the TrueDepth camera, it is possible to add effects to the user's face in real time, such as applying virtual makeup or other elements for a selfie. The Animoji feature also uses this system.
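Face tracking uses its own configuration, and each tracked face arrives as an `ARFaceAnchor` carrying the mesh plus blend-shape coefficients, the same kind of data Animoji builds on. A sketch:

```swift
import ARKit

// Face tracking only works on TrueDepth-equipped devices.
guard ARFaceTrackingConfiguration.isSupported else { return }
sceneView.session.run(ARFaceTrackingConfiguration())

// ARSCNViewDelegate callback: read blend-shape values as the
// user's expression changes.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }
    let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
    print("Smile amount: \(smile)")
}
```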

Apple's ARKit documentation includes guides on how to build a developer's first AR experience, dealing with 3D interaction and UI controls in AR, handling audio in AR, and creating face-based AR experiences. 

Apple has also added lessons in its Swift Playgrounds app, giving younger and inexperienced developers an introduction to the framework. The "Augmented Reality" Challenge teaches users commands to enable the iPad's camera to handle when planes are detected, to place a character on a plane, and to make the scene interactive. 

While ARKit is optimized for Apple's Metal and SceneKit frameworks, it is also possible to incorporate it into third-party tools. 

Added in ARKit 3

ARKit 3 brought quite a bit of fine-tuning to the system, employing machine learning and improved 3D object detection. Complex environments can now be tracked more accurately for image placement and measurement.

People Occlusion has been added with the advent of ARKit 3. This allows a device to track people and allow digital objects to pass seamlessly behind or in front of them as needed.
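People Occlusion is enabled through a session configuration's frame semantics, with a hardware capability check first. A sketch:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
// People Occlusion requires an A12 Bionic chip or newer,
// so check for support before opting in.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Virtual content will now render behind people who stand
    // in front of it, based on estimated depth.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
sceneView.session.run(configuration)
```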

It also brought Motion Capture, giving the device the ability to understand body position, movement, and more. This enables developers to use motion and poses as input to an AR app.
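Motion Capture delivers a tracked person as an `ARBodyAnchor` with a full skeleton whose joint transforms an app can read as input. A sketch, assuming the class is the session's delegate:

```swift
import ARKit

// Body tracking requires an A12 Bionic chip or newer.
guard ARBodyTrackingConfiguration.isSupported else { return }
sceneView.session.run(ARBodyTrackingConfiguration())

// ARSessionDelegate callback: read joint positions from the
// tracked skeleton as the person moves.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        let skeleton = bodyAnchor.skeleton
        // Joint transforms are relative to the body anchor's root.
        if let headTransform = skeleton.modelTransform(for: .head) {
            print("Head position: \(headTransform.columns.3)")
        }
    }
}
```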

ARKit 3 allows for simultaneous front and back camera tracking. Users can now interact with AR content from the back camera by using facial expressions and head positioning. 

Up to 3 faces can be tracked with ARKit Face Tracking, using the TrueDepth camera on the iPhone X, iPhone XS, iPhone XS Max, iPhone XR, iPad Pro, and the iPhone 11 line. 
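Both of these ARKit 3 capabilities are configured with a line or two. A sketch showing simultaneous front and back camera tracking alongside multi-face tracking:

```swift
import ARKit

// Add TrueDepth face data to a back-camera world-tracking session,
// where the hardware supports it.
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    worldConfig.userFaceTrackingEnabled = true
}

// A dedicated face-tracking session can follow several faces at
// once; ARKit 3 reports up to three on supported devices.
let faceConfig = ARFaceTrackingConfiguration()
faceConfig.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
```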

Developer Reaction and Initial Response

The developer response to ARKit's tools is "unbelievable," according to Apple VP of worldwide iPod, iPhone, and iOS product marketing Greg Joswiak in a late June interview. Noting quickly developed projects ranging from virtual measuring tapes to an Ikea shopping app, Joswiak said, "It's absolutely incredible what people are doing in so little time." 

"I think there is a gigantic runway that we have here with the iPhone and the iPad. The fact we have a billion of these devices out there is quite an opportunity for developers," said Joswiak. "Who knows the kind of things coming down the road, but whatever those things are, we're going to start at zero." 


ARKit was released alongside iOS 11, meaning devices capable of running iOS 11 could utilize its AR features, provided they had an A9 processor or newer. 

ARKit 3 still has features that are compatible with devices running iOS 11, but its more advanced features are restricted to devices with an A12 Bionic chip or better. The iPhone XR, iPhone XS, and iPhone XS Max feature the A12 Bionic chip, while the three phones in the iPhone 11 line all feature the A13 Bionic chip. iPads released after the third-generation iPad Pro feature the A12 Bionic chip or better.
