
Apple's ARKit 4 anchors 3D reality into real-world Maps locations

In ARKit 4, virtual objects can be anchored to real-world locations

With ARKit 4, Apple is building the foundation for a virtual world of animated, interactive 3D "reality" explorable by anyone with a newer iPhone in their pocket.

At WWDC20, Apple outlined futuristic new features coming to ARKit 4. Previous releases of the company's augmented reality platform first enabled basic 3D graphics to appear fixed in place within the camera's video stream, starting in iOS 11, using Visual Inertial Odometry. This let a user explore a virtual object from all sides or play an interactive game fixed via AR onto a table surface.

ARKit 2 in 2018's iOS 12 introduced shared AR worlds, where two users could see different views of the same virtual scene, enabling multiuser gameplay in augmented reality apps. Last year's ARKit 3 introduced motion capture and people occlusion, which allowed virtual objects to move in front of and behind people, while understanding how a person was positioned in front of the camera.

Apple's investments in a collection of AR and related technologies use computer vision, machine learning, and motion detection to build increasingly sophisticated, interactive worlds of computer graphics fixed in place within a camera view. That work has ended up being far more successful and impactful than competing efforts to sell smartphone VR over the last several years.

Location Anchors

ARKit 4 in the freshly announced iOS 14 adds Location Anchors, which can fix an AR model at a specific location in the real world, defined by latitude, longitude, and elevation. This could be used to present virtual artwork, as Apple demonstrated with a KAWS installation positioned at the Ferry Building in San Francisco. It could also be used to position labels that are fixed in space at a specific location, or to build entire AR experiences at a given spot.
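For developers, a location anchor is created from a coordinate and handed to a geo-tracking session. The sketch below shows roughly what that looks like; the coordinates, altitude, and sphere content are placeholders for illustration, not Apple's demo code.

```swift
import ARKit
import RealityKit
import CoreLocation

// Pin virtual content to a real-world coordinate using ARKit 4's location anchors.
// The coordinate (roughly the Ferry Building in San Francisco) and the sphere are
// placeholders for illustration.
func placeGeoAnchoredContent(in arView: ARView) {
    // Run a geo-tracking session instead of plain world tracking.
    arView.session.run(ARGeoTrackingConfiguration())

    // Latitude, longitude, and an altitude in meters define the anchor.
    let coordinate = CLLocationCoordinate2D(latitude: 37.7955, longitude: -122.3937)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 15)
    arView.session.add(anchor: geoAnchor)

    // Attach RealityKit content so it renders at that location and elevation.
    let anchorEntity = AnchorEntity(anchor: geoAnchor)
    anchorEntity.addChild(ModelEntity(mesh: .generateSphere(radius: 0.5)))
    arView.scene.addAnchor(anchorEntity)
}
```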

To anchor a location more accurately than GPS can on its own, Apple demonstrated ARKit 4 using visual localization, which uses machine learning to match landmarks seen by the camera against a localization map downloaded from Apple Maps for the current location. This appears to be data collected by Apple Maps vehicles to build out the explorable Look Around feature in Maps for major cities.
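Because that localization imagery only exists for certain cities, ARKit exposes an availability check that apps can call before starting a geo-tracked session. A minimal sketch:

```swift
import ARKit

// Geo tracking only works where Apple has collected localization imagery, so an
// app should confirm availability at the user's current location first.
func checkGeoTrackingAvailability() {
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        if available {
            // Safe to run an ARGeoTrackingConfiguration session here.
        } else {
            print("Geo tracking unavailable: \(error?.localizedDescription ?? "unsupported area")")
        }
    }
}
```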

The location matching process is done locally on the phone, with no private information sent back to Apple's servers. This impressive intersection of technologies could further be paired with other features Apple has been detailing at WWDC20, such as App Clips to provide instant functionality, including payment using Apple Pay or a user login using "Sign in with Apple."

Future applications of these technologies offer obvious use cases for wearable AR, including the rumored "Apple Glass" that could navigate points of interest and allow a user to interact with anchored AR experiences to find more information or handle a variety of sophisticated app transactions.

Depth API and LiDAR

In addition to anchoring an AR experience to a fixed point in the real world, ARKit 4 also provides advanced scene understanding capabilities with a new Depth API. This enables it to use the LiDAR scanner on the newest iPad Pro, which is rumored to appear on an upcoming iPhone 12 model, to rapidly capture a detailed mesh of depth information about the surrounding environment.
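In code, the Depth API is exposed as a new frame semantic, with every camera frame carrying per-pixel depth and confidence buffers. A minimal sketch, assuming a LiDAR-equipped device:

```swift
import ARKit

// A sketch of opting into ARKit 4's scene depth on devices with the LiDAR scanner.
final class DepthCapture: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only supported on LiDAR-equipped hardware.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame now carries per-pixel depth and confidence maps.
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = sceneDepth.depthMap          // distances in meters
        let confidence: CVPixelBuffer? = sceneDepth.confidenceMap  // per-pixel confidence
        _ = (depthMap, confidence)  // feed into occlusion or custom scene logic
    }
}
```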

Rather than scanning the scene with the camera first, LiDAR enables immediate placement of virtual objects in an AR experience, such as a game that interacts with real-world objects in the room.

The new Scene Geometry API can create a topological map of the environment, which can be used together with semantic classification to identify physical objects, distinguish between the floor, walls, and other surfaces, and understand how objects in a scene are arranged in depth.
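A rough sketch of how an app might opt into the classified scene mesh; the logging is illustrative only:

```swift
import ARKit

// A sketch of scene reconstruction with semantic classification on LiDAR devices.
final class SceneMeshObserver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.sceneReconstruction = .meshWithClassification
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // The environment arrives as ARMeshAnchor pieces; each face of the mesh can
        // be classified as floor, wall, ceiling, table, seat, and so on.
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            print("Mesh with \(geometry.faces.count) faces; classified: \(geometry.classification != nil)")
        }
    }
}
```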

ARKit 4 can then place virtual objects in front of or behind people or other identified objects in the scene; use game physics to simulate realistic interactions between virtual and physical objects; and apply realistic lighting and raycasting to blur the line between what's real and the digital content augmenting reality.
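In RealityKit, those behaviors are opt-in scene-understanding options, and raycasting against the reconstructed geometry is how an app places content on real surfaces. A hedged sketch, assuming an existing, running ARView:

```swift
import ARKit
import RealityKit

// Turn on occlusion and physics against the reconstructed mesh, then raycast from
// the center of the screen to drop a small box onto a real horizontal surface.
func placeBoxOnRealSurface(in arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)

    let screenCenter = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    guard let query = arView.makeRaycastQuery(from: screenCenter,
                                              allowing: .estimatedPlane,
                                              alignment: .horizontal),
          let result = arView.session.raycast(query).first else { return }

    // Anchor a placeholder box where the ray hit the real world.
    let anchor = AnchorEntity(world: result.worldTransform)
    anchor.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
    arView.scene.addAnchor(anchor)
}
```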

Face and hand tracking

A third major advancement in ARKit 4 expands face tracking beyond devices equipped with a TrueDepth camera to the new iPhone SE and other products with at least an A12 Bionic processor.

Face tracking captures face anchors and geometry to allow graphics to be applied to the user's face, either to create a Memoji-like avatar that captures the user's expressions to animate a virtual character, or to apply virtual makeup or other lighting effects similar to Portrait Lighting in the iOS camera.
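A minimal sketch of a face-tracking session driving an avatar from blend shapes; the smile coefficient shown is just one of the dozens ARKit reports:

```swift
import ARKit

// A sketch of face tracking, which ARKit 4 extends to any device with an A12
// Bionic or later, even without a TrueDepth camera (such as the 2020 iPhone SE).
final class FaceTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        // ARFaceAnchor exposes face geometry plus blend-shape coefficients.
        for case let faceAnchor as ARFaceAnchor in anchors {
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            _ = smile  // map coefficients like this onto a Memoji-style character rig
        }
    }
}
```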

Apple has also added hand tracking to its Vision framework, enabling an iOS device to recognize not just full-body movements but the individual poses of a hand's fingers. One demonstration showed how a user could spell out words in the air, simply by having the camera watch and identify precise hand movements.

Vision framework hand capture
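In the Vision framework, this capability is exposed through a hand-pose request whose observations report the positions of individual finger joints. A brief sketch run on a still image; the confidence threshold is an arbitrary placeholder:

```swift
import Vision
import CoreGraphics

// Detect hand poses in an image and read out two fingertip joints.
func detectHandPose(in image: CGImage) throws {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let observations = request.results as? [VNHumanHandPoseObservation] else { return }
    for observation in observations {
        // Individual joints are addressed by name, such as the index and thumb tips.
        let indexTip = try observation.recognizedPoint(.indexTip)
        let thumbTip = try observation.recognizedPoint(.thumbTip)
        if indexTip.confidence > 0.3 && thumbTip.confidence > 0.3 {
            print("Index tip:", indexTip.location, "Thumb tip:", thumbTip.location)
        }
    }
}
```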

Reality tools for building AR experiences

Apple is also providing a new Reality Converter app for bringing 3D models developed in outside digital content creation tools into the usdz file format used by ARKit, as well as an update to Reality Composer for building and testing AR experiences and exporting them in the portable usdz format.

RealityKit also adds support for applying a video texture within an ARKit scene, such as a virtual television placed on a wall, complete with realism attributes such as light emission, alpha, and texture roughness.
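A short sketch of how such a video texture might be applied in RealityKit; the plane dimensions and video URL are placeholders:

```swift
import RealityKit
import AVFoundation

// Build a "virtual TV": a plane whose material plays an AVPlayer's video output.
func makeVirtualTV(playing url: URL) -> ModelEntity {
    let player = AVPlayer(url: url)
    let screen = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                             materials: [VideoMaterial(avPlayer: player)])
    player.play()
    return screen  // attach to an anchor on a vertical plane to hang it on a wall
}
```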

The work Apple is doing in AR overlaps existing work others have started, notably Google. But one advantage dramatically supporting Apple's efforts in AR is the company's vast installed base of high-end, sophisticated devices with depth-sensing hardware, such as the TrueDepth camera on iPhone X and later and the new LiDAR scanner on the latest iPad Pro.

Apple launched ARKit three years ago and immediately became the world's largest AR platform, meaning that developers have a variety of opportunities for building experiences that large numbers of real-world users can reach. The company is still just getting started, and we can expect it to increasingly deploy new technologies that extend ARKit features in new directions, potentially including wearable glasses and vehicle windshields in addition to its current installed base of iOS and iPadOS handheld mobile devices.