Apple accounts for new ARKit 1.5 features in updated developer resources

Apple this week updated its Human Interface Guidelines for iOS with new entries covering augmented reality experiences, making adjustments and additions to best practices used by developers creating apps with ARKit 1.5.

Set for introduction in iOS 11.3, ARKit 1.5 delivers enhanced AR tools that allow developers to better place virtual objects in a scene. For example, the updated software can identify vertical surfaces, features improved mapping for irregularly shaped surfaces and provides a 50 percent increase in resolution when viewing real-world objects through a device's camera.
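
For developers, the headline change surfaces as a one-line configuration option. Below is a minimal sketch, assuming an ARSCNView named `sceneView` is already on screen, of opting into the new vertical detection alongside the horizontal detection ARKit has offered since 1.0:

```swift
import ARKit

// Requires iOS 11.3 or later, where ARKit 1.5's .vertical option is available.
let configuration = ARWorldTrackingConfiguration()

// ARKit 1.0 detected only horizontal planes; 1.5 adds vertical ones.
configuration.planeDetection = [.horizontal, .vertical]

// Assumes `sceneView` is an ARSCNView set up elsewhere in the app.
sceneView.session.run(configuration)
```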

To help developers get a handle on the new technologies, Apple tweaked its HIG for iOS to include mention of vertical surfaces.

In a section regarding on-screen indicators, Apple offers the example of a trapezoidal reticle that might help users understand they need to find a flat horizontal or vertical surface on which to place a virtual object. Previously, this entry mentioned only horizontal surfaces, as ARKit 1.0 was unable to detect vertical planes.

With ARKit now supporting both horizontal and vertical surfaces, Apple added a few particulars on 3D space orientation.

If the indicator's orientation follows the alignment of the detected surface, Apple notes, it can help people anticipate how the placed object will be aligned.
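
In practice, an app can get that alignment from ARKit's hit-testing. Here is a rough sketch, with hypothetical names `updateReticle` and `reticleNode`, that positions and orients an indicator against whichever detected plane lies under a screen point:

```swift
import ARKit
import SceneKit

// Hypothetical helper: snap a reticle node to the surface under a screen
// point so its orientation mirrors the detected plane's alignment.
func updateReticle(at screenPoint: CGPoint, in sceneView: ARSCNView, reticleNode: SCNNode) {
    // Hit-test only against planes ARKit has already detected.
    guard let result = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent).first,
          let planeAnchor = result.anchor as? ARPlaneAnchor else { return }

    // The hit-test transform carries the plane's orientation, so the
    // reticle lies flat on a table or stands upright against a wall.
    reticleNode.simdTransform = result.worldTransform

    // ARKit 1.5 reports which kind of surface was found.
    switch planeAnchor.alignment {
    case .horizontal: print("placing on a floor or tabletop")
    case .vertical:   print("placing on a wall")
    @unknown default: break
    }
}
```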

Related to the indicator guidance, Apple covers similar themes in user interaction with virtual objects. On the topic of rotation, the company's guidelines now suggest rotation should "generally occur" relative to the surface on which a virtual object rests, whether that surface is a horizontal or vertical plane.
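
One way to honor that guidance is sketched below, under the assumption that the object's local y-axis was aligned with the surface normal when it was placed, as an ARKit plane hit-test transform provides. The same call then works whether the object sits on a floor or hangs on a wall:

```swift
import SceneKit
import simd

// A sketch: rotate a placed object relative to its supporting surface.
// Assumes the node's local y-axis points out of that surface, which holds
// when the node's transform came from an ARKit plane hit-test result.
func rotate(_ objectNode: SCNNode, byRadians angle: Float) {
    objectNode.simdLocalRotate(by: simd_quatf(angle: angle,
                                              axis: simd_float3(0, 1, 0)))
}
```

A UIRotationGestureRecognizer's rotation value, applied incrementally each time the gesture updates, is a natural input for such a helper.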

ARKit 1.5 adds enhanced image recognition capabilities that can power advanced features like interactive posters or book covers. As such, the HIG now includes a section titled "Reacting to Imagery in the User's Environment."

For example, an app might recognize theater posters for a sci-fi film and then have virtual spaceships emerge from the posters and fly around the environment. Or, an app for a retail store could make a virtual character appear to emerge from a store's front door by recognizing posters placed on either side of the door.

Developers should provide reference images, including information on physical size, to optimize detection, Apple says.
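
Those reference images are typically loaded from an asset catalog resource group where each image's real-world size has been entered. A brief sketch, assuming a group named "AR Resources" exists in the app bundle:

```swift
import ARKit

// Load reference images whose physical sizes were specified in the asset
// catalog; ARKit uses those sizes to improve detection and placement.
guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                             bundle: nil) else {
    fatalError("Missing the expected AR resource group")
}

let configuration = ARWorldTrackingConfiguration()
configuration.detectionImages = referenceImages
// sceneView.session.run(configuration)  // assuming an ARSCNView named `sceneView`
```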

The image detection guidelines offer a peek at ARKit's capabilities and limitations. For example, ARKit does not track changes to the position or orientation of detected images, meaning precise placement of virtual assets could pose a problem. Further, ARKit performs best when the host app is searching for 25 or fewer distinct images in a given environment.
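
That static behavior shapes how an app should respond when an image is found: place content once, when the anchor first appears. A sketch of a delegate doing so, with illustrative names:

```swift
import ARKit
import SceneKit

// Hypothetical ARSCNView delegate: since ARKit 1.5 doesn't update a detected
// image's pose after the initial detection, attach content a single time
// when the ARImageAnchor is added.
final class ImageDetectionDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }

        // Size a highlight to the image's real-world dimensions.
        let size = imageAnchor.referenceImage.physicalSize
        let highlight = SCNNode(geometry: SCNPlane(width: size.width, height: size.height))

        // SCNPlane stands upright in local space; lay it over the image.
        highlight.eulerAngles.x = -.pi / 2
        node.addChildNode(highlight)
    }
}
```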

Apple also offers a few tips on virtual object handling, such as using onscreen indicators to guide users toward an offscreen asset.
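
A simple visibility test can drive such an indicator. One possible check, assuming an ARSCNView named `sceneView` and a placed `objectNode`:

```swift
import ARKit
import SceneKit

// Returns true while any part of the node's bounding volume is inside the
// camera's view frustum; when false, the app might show a directional arrow.
func objectIsOnScreen(_ objectNode: SCNNode, in sceneView: ARSCNView) -> Bool {
    guard let pointOfView = sceneView.pointOfView else { return false }
    return sceneView.isNode(objectNode, insideFrustumOf: pointOfView)
}
```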

Finally, a separate section covers potential interruptions to the AR experience, dissuading developers from pulling users out of an engaging session to perform simple tasks like changing the fabric of a virtual chair. The segment also discusses how ARKit handles restoring the position of objects after a session interruption, such as when a user switches between apps.
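
The hooks for this live in ARKit's session observer callbacks, including the relocalization method Apple added alongside ARKit 1.5 in iOS 11.3. A sketch with illustrative names:

```swift
import ARKit

// Hypothetical delegate showing the interruption-related callbacks.
final class SessionObserver: NSObject, ARSessionDelegate {
    func sessionWasInterrupted(_ session: ARSession) {
        // Tracking has paused, e.g. the user switched apps; dim or hide
        // virtual content and let the user know.
    }

    // New in iOS 11.3: return true to let ARKit attempt to restore its
    // world-tracking state, so placed objects can snap back into position.
    func sessionShouldAttemptRelocalization(_ session: ARSession) -> Bool {
        return true
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // If relocalization isn't possible, the app may need to invite
        // the user to reposition content.
    }
}
```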

Users of iOS devices can expect to take advantage of the first ARKit 1.5 apps when iOS 11.3 sees release in the coming weeks.


