Apple reveals ARKit 3 with motion tracking of people for more convincing, walk-around AR scenes

The new ARKit 3 features replace hours of hand compositing with real-time calculations, making it quick to build interactive environments and to place people and virtual objects within the same scene.

"It's a huge year for ARKit," said Craig Federighi during the 2019 WWDC keynote. He teased that there would be more to say during the rest of the week-long conference, but spent time now showing off "photorealistic rendering, camera motion blur and noise," and more.

"Now, how do you actually model your 3D content?" he asked the audience. "That's where Reality Composer comes in. Reality Composer is a new app featuring drag and drop and a library of high-quality objects and animations, making it incredibly fast and easy to build an interactive environment."

He demonstrated how developers could use Xcode on the Mac to produce the augmented reality content. "But it's also on iOS," he said, "so you can edit, test and tune your app on the device where it will ultimately be delivered."
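Content built in Reality Composer is typically loaded at runtime through RealityKit, the rendering framework Apple introduced alongside ARKit 3. As a minimal sketch, the snippet below loads a bundled Reality Composer anchor into an `ARView`; the asset name "MyScene" is hypothetical, standing in for whatever a real project names its scene.

```swift
import RealityKit

// Minimal sketch: loading a Reality Composer anchor into a RealityKit view.
// "MyScene" is a hypothetical asset name; a real app would use the name of
// the anchor in its own .rcproject or .reality file.
func addComposedScene(to arView: ARView) {
    do {
        // Entity.loadAnchor(named:) loads a Reality Composer anchor bundled
        // with the app, including its objects and animations.
        let sceneAnchor = try Entity.loadAnchor(named: "MyScene")
        arView.scene.addAnchor(sceneAnchor)
    } catch {
        print("Failed to load Reality Composer scene: \(error)")
    }
}
```

Because the same project file opens in Reality Composer on both Mac and iPhone, this is the round trip Federighi described: compose on either device, then load and tune the result in the running app.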

You can now position objects in front of and behind people in an AR scene

The most impressive new feature of ARKit 3 is People Occlusion.

"Now, this is insane," said Federighi. "What used to require painstaking compositing by hand can now be done in real time."

The feature lets you place virtual objects into a scene and have people walk around them. "By knowing where these people are in a scene, you can see we can layer virtual content in front and behind," said Federighi.
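In the ARKit 3 API, People Occlusion is opt-in: an app adds a person-segmentation frame semantic to its session configuration, after checking that the device supports it (the feature requires Apple's newer A12-class hardware). A minimal sketch:

```swift
import ARKit

// Minimal sketch of opting into People Occlusion in ARKit 3.
func makeOcclusionConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    // Not all devices can run the segmentation model, so check first.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        // Asks ARKit to segment people in the camera feed and estimate their
        // depth, so virtual content renders in front of or behind them.
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }
    return config
}
```

With that semantic enabled, the renderer receives a per-frame person mask and depth estimate, which is what lets virtual objects be layered correctly as people walk around them.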

AppleInsider will be reporting live throughout WWDC 2019, starting with the keynote on Monday, June 3. Get every announcement as it happens by downloading the AppleInsider app for iOS, and by making sure to follow us on YouTube, Twitter @appleinsider, Facebook and Instagram.