The new ARKit 3 features replace hours of hand-done frame compositing with real-time calculations, making it quick to build interactive AR environments and to place people and virtual objects within the same scene.
"It's a huge year for ARKit," said Craig Federighi during the 2019 WWDC keynote. He teased that there would be more to say during the rest of the week-long conference, but spent time now showing off "photorealistic rendering, camera motion blur and noise," and more.
"Now, how do you actually model your 3D content?" he asked the audience. "That's where Reality Composer comes in. Reality Composer is a new app featuring drag and drop and a live of high-quality objects and animations, making it incredibly fast and easy to build an interactive environment."
He demonstrated how developers could use Xcode on the Mac to produce the augmented reality content. "But it's also on iOS," he said, "so you can edit, test and tune your app on the device where it will ultimately be delivered."
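In practice, scenes authored in Reality Composer are pulled into an app through RealityKit. A minimal sketch of that workflow is below; it assumes a Reality Composer file named Experience.rcproject containing a scene called "Box", for which Xcode generates an `Experience` type with a `loadBox()` loader. Those names are illustrative, not from the keynote.

```swift
import UIKit
import RealityKit

class ARExperienceViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the scene authored in Reality Composer (assumed project/scene
        // names: Experience / Box) and anchor it in the AR view.
        if let boxAnchor = try? Experience.loadBox() {
            arView.scene.addAnchor(boxAnchor)
        }
    }
}
```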
The most impressive new feature of ARKit 3 is People Occlusion.
"Now, this is insane," said Federighi. "What used to require painstaking compositing by hand, can now be done in real time."
The feature lets you place virtual objects into a scene and have people walk around them. "By knowing where these people are in a scene, you can see we can layer virtual content in front and behind," said Federighi.
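For developers, opting into People Occlusion is a matter of requesting the new frame semantics on an ARKit session. The sketch below assumes a RealityKit `ARView` running on an iOS 13 device with supported hardware.

```swift
import ARKit
import RealityKit

func enablePeopleOcclusion(on arView: ARView) {
    // People Occlusion needs hardware support, so check before opting in.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
        print("People Occlusion is not supported on this device.")
        return
    }

    let configuration = ARWorldTrackingConfiguration()

    // Ask ARKit to segment people and estimate their depth every frame,
    // so virtual content can be layered in front of and behind them.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
    arView.session.run(configuration)
}
```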