Apple "leaked" strings of code, entire frameworks and "read me" files related to its widely-rumored augmented reality headset with Tuesday's iOS 13 Gold Master release and Xcode beta, revealing a few details about the project that was reportedly put on ice earlier this year.
Uncovered by the ever-intrepid developer Steve Troughton-Smith, the AR headset information is openly listed in files and complete folders issued with the iOS 13 GM.
Dubbed "StarBoard," the system shell describes an AR headset or similar apparatus capable of running stereoscopic, or stereo, content. According to Troughton-Smith, code strings point to multiple ways of accessing AR content, including wearing or holding an "HME" device, as well as potentially connecting a specialized gamepad for use with the wearable.
StarBoard appears to operate in a manner similar to CarPlay, in that content is offloaded to an external display. While exact details of its operation are unknown, Troughton-Smith guesses that a "dashboard" of compatible stereo AR apps will appear when the presumed headset is connected to an iPhone or similar computing device.
"Very curious to see whether the headset or the iPhone runs the StarBoard shell itself, but seems very much like the iPhone does the rendering," Troughton-Smith said in a tweet.
More interesting is a "read me" file that runs through procedures for testing stereo AR apps on iPhone, a contingency apparently made for employees who do not have access to the headset. The file notes that internal iOS builds come with a "STARTester" app that can switch its displayable content in and out of HME mode when used with another asset called "starboardct1."
Setting the system state to a "worn" HME configuration, for example a prototype dubbed "Garta," applies distortions associated with that device. Setting the state back to "held" returns the app to "normal" mode, presumably typical 2D rendering.
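To make that flow concrete, here is a guessed Swift model of the worn/held toggle. Every name in it, HMEState, StereoTestHarness and the rest, is invented for illustration; the leaked files reveal the states and the distortion behavior, not the underlying API.

```swift
// Nothing here is Apple API: this is a guessed model of the "worn"/"held"
// switch the read me describes, with all type and case names invented.
enum HMEState {
    case worn(device: String)  // e.g. the prototype dubbed "Garta"
    case held                  // back to flat, 2D rendering
}

final class StereoTestHarness {
    private(set) var state: HMEState = .held

    func setState(_ newState: HMEState) {
        state = newState
        switch newState {
        case .worn(let device):
            // Per the read me, a "worn" state applies the lens-distortion
            // profile associated with the named prototype.
            print("Applying distortion profile for \(device)")
        case .held:
            // "Held" returns the app to normal mode, presumably plain 2D.
            print("Rendering in standard 2D mode")
        }
    }
}

// Mimic the documented testing flow: into HME mode, then back out.
let harness = StereoTestHarness()
harness.setState(.worn(device: "Garta"))
harness.setState(.held)
```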
Apple has for years been rumored to be working on an AR headset under the codename "T288." Previous reports claimed the company tested a variety of prototype hardware, including 8K eyepieces and WiGig 2.0 connectivity, along with a slew of onboard sensors feeding data to a separate processing unit.
Analyst estimates pegged a launch for 2020, but reports in July claimed Apple had put the project on ice. Judging by today's revelations, however, work on the initiative is ongoing.