Apple is continuing its work on an Apple Pencil with a color status display in the tip, and on a head-mounted augmented reality display that can dim or brighten real-world objects to draw the user's attention.
Apple's rumored augmented reality headset may use a holographic combiner to project light onto the wearer's eyes, while an eye-tracking system that monitors the user's gaze could provide more information about an environment based on where the user is actively looking.
Apple "leaked" strings of code, entire frameworks and "read me" files related to its widely-rumored augmented reality headset with Tuesday's iOS 13 Gold Master release and Xcode beta, revealing a few details about the project that was reportedly put on ice earlier this year.
References to stereo augmented reality apps, as well as the codename "Garta," have been spotted within Apple's internal betas of iOS 13 — following and apparently refuting previous reports that Apple had abandoned the concept.
Apple is working on more ideas it could implement in its long-rumored virtual reality headset or augmented reality smart glasses, including a display technology using lasers to create an image in the user's eye, along with magnet-laden gloves to give users finger-level control in the virtual world.
Apple's long-rumored augmented reality or virtual reality headset could offer eye tracking as part of its control system, with the iPhone maker coming up with a way to make such a gaze-tracking system as accurate as possible, even when the headset is in motion.
A future version of the Maps app could provide more intuitive navigational instructions to drivers by using augmented reality to superimpose the route on top of a live view of the road ahead, indicating which lanes a driver should use.
Augmented reality could offer some benefits in medical applications, Apple believes, suggesting a system where a real-world image is warped so a user of AR glasses can see everything in front of them, even if part of their vision is obscured by partial blindness.
In the latest iOS 13 beta, Apple is leaning into its augmented reality prowess to fix a common eye contact issue experienced during FaceTime calls. AppleInsider goes hands-on and dives into the new feature to find out how it works.
VR and AR headsets could offer high refresh rates in their screens, Apple proposes, by using a 'foveated display' and an eye-tracking system that monitors the user's gaze, optimizing rendering to focus only where the user is actively looking on a display.
Apple is looking into ways to display pre-rendered 3D video in a stereoscopic way, a system that would be beneficial for AR glasses or a VR headset to provide a more believable view of a virtual object by creating a specific view for each eye.
Rumors about an Apple-produced augmented reality headset or smart glasses have circulated for quite a few years, but the iPhone maker has yet to officially confirm the hardware is in development. The additions made to ARKit in iOS 13, as well as a plethora of patents and other reports, indicate Apple is closer to revealing its work than ever before.