'Apple Glass' could have movable display and use Fresnel lenses
Apple's AR headset or "Apple Glass" device may use a movable display to compensate for motion blur, while Fresnel lenses could help keep the weight of the headset down for users.
Future iPhone cameras that capture the direction light is traveling may provide greater 3D detail that could improve Apple AR walk-through experiences.
The long-rumored "Apple Glass" could display AR imagery that appears completely solid when overlaid on the view of the environment, by selectively blocking parts of the screen to stop light from passing through.
The rumored "Apple Glass" could take advantage of force-sensing gloves, with a collection of sensors used to detect hand movements such as typing on a virtual keyboard, as well as how the user grips physical objects.
Apple wants to avoid the discomfort a user feels when taking off an Apple AR headset like "Apple Glass" and going from darkness to the brightness of the real environment around them.
Apple's augmented reality headset could automatically account for the difference between a user's point of view and an attached camera, while peripheral lighting could make it easier for a user's eyes to adjust to wearing an AR or VR headset.
A questionable new supply chain report says that testing of "Apple Glass," or an Apple AR headset, has been delayed.
Rather than just passively displaying virtual Apple AR objects, "Apple Glass" may allow users to share 3D data and manipulate it in editing apps.
Apple AR could feature Spatial Audio, as already used in the HomePod, HomePod mini, and AirPods Pro, but also recorded on compact devices.
Apple's augmented reality plans could involve creating a 3D model of a user's fingers for interacting with virtual touchscreens, while the use of infrared optical markers may make it easier for AR systems to adjust what is seen on an iPhone or iPad's screen.
Apple is expected to use Fresnel lenses in its rumored augmented reality headset to increase the device's field of view and keep its weight below 150 grams, according to analyst Ming-Chi Kuo.
Apple's AR and VR systems could use wearable controllers with extending sections, allowing the system to sense a user's fingers pressing a surface without covering the fingertips.
Apple has sued its former materials lead, with the complaint alleging misappropriation of trade secrets that were then passed to an unnamed publication in exchange for favorable coverage of a startup.
Apple will make its first foray into augmented reality hardware in mid-2022, analyst Ming-Chi Kuo forecasts, which could be followed by the release of "Apple Glass" in 2025 and AR contact lenses by 2030.
Microsoft Mesh, making its debut at the annual Ignite conference, is an Azure-powered AR and VR platform that will one day allow users to collaborate on projects in real time using "holoportation," and will give developers tools to create other experiences.
Apple's long-rumored AR and VR efforts, including "Apple Glass," could use smart gloves for interacting with the system, using magnetic fields to determine where a user's individual fingers are located.
Apple is developing a system that could generate mixed reality or virtual reality environments from flat video content, letting users walk through and explore clips that would otherwise only be two-dimensional.
Analyzing what a user is looking at may help "Apple Glass" or an AR headset improve its video processing performance, by using the gaze to prioritize elements of a camera feed for analysis.
An Apple VR or AR headset may be able to automatically adjust the lenses placed in front of the user's eyes, using fluid to deform the shape of each lens to improve the user's eyesight.
Apple's augmented reality headset or a future iPhone could use light from a display to track the movement of nearly any surface, while finger-worn devices may be able to provide the system with details about what kind of objects a user is touching.