Early in the Apple Vision Pro development cycle, the company toyed with bespoke VR controllers for the headset, but ultimately decided that eye and finger tracking with cameras was the way to go.
At some point in early development, Apple reportedly considered a device like a smart ring to control the Apple Vision Pro. According to the latest newsletter from Mark Gurman, the company also tested third-party VR controllers, including at least some from HTC.
And, the company apparently has no intention of developing a VR controller of its own. That distaste also extends to a lack of support for third-party VR controllers, at least for now.
However, the headset does work with game controllers from Microsoft and Sony that are intended for consoles. Apple's macOS and iOS currently support a wide array of controllers, but it's not yet clear whether that full range will work on the Vision Pro.
Apple has an in-air keyboard for the Vision Pro, of course. The headset will also support a Bluetooth one, or one connected to a Mac.
Apple's work on a "smart ring" may have been a predecessor to Apple Vision Pro
Apple has been working on smart ring technology for some time, researching both the ring itself and accessories for it.
Most recently, a newly-granted patent, "Skin-To-Skin Contact Detection," covers multiple ways of detecting "contact or movement gestures between a first body part and a second body part," such as snapping fingers, or the gestures shown in the WWDC keynote that revealed the Apple Vision Pro. Some of the options seem more relevant to Apple Watch bands, or even just a person's hands interlocking, but they all come down to a combination of skin and gesture detection.
One of the example illustrations shows how a person's hand position changes when they press their finger and thumb together, for instance. A second illustration shows when "the index finger is now making contact with the thumb."
Apple's proposed system for that smart ring, and likely the Apple Vision Pro too, will recognize the touch, and also generate "a sense output signal when the index finger and thumb make and break contact."
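For illustration, here is a minimal Swift sketch of how make-and-break contact detection between an index fingertip and thumb tip could work, given fingertip positions from some hand-tracking source. The `PinchDetector` type, its distance thresholds, and the `ContactEvent` cases are assumptions made for this example; they are not taken from Apple's patent or from the visionOS APIs.

```swift
import simd

/// Illustrative event type: emitted when the fingertips come together or separate,
/// loosely modeled on the patent's "sense output signal" for make and break contact.
enum ContactEvent {
    case made   // fingertips came together
    case broken // fingertips separated
}

/// Minimal sketch of a pinch detector. Thresholds are illustrative assumptions.
struct PinchDetector {
    // Hysteresis: contact is "made" below 1 cm and "broken" above 2 cm,
    // so tracking jitter near the boundary doesn't cause flickering events.
    var makeThreshold: Float = 0.010   // meters
    var breakThreshold: Float = 0.020  // meters
    var inContact = false              // current contact state

    /// Feed one frame of fingertip positions (in any shared coordinate space,
    /// e.g. world space from a hand-tracking provider).
    mutating func update(indexTip: SIMD3<Float>, thumbTip: SIMD3<Float>) -> ContactEvent? {
        let distance = simd_distance(indexTip, thumbTip)
        if !inContact && distance < makeThreshold {
            inContact = true
            return .made
        }
        if inContact && distance > breakThreshold {
            inContact = false
            return .broken
        }
        return nil
    }
}

// Example: drive the detector with a few hypothetical per-frame samples.
var detector = PinchDetector()
let samples: [(SIMD3<Float>, SIMD3<Float>)] = [
    (SIMD3(0.00, 0.00, 0.00), SIMD3(0.050, 0.00, 0.00)),  // apart
    (SIMD3(0.00, 0.00, 0.00), SIMD3(0.008, 0.00, 0.00)),  // touching
    (SIMD3(0.00, 0.00, 0.00), SIMD3(0.030, 0.00, 0.00)),  // apart again
]
for (index, thumb) in samples {
    if let event = detector.update(indexTip: index, thumbTip: thumb) {
        print(event) // prints "made", then "broken"
    }
}
```

The two thresholds create a small hysteresis band, so a fingertip hovering right at the contact distance doesn't generate a rapid stream of spurious make and break signals.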
Other examples of Apple's research on gesture detection show a user pressing a finger of one hand into the palm of the other, which in theory doesn't necessitate a physical controller.
Just like with the Apple Pencil and the iPad, I don't expect an auxiliary device will be the default way to use the Vision Pro, but I do expect it to exist. There are simply too many use cases where even the most advanced finger tracking can't do everything one might need in a VR or AR environment.
The eye+gesture control appears to be one of the best things about Vision Pro, but I can see something like wireless haptic gloves with small haptic clickers built into the fingertips coming out eventually, either from Apple or a third party.
I’ll take another run at this, expanding a bit on a previous post.
Think about sound-activated switches, sip-and-puff switches, proximity and capacitance switches (you may have used one the last time you got on an elevator), voice-powered switches, and coming up, EEG switches. The interesting thing here is that *any* single switch can be linked to any device through a universal port (there’s at least one on the device that you’re using to read this right now). Apple in particular has added dozens of accessibility and switching features to its standard iOS and OS systems. Adding that capacity to the Vision Pro should be a cinch.
The Star Raft Project