Meta is changing the way we interact with virtual reality by bringing hand tracking to its Quest VR headsets. Imagine tapping, scrolling, and swiping through virtual objects with just a flick of your fingers in the air. It’s like having the power of a smartphone at your fingertips, quite literally! No more clunky controllers, just intuitive gestures that make you feel like a digital magician. Get ready to dive into a whole new dimension of immersive experiences with Meta’s hand-tracking technology.
Get ready for a deep dive into the world of hand tracking on the Meta Quest 2! After weeks of anticipation, the update finally arrived, and I wasted no time enabling the experimental feature called “Direct Touch.” Join me as I explore what this technology can do in the Quest v50 software.
How Does Hand Tracking Work?
Hand tracking works through the inside-out cameras on the Meta Quest headset. The headset detects the position of your hands, their orientation, and the arrangement of your fingers, then uses computer-vision algorithms to recognise your hands and track them as they move.
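To make that concrete, here is a toy sketch of the kind of data such a pipeline produces. This is not Meta's actual code; the 21-landmark hand layout is just a common convention in open computer-vision hand trackers, and the names here are mine:

```python
# Hypothetical representation of one tracked hand as 21 3D landmarks,
# a layout commonly used by open hand-tracking libraries (NOT Meta's API).

from dataclasses import dataclass


@dataclass
class Landmark:
    x: float  # headset-relative coordinates, metres
    y: float
    z: float


def palm_center(landmarks: list[Landmark]) -> Landmark:
    """Estimate the palm position by averaging the wrist and finger-base points."""
    # Indices 0 (wrist) and 5, 9, 13, 17 (finger bases) in the common layout.
    base = [landmarks[i] for i in (0, 5, 9, 13, 17)]
    n = len(base)
    return Landmark(
        sum(p.x for p in base) / n,
        sum(p.y for p in base) / n,
        sum(p.z for p in base) / n,
    )
```

A downstream gesture recogniser would consume landmarks like these every frame to work out what your fingers are doing.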
When hand tracking is activated, the Quest 2 follows your hands with its outward-facing cameras, and within the headset you’ll see them as shadowy, dark hands. (CEO Mark Zuckerberg’s Direct Touch video appears to have been shot on a Quest Pro and includes additional hand and arm detail.)
Subtle shadows help you anticipate when your hand is about to make contact with a menu or window. With Direct Touch, the moment you touch an item it responds, scrolling or lighting up under your finger. And while the scrolling can be a little quirky, it’s more responsive than you might expect.
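A minimal sketch of how that hover-then-touch behaviour could work, assuming a flat UI panel at a known depth. The names and distance thresholds here are made up for illustration, not taken from Meta's software:

```python
# Hypothetical hover/touch classification for a flat UI panel facing the user.
# Distances are in metres; the thresholds are illustrative guesses.

HOVER_DISTANCE = 0.05   # within 5 cm of the panel: show the shadow cue
TOUCH_DISTANCE = 0.005  # within 5 mm (or past the panel): register a touch


def panel_state(fingertip_z: float, panel_z: float) -> str:
    """Classify the fingertip as 'far', 'hover', or 'touch' relative to the panel."""
    gap = fingertip_z - panel_z
    if gap <= TOUCH_DISTANCE:  # also covers a finger pushed past the panel
        return "touch"
    if gap <= HOVER_DISTANCE:
        return "hover"
    return "far"
```

Running this every frame on the index fingertip would give you the far/shadow/touch progression the headset displays as your hand approaches a window.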
What About Typing With Direct Touch?
Direct-touch typing, though, is an absolute nightmare. Here’s how it works: tap a text field in the user interface and the Quest’s onscreen keyboard appears underneath the window, letting you “press” individual keys to type.
It’s difficult to know where or what you’re actually typing, though, because there’s nowhere for your hands or fingers to rest. (Think of an iPad’s onscreen keyboard, but with no glass and no feedback at all.) When I resort to VR hunt-and-peck to compose even a single word, the UI often registers a different key than the one I meant to press.
How to Turn On or Off Hand Tracking
- To access your universal menu, press the “Quest Button” on your right Touch or Touch Pro controller.
- Hover your cursor over the clock on the left side of the universal menu, then select Quick Settings to open the Quick Settings panel.
- Select Settings in the top right corner.
- From the left menu, choose Device, then pick Hands and Controllers.
- Select the toggle next to Hand Tracking to turn it on or off.
You can also turn automatic hand-tracking switching on or off from here. With this setting enabled, your headset automatically switches between your hands and the controllers whenever you put the controllers down or pick them up.
If this setting is off, you must enable and disable hand tracking manually.
Hand Tracking Gestures You Can Use
- Point and Pinch: Selects an item
Point your hand at what you want to select; when the cursor appears, pinch your thumb and index finger together to confirm.
- Pinch and Scroll: Scrolls up, down, left, or right
Pinch your fingers together, then move your palm up, down, left, or right while holding the pinch. Release when you’ve finished scrolling.
- Palm Pinch: Brings you back to your Meta Home menu
Look at your palm at eye level, then hold your thumb and index finger together until the space between them fills in, and release.
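All of the gestures above ultimately come down to watching the distance between your thumb and index fingertips. Here is a rough sketch of how a pinch detector might work, with made-up thresholds and a bit of hysteresis so the pinch doesn't flicker on and off at the boundary (again, an illustration, not Meta's implementation):

```python
import math

# Illustrative thresholds in metres; a real tracker would tune these per user.
PINCH_START = 0.02  # fingertips closer than 2 cm: pinch begins
PINCH_END = 0.04    # fingertips farther than 4 cm: pinch ends


class PinchDetector:
    """Detects a thumb-index pinch, with hysteresis to avoid flicker."""

    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip) -> bool:
        """Feed one frame of fingertip positions; returns current pinch state."""
        d = math.dist(thumb_tip, index_tip)
        if self.pinching:
            if d > PINCH_END:
                self.pinching = False
        elif d < PINCH_START:
            self.pinching = True
        return self.pinching
```

The gap between the start and end thresholds is what keeps a pinch held steady even when the tracked fingertip positions jitter by a centimetre or so.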
Direct Touch is an intriguing experiment, but let’s be honest: using it on the Quest for more than a few minutes gets frustrating. It’s often hard to tell whether your hand is actually touching those virtual elements in the air, and holding your arms up to navigate the user interface is tiring. That said, Meta’s other controller-free hand gestures, like pinching, are pretty darn reliable, even if they’re less obvious.
Having said all of that, I still think the concept of Direct Touch is really cool. Even though my words per minute drop by 99 percent and few of my taps register as I expect, scrolling and tapping on imaginary surfaces in my VR headset makes me feel like I’m living out some sort of sci-fi dream.
When Direct Touch works as intended, using my hands instead of the Quest’s controllers is also much more practical. I realise that’s a big asterisk, but being able to put on the headset and navigate with nothing but my hands takes a lot of the friction out of donning the Quest.
Furthermore, it’s easy to see where this technology can lead, particularly if Meta’s still-future AR glasses truly materialise. You probably won’t want to bring a controller or two with you while wearing those glasses outside, as you could just use your hands instead.
Additionally, Meta’s devices might not be the only ones we interact with using hands in the air; Apple appears to be investigating similar interactions for its long-rumoured mixed reality headset, which could also let users type on onscreen keyboards.
I’m going to keep primarily using the Quest’s controllers for the time being. But if I just need to quickly check something on my headset, I might put the controllers down and try to do it with my hands. Even though it can take three times as long, it’s much cooler.