How Oculus Plans to Fix One of VR's Biggest Problems

Oculus Touch’s two handheld controllers let users feel vibrations that correspond to their actions in a virtual world.
Oculus

You’re in an unfamiliar world. You turn right and see lush vegetation. Turn left, and you spy an adorable woodland creature. You reach out to pet it—and nothing happens. Your arms stay limp at your sides. You wonder if you have arms at all. Then you remember that the world is virtual, and the headset that brought you there is strictly look, don’t touch.

It’s a familiar limitation of head-mounted virtual-reality displays like the Oculus Rift: they can fool your eyes but leave the rest of your body stuck on the boring side of the looking glass. It’s a serious barrier to a truly immersive experience, and one that Oculus will attempt to solve with a new accessory called Oculus Touch.

Oculus Touch, announced Thursday alongside the consumer-ready version of the Rift headset, comprises two handheld controllers—think double-fisted Wii remotes as imagined by Darth Vader. Each provides haptic feedback, allowing users to feel vibrations that correspond to their actions in a virtual world (think pulling the trigger on a gun), along with six-degree-of-freedom tracking—the same type used in the Rift itself—and a matrix of sensors that recognize poses like pointing and giving a thumbs-up. Presumably, this means you’ll be able to give invading alien hordes not just a phaser blast but a middle finger.

“I’ve seen thousands of people inside the Rift now,” Oculus founder Palmer Luckey said during the Oculus Touch reveal. “And one of the first things people do is they reach out into this virtual world. It’s something entirely new to them. It’s the natural reaction to something they’ve never experienced before.”

Now, at least, when they reach out, they’ll have something to grab.

In practice, Oculus Touch means users will be able to interact more naturally with the virtual world instead of passively observing it, or actively embracing the artifice of button mashing and directional stick twiddling that comes with a traditional single-controller setup. Pick up a gun, fire the gun, throw the gun aside; Oculus Touch wants those actions to require the same reaches and squeezes they would if that gun were made of steel instead of pixels.

That’s important if you want to push virtual-reality gaming beyond the single-controller inputs that currently dominate, says Gartner tech analyst Brian Blau. “When you think of what VR could be, how to make it really useful, how to make the experience that you’re having optimized, the two-handed interface is more appropriate,” Blau says. “It means you can be more expressive inside the virtual world. That is going to be really appropriate for a lot of VR use cases, games and otherwise.”

Blau notes that Touch isn’t the first two-handed virtual-reality input system; most recently, HTC’s Vive prototype headset arrived with a pair of wands. Based on Luckey’s presentation, though, Oculus Touch appears to offer far more features than any similar solution to date. More important, it will actually ship to consumers sometime in the first half of next year instead of languishing in development limbo.

When it does ship, it should introduce people to a sense of virtual presence that few have experienced. “There is a special place in virtual reality—we call it Near-Field VR,” says Mark Bolas, director for mixed reality research at USC’s Institute for Creative Technologies. “It is the place that is within arm’s reach of a user, and it is magical, as it provides the best stereoscopic and motion cues of VR. Hands are very important to enable interaction in this region.”

The catch, of course, is that no one beyond the Oculus inner circle has used Touch yet. And while it sounds supremely helpful on paper, there’s a chance it could detract from the virtual experience rather than enhance it. Incorporating hands benefits VR users plenty, but they’re secondary to what really matters: your head.

“The most important input device in virtual reality is your head, which is the way one navigates a scene in virtual reality,” says Jeremy Bailenson, founding director of Stanford University’s Virtual Human Interaction Lab. “It’s critical to track the head position and rotation well, and if adding hands to the system in any way detracts from the ability to achieve head tracking, the overall experience might suffer.”

That question should be put to rest soon enough; hands-on demonstrations will be available at next week’s E3 gaming conference. And even if there are wrinkles, Oculus has about a year to iron them out. Besides, what’s most interesting about Touch might be what it represents about the overarching Oculus approach to VR.

When the first Oculus Rift consumer headsets ship, they’ll come with an Xbox One controller—the very sort of single-unit input that Oculus Touch could displace. That the two will coexist, though, shows Oculus recognizes that there’s no one correct solution. It’s all a matter of what you’re trying to achieve.

“VR experiences have to be designed with a particular purpose in mind,” Blau says. “I don’t think it’s reasonable to think that you’re going to have a generalized VR app. It’s not going to work that way.”

Luckey seems to agree. “We see virtual-reality input evolving over the coming years,” the Oculus exec explained Thursday. “There will be different inputs for different kinds of games.” While Oculus Touch might be the best way to, in his example, “pull robots limb from limb,” the best way to translate a first-person shooter video game to a virtual environment will still be the quick-trigger controller with which gamers are familiar.

That flexibility should help push Oculus even further forward, too. “The next step will be to not just track the hands but represent details such as the fingers,” says Bolas, whose team already has figured out how to “trick” a hand-tracker into determining fingertip location. That kind of fine-tuned control could also move Oculus beyond gaming altogether, into areas that range from physical rehabilitation to social media.

For now, though, it’s impressive enough that Oculus has found a way to let its users not just play in its worlds but touch them as well.