
Tech Talk: VR Nausea Solved, Next Stop: Full Immersion

Prolonged use of any virtual reality system can have seriously unpleasant consequences, as a man who spent 25 hours with the HTC Vive to pursue a world record found out the hard way. Even in short bouts of use, an hour or less, some users end up experiencing dizziness, nausea and a host of other signs that their body doesn’t approve of their virtual adventures. There are a number of biological reasons for this phenomenon, but they all essentially boil down to a huge discrepancy, on many levels, between what your eyes and ears are experiencing and what your body is experiencing. The brain and body are having entirely different experiences when a user is immersed in VR, leaving the senses with conflicting signals and no good way to reconcile them. A big part of the reason that discrepancy is so pronounced is that there is a lot of black space in a typical VR field of view. A user sees and experiences one thing, feels another, and has blackness all around to remind their brain that something is seriously amiss. A team of researchers led by Robert Xiao, a PhD candidate at Carnegie Mellon working with Microsoft, has developed a solution called SparseLightVR.

Before diving into how SparseLightVR works, know this: human peripheral vision has much lower fidelity, “resolution” if you will, than central vision. That’s what makes VR tolerable at all, and it’s the quirk SparseLightVR plays on to achieve its increased immersion and decreased nausea. On paper, the fix is pretty simple: 80 small LEDs planted at various locations around the periphery of a VR headset’s display, where there would normally be black space. These LEDs are fed an approximation of the edges of the VR user’s in-application view. Rather than expanding the field of view, something even the most expensive VR headsets still grapple with design-wise, SparseLightVR uses the LEDs to produce an extremely low-resolution replica of what should be at the edges of a user’s vision. Because peripheral vision is so much lower quality than central vision, the brain buys it and the illusion of full immersion is complete.
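To make the trick a bit more concrete, here is a minimal sketch of how a renderer might drive those LEDs, assuming each rendered frame arrives as an RGB NumPy array and that a hypothetical send_led_colors() call pushes colors to the headset’s LED strip. This is not SparseLightVR’s actual implementation, just one plausible way to approximate the edges of the in-application view at very low resolution:

```python
import numpy as np

def peripheral_led_colors(frame: np.ndarray, leds_per_edge: int = 20) -> np.ndarray:
    """Average the border of a rendered RGB frame down to one color per LED.

    With 20 LEDs per edge, the result is 80 colors, matching the prototype's
    LED count described above.
    """
    h, w, _ = frame.shape
    band = max(h, w) // 20  # how deep into the frame each edge sample reaches

    # The four border bands of the frame: top, bottom, left, right.
    edges = [
        frame[:band, :, :],
        frame[-band:, :, :],
        frame[:, :band, :],
        frame[:, -band:, :],
    ]

    colors = []
    for edge in edges:
        # Split each band into leds_per_edge chunks along its long axis and
        # collapse every chunk to a single average RGB value.
        long_axis = 1 if edge.shape[1] >= edge.shape[0] else 0
        chunks = np.array_split(edge, leds_per_edge, axis=long_axis)
        colors.extend(chunk.reshape(-1, 3).mean(axis=0) for chunk in chunks)

    return np.clip(np.array(colors), 0, 255).astype(np.uint8)

# Per-frame loop (send_led_colors is a hypothetical driver call):
#   send_led_colors(peripheral_led_colors(rendered_frame))
```

Because the eye cannot resolve detail out at the edges, a handful of averaged colors per edge is all it takes to keep the periphery roughly consistent with the scene.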

When Xiao came up with the idea, he actually expected it to make things worse, since it widened the divide between what the brain and body were experiencing. As it turned out, playing to mind over matter worked perfectly to alleviate VR discomfort and increase immersion. Eye-tracking systems are still in the works, but VR in its current form and in similar future iterations can benefit immensely from this technology. An eyes-only glance toward the source of a noise in a VR game may still break the illusion, but the system works well enough to fool your brain for as long as you’re looking dead ahead. Coupled with eye tracking, however, this goes from an immersion saver and nausea killer to the difference between knowing you’re in VR and becoming wholly immersed in a virtual world. If the main view follows your eyes, and your peripheral vision’s urge for a small taste of what lies beyond your central view is sated, it’s not hard to imagine feeling like what you’re seeing is one hundred percent real, at least until you reach out to touch it. Despite what a certain Japanese anime that IBM based a VR pet project on may have you believe, the technology to patch up the touch end of things isn’t terribly far off and certainly won’t require trusting a VR headset with full access to your brain.

Technically speaking, we already have all of the technologies we need to make full-dive VR a reality, just not in a form that is cohesive, cost-effective and, of course, feasible for everyday use. A firm dedicated to finding new ways for humans and computers to interface is already hard at work engineering vibration motors and electrical impulses that can simulate touching an object. VR gloves are already a thing. Smell simulation in VR is already well underway and even has its own poster child in FeelReal, a VR helmet attachment that uses hot and cool air jets, water and even odor cartridges to simulate smells realistically. Naturally, taste simulation likely isn’t far off. All of this tech, combined with the above, will make VR crazy immersive, but you still run the risk of giving your roommate a black eye while aiming a virtual gun at a hyper-realistic alien. That’s a difficulty we’re still miles from solving; everything else, though, is falling nicely into place.