I have a confession – I don’t enjoy being a passenger. I experience sensory conflict – a form of motion sickness that, for me, manifests on public transport. Something in my brain signals that things aren’t right: the motion I see and feel doesn’t match my sense of movement. Interestingly, this doesn’t happen when I’m in control of the motion, such as when driving, or when external stimuli don’t cause vibration, such as on high-speed trains or airplanes. Given the choice, though, I cycle instead.
Sensory conflict also happens to be an issue in technology. A big issue.
Virtual reality (VR) and, to a lesser extent, augmented reality (AR) have a problem with motion sickness. In the VR world, the phenomenon is known as cybersickness – a more comfortable term for users, but the effect remains the same: over 180 research papers had examined it years before metaverses became a marketing reality. It potentially impedes a technology consistently highlighted in future-tech hype forecasts, whose potential applications extend beyond entertainment to healthcare, industrial training, simulation, and the whole range of futuristic utopian visions.
Ordinary VR applications, like watching a virtual TV screen or playing chess, are usually manageable. But the world of VR is full of sensory deviations from real-world experience – rollercoaster rides, and transport simulators covering everything from bikes, cars, and airplanes to wheelchairs. Another common method of movement is teleportation: jumping instantly from one frame of reference to a completely new one. Teleportation is acceptable to our brains in conventional gaming, where we sit looking at a 2D render of a 3D world. But when we teleport while standing in a fully immersive environment, our brains have a hard time dealing with instantaneous movement that includes no feedback from our legs.
Meta and Microsoft avatars have no legs because of equality of access (everyone is the same height in the metaverse) and rendering issues (headsets cannot see your legs to simulate their movement). But primarily, it is more comfortable for users to think they are floating than to try to reconcile a character’s bounce with the brain’s expectation of how they should be moving.
“In real life when you look down, you’re used to seeing your legs are at a specific distance away from your face, which you’re used to. But in the virtual world, if that situation is not replicated, it could cause you to feel nauseous. So, until there are better sensors to avoid this on headsets, companies might avoid creating legs.” – Dr. Rolf Illenberger, VRdirect
A feeling of nausea and oculomotor disturbance (usually tonic vergence – being unable to relax from squinting inwards after focusing) are common reactions to this experience. Disorientation, like the feeling one gets when coming off a spinning carnival ride, is also common. Studies have shown that the Simulator Sickness Questionnaire (SSQ), which can be scored both subjectively and objectively, is a good indicator of the likelihood that an individual will experience illness. The same studies show that the effect is reduced as the technology improves. So, are we eventually going to overcome this issue?
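The SSQ mentioned here is, in most of the literature, the Simulator Sickness Questionnaire of Kennedy et al. (1993): sixteen symptoms are each rated 0–3, grouped into nausea, oculomotor, and disorientation clusters, and weighted into subscale and total scores. The sketch below is illustrative only – the symptom lists and cluster memberships shown are an assumption from memory, and the exact 16-item mapping should be checked against the original questionnaire, though the weights (9.54, 7.58, 13.92, and 3.74) are the commonly cited ones.

```python
# Illustrative sketch of Simulator Sickness Questionnaire (SSQ) scoring,
# using the commonly cited Kennedy et al. (1993) weights. The cluster
# memberships below are an assumption; consult the original paper for
# the exact 16-item mapping.

NAUSEA = ["general_discomfort", "increased_salivation", "sweating",
          "nausea", "difficulty_concentrating", "stomach_awareness", "burping"]
OCULOMOTOR = ["general_discomfort", "fatigue", "headache", "eyestrain",
              "difficulty_focusing", "difficulty_concentrating", "blurred_vision"]
DISORIENTATION = ["difficulty_focusing", "nausea", "fullness_of_head",
                  "blurred_vision", "dizziness_eyes_open",
                  "dizziness_eyes_closed", "vertigo"]

def ssq_scores(ratings):
    """ratings: dict of symptom -> severity (0-3, absent means 0).
    Returns weighted subscale scores and the weighted total."""
    n = sum(ratings.get(s, 0) for s in NAUSEA)
    o = sum(ratings.get(s, 0) for s in OCULOMOTOR)
    d = sum(ratings.get(s, 0) for s in DISORIENTATION)
    return {
        "nausea": n * 9.54,
        "oculomotor": o * 7.58,
        "disorientation": d * 13.92,
        "total": (n + o + d) * 3.74,
    }

# Example: a participant reporting moderate nausea, slight eyestrain,
# and severe vertigo after a session.
scores = ssq_scores({"nausea": 2, "eyestrain": 1, "vertigo": 3})
```

Note that a symptom such as nausea contributes to more than one cluster, which is why the total is computed from the raw cluster sums rather than from the weighted subscales.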
The latest versions of the most popular consumer VR solutions – HTC’s Vive, Oculus Quest 2, and PlayStation VR – all include improvements to counter the stimuli for cybersickness, although research cannot yet confirm that cybersickness can be entirely prevented. These improvements follow the usual Moore’s-law-style evolution in technology: chipset speeds, display resolution, and sensor response. However, the primary method of alleviating the issue is better VR content. Most rollercoaster-style VR applications now include artifacts that anyone who has spent time on a seagoing vessel will understand – fixed objects in your line of sight and virtual helmets to frame your perspective. Yet these still cause problems for many individuals who are more sensitive to sensory conflict. Incidentally, this was also a gender problem: studies showed females were significantly more at risk of cybersickness than males. Further investigation revealed this was because headsets were designed for male heads and could not be adjusted to fit diverse interpupillary distances (IPDs). Thankfully, headset designers have since reengineered their products to improve accessibility.
Ultra-high-resolution models are pushing the boundaries of response time to remove this impact. One approach is to eliminate any uncanny valley in the visualization – a wider field of view and full-frame bionic displays with human-eye-level resolution. The next improvement is adapting content to head and body movement at 100Hz+ refresh rates for lower latency. Other enhancements include foveated rendering, which fully renders only the objects the eye is focused on (this is why objects at the edges of the scene, which are not prioritized, appear blurred). These advances require either edge processing or a local compute instance to make the necessary calculations before forwarding the result to the headset’s less powerful display processor. Achieving this will require a dependable high-speed network connection. While dropping a frame might be excusable in a local role-playing game, it would be a big issue when using a VR headset to dock an aircraft 30,000 ft up or perform surgery two floors down.
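To make the foveated-rendering idea concrete, here is a deliberately simplified sketch (not taken from any real engine – production systems use per-pixel or per-tile shading rates driven by eye tracking, not per-object tiers, and the angle thresholds below are assumptions): detail level is chosen by how far an object sits from the gaze direction, so only what the eye fixates receives full resolution.

```python
import math

# Simplified, illustrative foveated level-of-detail selection.
# The cone angles (5 and 20 degrees) are assumed values for the sketch.

def angular_distance(gaze, target):
    """Angle in degrees between two normalized 3D direction vectors."""
    dot = max(-1.0, min(1.0, sum(g * t for g, t in zip(gaze, target))))
    return math.degrees(math.acos(dot))

def lod_for(gaze, target, fovea_deg=5.0, mid_deg=20.0):
    """Full detail inside the foveal cone, medium in the near
    periphery, low (blurred) everywhere else."""
    angle = angular_distance(gaze, target)
    if angle <= fovea_deg:
        return "full"
    if angle <= mid_deg:
        return "medium"
    return "low"

gaze = (0.0, 0.0, 1.0)                     # looking straight ahead
print(lod_for(gaze, (0.0, 0.0, 1.0)))      # dead centre -> "full"
print(lod_for(gaze, (0.5, 0.0, 0.866)))    # ~30 degrees off-axis -> "low"
```

The payoff is the compute saving: everything outside the foveal cone can be shaded at a fraction of full resolution, which is exactly the peripheral blur the paragraph above describes.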
This brings us back to making VR accessible and achieving its mainstream promise. Will advances in headset technology, content, and networking performance mean that cybersickness can be overcome as a constraint on the continued use of VR? With each iteration of advances, we should reassess the impact on SSQ scores and aim for an enjoyable VR experience every time, for everyone. Of course, there should still be an option for thrill-seekers who enjoy feeling dizzy after walking a plank on top of a skyscraper. In the meantime, I would rather take the bus.