Research
Virtual reality software and hardware are becoming increasingly affordable and powerful, and are seeing growing use in experimental research. The possibility of conducting tightly controlled, repeatable experiments with naturalistic multi-modal stimuli in a closed action-perception loop suggests that VR could become an increasingly powerful yet flexible research tool.

Despite increasing computational power and rendering quality, though, it is debatable whether humans necessarily perceive, feel, think, and behave similarly in real and virtual environments. Such correspondence is essential both for achieving sufficient real-world transfer of experimental results gained in the lab and for providing compelling experiences. What might be missing? What can we learn from this? How can we use this basic information to improve both technology and user experience?

How might we be able to "cheat intelligently" in VR and, e.g., provide users with a compelling sensation of being in ("presence") and moving through ("vection") the simulated environments without the need for full physical locomotion or large, costly motion simulators? Why is it so hard to control all four degrees of freedom when flying a quadcopter drone, or flying through VR, with a standard gamepad or RC controller (illustrated in the sketch below)? And how could we use our knowledge of human embodied perception to design more effective yet affordable locomotion interfaces for both 2D (ground-based) and 3D (flying) travel, for both telepresence and immersive VR?
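To make those four degrees of freedom concrete, the minimal sketch below shows the common "mode 2" stick layout, in which two 2-axis thumbsticks jointly cover thrust, yaw, pitch, and roll, so the pilot must coordinate all four axes at once. The class and function names (e.g., `StickInput`, `mode2_mapping`) are illustrative assumptions, not code from any particular flight stack.

```python
from dataclasses import dataclass

@dataclass
class StickInput:
    left_x: float   # -1..1, left stick horizontal
    left_y: float   # -1..1, left stick vertical
    right_x: float  # -1..1, right stick horizontal
    right_y: float  # -1..1, right stick vertical

@dataclass
class QuadSetpoint:
    thrust: float    # collective thrust (climb/descend)
    yaw_rate: float  # rotation about the vertical axis
    pitch: float     # forward/backward tilt -> forward/backward motion
    roll: float      # left/right tilt -> sideways motion

def mode2_mapping(sticks: StickInput) -> QuadSetpoint:
    """Map a mode-2 stick layout onto a quadcopter's four controllable DoF.

    Illustrative only: real controllers add expo curves, dead zones,
    rate limits, and attitude stabilization on top of this raw mapping.
    """
    return QuadSetpoint(
        thrust=sticks.left_y,    # left stick up/down    -> climb/descend
        yaw_rate=sticks.left_x,  # left stick left/right -> turn in place
        pitch=sticks.right_y,    # right stick up/down   -> move forward/back
        roll=sticks.right_x,     # right stick left/right -> move sideways
    )

if __name__ == "__main__":
    # A gentle forward climb while turning: every axis is active at once,
    # which is part of why simultaneous 4-DoF control is so demanding.
    print(mode2_mapping(StickInput(left_x=0.2, left_y=0.5,
                                   right_x=0.0, right_y=0.3)))
```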