
This is Twenty Milliseconds, a site documenting what works and what doesn't in virtual reality design.

Richard Yao - Human Visual System & the Rift

Reminder from Richard: you should read the Best Practices guide

Human Visual System

Visual perception is active - the brain is constantly perceiving and resolving the scene into objects

The Rift provides all of your visual input - a big responsibility - you have to consider the experiences you're building for people

how we perceive:

images grouped into discrete objects

we start to notice this process when objects become ambiguous - illusions like the duck vs. rabbit, etc.

illusions are interesting because we learn things when perception breaks down.

table illusion - two tables with the same shape that look like different sizes

change blindness - we don't notice when the mountains change in the background of a city image, because a gray frame flashes in between the two versions. Without the gray flash, the change is obvious immediately.

lesson #1: perception is interpretation. A lot escapes awareness - things happening in your head and in the world that you never know about or notice. ‘Metacognition’ - cognition about cognition

Bodily Sensory Information

lesson #2: small rendering issues become very, very important

sensory disagreements

vection - illusory perception of self-motion from vision. “Vision tells you that you’re moving through space, body perceives you sitting in chair, this causes discomfort.”

avoid using head tilt to rotate the player in place - it induces the “Coriolis effect”, which occurs when your body rotates on one axis while your head is tilted on another. It causes dizziness - the same effect as dizzy bat.

even in a perfect VR implementation, sensory disagreements will cause discomfort. Camera movement is the primary cause, and camera movement the player doesn't control is even worse

acceleration is the main culprit of vection

The experience is more comfortable if you can prevent conflict between vision and the other senses.
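Since the vestibular system only senses acceleration, not steady velocity, one concrete mitigation is to change the player's speed instantly rather than easing it in and out. A minimal sketch of that idea (the `Player` class and the 90 Hz frame step are hypothetical, not anything from the talk):

```python
import numpy as np

class Player:
    """Hypothetical first-person player with vection-aware locomotion."""

    MOVE_SPEED = 3.0  # meters per second, held constant while input is active

    def __init__(self):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)

    def update(self, move_input, dt):
        """move_input: desired movement direction as a 3-vector (zero when idle)."""
        norm = np.linalg.norm(move_input)
        if norm > 0:
            # Jump straight to full speed: no ease-in ramp, because sustained
            # acceleration is a stronger vection cue than constant velocity.
            self.velocity = (move_input / norm) * self.MOVE_SPEED
        else:
            # Stop instantly as well, rather than decelerating smoothly.
            self.velocity = np.zeros(3)
        self.position += self.velocity * dt


player = Player()
player.update(np.array([1.0, 0.0, 0.0]), dt=1 / 90)  # one 90 Hz frame of forward input
print(player.position)  # -> [0.0333..., 0, 0]
```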

“Optic flow” causes vection - coherent movement across your visual field that suggests self-motion.

use cockpits

Keeping people engaged with a task and keeping the mind occupied reduces vection.

Effortful re-interpretation: the cockpit is a frame of reference that reminds you that you're sitting stationary in a chair.
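One way to see why the cockpit reads as stationary: because the cockpit geometry is parented to the player's rig rather than to the world, the rig's flight through the level cancels out of the cockpit's view-space transform, and only head motion moves it on screen. A rough numpy sketch of that cancellation (the matrix helpers and names are made up for illustration, not any particular engine's API):

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Player rig flying through the world (changes every frame during locomotion).
world_from_rig = translation(120.0, 0.0, -45.0)

# Head pose relative to the rig, as reported by the HMD tracker.
rig_from_head = translation(0.0, 0.05, 0.1)

# Cockpit mesh is parented to the rig, not placed in the world.
rig_from_cockpit = translation(0.0, -0.3, -0.6)

# Scene view matrix: inverse of the full camera transform (rig * head).
view = np.linalg.inv(world_from_rig @ rig_from_head)

# Cockpit in view space: the rig transform cancels, so the cockpit is
# unaffected by locomotion and only moves when the head moves.
cockpit_in_view = view @ world_from_rig @ rig_from_cockpit
print(cockpit_in_view[:3, 3])  # -> [0, -0.35, -0.7], no trace of the flight path
```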

testing

1st try - blocking off the periphery in a demo. Didn't help much; it still felt like you were moving through space.

2nd try - putting the demo on a TV in a virtual room in the Oculus. A much more effective tool for fighting vection.

provide an “inertial reference frame”

player-locked skybox - you flip the interpretation your mind defaults to. “I must be moving through the world” becomes “I must be sitting still and the world is moving around me”

demo with the background staying rock solid as the player moves their head - it seems to help a little bit. It might not make sense for different types of content. Cons - it's not clear what the best practices are: what's the best locked background, etc.

you can see the gridlines through the ground - it's a little transparent

the brain interpreted the gridlines as a cage - it felt like you were flying around inside a yellow cage.
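In the same spirit as the cockpit sketch above, a player-locked skybox can be expressed by rendering the background with a view matrix built from head rotation only: the backdrop keeps its real-world orientation as you look around but never translates with in-game movement, so locomotion produces no optic flow in it. A sketch under the assumption of a standard view-matrix pipeline (again, hypothetical helpers rather than engine code):

```python
import numpy as np

def rotation_y(angle):
    """4x4 rotation about the vertical axis."""
    c, s = np.cos(angle), np.sin(angle)
    m = np.eye(4)
    m[0, 0], m[0, 2], m[2, 0], m[2, 2] = c, s, -s, c
    return m

def translation(x, y, z):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Head orientation from the HMD, and the player's position in the level.
head_rotation = rotation_y(np.radians(30))
player_position = translation(500.0, 0.0, -200.0)

# Normal scene view: includes how far the player has flown through the world.
scene_view = np.linalg.inv(player_position @ head_rotation)

# Skybox view: head rotation only. Locomotion never reaches the background,
# so the backdrop reads as a stationary reference frame: "I'm sitting still
# and the world is moving around me."
skybox_view = np.linalg.inv(head_rotation)

print(scene_view[:3, 3])   # large translation: the scene streams past as you fly
print(skybox_view[:3, 3])  # [0, 0, 0]: the background never translates
```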

challenges and opportunities

unprecedented control

content design - easy to make people very uncomfortable

creative solutions - yet to be discovered: things that haven't been tried, or things we don't yet know can hurt or help comfort.

Everything is still evolving… stay tuned.