
Constraints of Writing Software for Full World Simulation

Summary: Applications that do not follow VR best practices will make users nauseous or worse. Applications must be responsive to head movements at all times, avoid unrealistic movement of the world, and render the world with extremely low latency.

When you design software for a phone or a tablet, the worst-case scenario is that the user leaves your site or quits your app. In an immersive virtual reality environment, the stakes are much higher: a poorly architected application can leave a user feeling nauseous, disoriented, or worse. Here are some quotes from people who have tried the Oculus Rift:

Even on day 3 after 5 mins of playing Half Life 2 (which is giving me the best experience so far) I can’t use the [Oculus] Rift for the rest of the day because I’m feeling pretty dizzy. (link)

… my pounding headache remained for a few hours more… Half the people at my [Oculus demo] felt equally ill following their hands-on and at least a couple other journalist friends I spoke to had the same experience. (link)

I tried Borderlands 2 and [was] in for a mindblowing ride. The 3D, awesome. I was really enjoying every minute in the game. Minute, thats it. The feeling of vertigo and motion sickness was overwhelming. (link)

Not to mention the physical injuries that come from losing track of your real surroundings:

I moved my head far over and down to my left to “examine” the scene… What I couldn’t see was the wall that was to my left… The headset crunched on the wall and smacked down on the bridge of my nose. Blood and pain followed. (link)

Ordinarily, software architecture would be outside the scope of a usability website; it would be like an engineer telling a salesperson how to sell to clients, or a recruiter giving tips on how to architect an app. But focusing on the design touches that make an application “usable” without addressing the factors that cause nausea and physical injury would be a recipe for a consumer disaster.

If you are a developer, the Oculus VR Best Practices Guide should be your bible. It contains the collected wisdom of the Oculus team, from their own testing and from the reports of assorted developers using the platform. We will dive further into the guide in future posts, but for the moment let’s just consider the problems an immersive headset presents for developers.

When your head moves, the world must move

Users have reported feeling disoriented if the world does not move when their head moves, or if the image lags behind their head motion. A number of points in the guide deal with the on-screen picture failing to update as the user’s head moves. Placing an object at a fixed position in the field of view causes the same disorientation: a menu or splash screen that stays pinned to the display while the user’s head moves can cause dizziness.
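
To make the distinction concrete, here is a minimal C++ sketch contrasting a world-anchored menu with a head-locked one. I’ve simplified to yaw-only rotation (a real renderer uses a full quaternion head pose), and none of this is Oculus SDK code:

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Rotate a world-space point into view space for a given head yaw,
    // in radians. Yaw alone is enough to show the principle.
    Vec3 worldToView(const Vec3& p, float headYaw) {
        float c = std::cos(-headYaw), s = std::sin(-headYaw);
        return { c * p.x - s * p.z, p.y, s * p.x + c * p.z };
    }

    int main() {
        Vec3 menu = {0.0f, 0.0f, -2.0f};  // menu placed 2 m in front of the user

        for (float yaw : {0.0f, 0.5f, 1.0f}) {
            // Good: a world-anchored menu shifts in view space as the head
            // turns, so it behaves like a physical object.
            Vec3 anchored = worldToView(menu, yaw);
            // Bad: a head-locked menu ignores the head pose and stays glued
            // to the same spot on screen -- a known dizziness trigger.
            Vec3 locked = menu;
            std::printf("yaw=%.1f  anchored x=%+.2f  locked x=%+.2f\n",
                        yaw, anchored.x, locked.x);
        }
    }

The anchored menu drifts across the view as the head turns; the locked one never moves, which is exactly the mismatch that makes people ill.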

This also means that every scene in your application must be rendered in three dimensions. No cutting corners with a cutscene drawn from a single viewpoint; if a user moves their head during that time and your software doesn’t update the view, dizziness follows.
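
Here is the rough shape such a cutscene loop might take. readHeadYaw and renderScene are placeholder stubs I invented for this sketch, not a real SDK API:

    #include <cstdio>

    // Placeholder stubs: in a real application these would be the tracker
    // read and the stereo renderer. Both names are assumptions.
    float readHeadYaw() { static float yaw = 0.0f; return yaw += 0.01f; }
    void renderScene(float camX, float camZ, float viewYaw) {
        std::printf("cam=(%.2f, %.2f) viewYaw=%.2f\n", camX, camZ, viewYaw);
    }

    // A cutscene that never drops head tracking: the camera *position*
    // follows a scripted path, but the view *orientation* is re-sampled
    // from the live head pose every single frame.
    void playCutscene(float durationSec, float dt) {
        for (float t = 0.0f; t < durationSec; t += dt) {
            float camX = 0.0f;
            float camZ = -1.4f * t;  // slow scripted dolly, walking speed
            // Rendering a fixed 2D frame here instead would mean the view
            // stops responding to the head -- the exact failure above.
            renderScene(camX, camZ, readHeadYaw());
        }
    }

    int main() { playCutscene(0.1f, 0.016f); }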

On a similar note, you lose the ability to move the world correctly when a user’s head moves beyond the edges of the tracking camera’s range. In this situation Oculus recommends fading the contrast or the brightness of the scene, to avoid a situation where the user’s head keeps moving while the world stays still. I will cover this topic more in a future post.
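
A minimal sketch of that fade, assuming a boolean tracking flag and a fade rate I picked arbitrarily:

    #include <algorithm>
    #include <cstdio>

    // Fade the scene toward black while the head is untracked, and back to
    // full brightness once tracking returns. The returned value multiplies
    // the rendered frame's brightness.
    float updateFade(bool headTracked, float fade, float dt) {
        const float fadeRate = 4.0f;  // assumed: full fade in ~0.25 s
        float target = headTracked ? 1.0f : 0.0f;
        if (fade < target) return std::min(target, fade + fadeRate * dt);
        return std::max(target, fade - fadeRate * dt);
    }

    int main() {
        float fade = 1.0f;
        bool frames[] = {true, true, false, false, false, true, true};
        for (bool tracked : frames) {
            fade = updateFade(tracked, fade, 0.016f);
            std::printf("tracked=%d brightness=%.2f\n", tracked, fade);
        }
    }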

Audio and video stimulus should match real life

Thinking about how to output sound makes it clear how much complexity Oculus adds to the average application. Consider what each ear should hear from different head positions: headphones rotate with the user’s head, so sounds must be re-panned every frame to stay fixed in the world, while external speakers sit still on the desk and need no such correction.

This means applications need to take the type of speaker system into consideration and filter sound according to the user’s current head position.
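
Real engines use head-related transfer functions (HRTFs) for this. As a much simpler stand-in, here is a constant-power panning sketch for the headphone case, again assuming yaw-only tracking; the key move is subtracting the head yaw so a source stays put in the world as the head turns:

    #include <cmath>
    #include <cstdio>

    const float kPi = 3.14159265f;

    // Left/right gains for a source at a given *world* bearing, heard
    // through headphones by a head at the given yaw. Subtracting headYaw
    // converts the bearing into head-relative coordinates, so turning the
    // head pans the source toward the opposite ear, as in real life.
    void panGains(float sourceBearing, float headYaw, float& l, float& r) {
        float rel = sourceBearing - headYaw;  // head-relative angle
        float pan = std::sin(rel);            // -1 = one ear, +1 = the other
        l = std::cos((pan + 1.0f) * 0.25f * kPi);
        r = std::sin((pan + 1.0f) * 0.25f * kPi);
    }

    int main() {
        // A source dead ahead in the world, heard while the head turns:
        for (float yaw : {0.0f, 0.4f, 0.8f}) {
            float l, r;
            panGains(0.0f, yaw, l, r);
            std::printf("headYaw=%.1f  L=%.2f  R=%.2f\n", yaw, l, r);
        }
    }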

The guide has more recommendations here, all about avoiding stimuli that never happen to your head in real life. The world shouldn’t move much faster than a walking pace (about 1.4 meters per second). Jump cuts (the kind you see in movies, where the viewpoint shifts instantly from one person’s face to another) and zooming are also disorienting.
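
Enforcing the speed suggestion can be as simple as clamping the locomotion velocity. A sketch, with the guide’s 1.4 m/s figure hard-coded:

    #include <cmath>
    #include <cstdio>

    const float kMaxSpeed = 1.4f;  // roughly human walking speed

    // Scale the desired velocity down to the cap if it exceeds it,
    // preserving the direction of travel.
    void clampVelocity(float& vx, float& vz) {
        float speed = std::sqrt(vx * vx + vz * vz);
        if (speed > kMaxSpeed) {
            float k = kMaxSpeed / speed;
            vx *= k;
            vz *= k;
        }
    }

    int main() {
        float vx = 3.0f, vz = 4.0f;  // 5 m/s requested: far too fast
        clampVelocity(vx, vz);
        std::printf("clamped to (%.2f, %.2f), speed %.2f m/s\n",
                    vx, vz, std::sqrt(vx * vx + vz * vz));
    }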

Speed kills (again)

On the Web and for mobile devices, we saw time and time again how important it was to have a fast application. A 100ms delay at Amazon cost them 1% of sales (hundreds of millions of dollars last year). When Google slowed search results down by 500 milliseconds, traffic dropped by 20%. On phones, download size (and the resulting load time) is also a primary consideration. Connections from phones to the network are often awful and slow, and users give up if their tasks won’t complete. “Mobile experiences fill gaps while we wait,” said Mike Krieger, Instagram co-founder. “We don’t want to wait while we wait.”

In general, a response time of 100 milliseconds on the Web or a phone was “good enough” for users to feel that interfaces were responsive and that they were manipulating the elements in the UI directly. Virtual reality interfaces must respond within 20 milliseconds, one-fifth of the previously acceptable response time. Furthermore, VR must often update large portions of the UI in that window in response to head movements, not just a button or a dialog.
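
One way to stay honest about that budget is a watchdog that flags any frame over 20 milliseconds. This sketch measures only CPU frame time (true motion-to-photon latency also includes the sensor read and display scanout), so treat it as a rough proxy:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    int main() {
        using clock = std::chrono::steady_clock;
        const double budgetMs = 20.0;  // the 20 ms response target

        for (int frame = 0; frame < 3; ++frame) {
            auto start = clock::now();
            // Stand-in for simulate + render work; each frame here is
            // deliberately slower than the last.
            std::this_thread::sleep_for(std::chrono::milliseconds(8 * (frame + 1)));
            double ms = std::chrono::duration<double, std::milli>(
                            clock::now() - start).count();
            if (ms > budgetMs)
                std::printf("frame %d blew the budget: %.1f ms\n", frame, ms);
            else
                std::printf("frame %d ok: %.1f ms\n", frame, ms);
        }
    }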

We’re entering a situation not unlike the Internet in 1997, where network latency and request size dominated design considerations. As a community we’ll need to throw away a lot of the rich applications and interactions that have been made possible by faster browsers, faster machines and faster Internet connections. It will be an exciting ride.