Tips for Designing for VR & AR

I’ve put some tips together for anyone designing an AR or VR application. This is aimed at people who may have limited experience with VR and AR.

1) Understand why you're developing for VR & AR

Any app can be built for VR and AR, but does it make sense to do it? VR and AR allow natural viewing and interaction (to an extent) with digital experiences, but what else do these types of devices bring to the table?


VR brings presence to the table. Instead of observing an experience, you’re inside the experience.

Presence heightens emotional reactions. A boulder rolling towards a character on TV feels mundane compared to one rolling towards you in VR. Scale has meaning - seeing a giant on a TV screen isn’t the same as looking up and seeing a giant towering over you.

Audioshield and Beat Saber are great examples of this. Both games look mundane on a 2D screen, but once you’re inside them they’re thoroughly engaging.

AR devices allow augmenting reality with 3D (or 2D) graphics. Virtual objects can be interacted with like physical ones, and they can interact with physical objects in turn. AR devices also work in collaborative environments and can be used on the move.

2) Understand your hardware


In order to make good experiences it’s essential to have good knowledge of the capabilities and limitations of VR & AR hardware.

Here’s a checklist of things you should be looking for:

  • Resolution: How small can text be before it’s unreadable? The pixel density of most current devices does not support small text.
  • Positional Tracking: Does the device know the position of the user’s head, or only its orientation?
  • Spatial Anchoring: Does the device recognize where the user currently is? Does it allow virtual objects to be anchored to physical places?
  • 3D Capabilities: Google Cardboard, HoloLens, and other mobile-class devices limit how much geometry they can display before the frame rate starts to drop.
  • Field of View: If the field of view is narrow, you will likely need a strategy to help the user locate virtual objects outside it.
  • Controllers: How many buttons do the controllers have? Which are reserved by the OS (and unusable in applications)? Do they have any kind of haptics?
  • Occlusion (AR): Can static geometry occlude virtual objects? (For example: if a virtual dog is behind a physical desk, is it drawn or not?)
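To get a feel for the resolution point, you can estimate a headset’s pixels per degree (PPD) from its per-eye resolution and field of view. Here’s a rough sketch in Python; the device numbers and the ~10-pixel character height threshold are illustrative assumptions, not the specs of any real headset:

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Rough angular pixel density, assuming pixels are spread evenly across the FOV."""
    return horizontal_pixels / horizontal_fov_deg

def min_readable_text_height_deg(ppd, pixels_per_char=10):
    """Approximate angular height below which text becomes hard to read,
    assuming a character needs about 10 pixels of height to stay legible."""
    return pixels_per_char / ppd

# Illustrative values only (not a real device spec):
ppd = pixels_per_degree(horizontal_pixels=1440, horizontal_fov_deg=100)
print(ppd)                                 # 14.4 pixels per degree
print(min_readable_text_height_deg(ppd))   # text under ~0.7 degrees tall will blur
```

The useful takeaway is the ratio: halving the FOV (or doubling the panel resolution) doubles the angular pixel density, which is what actually determines how small your text can go.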

3) Know what not to do in VR/AR

There are some things that you (almost™) definitely should not do. If you have experienced them, it’s easy to understand why. Here’s a shortlist:

Don’t stick a HUD to the user’s face (VR)

You might have thought about adding a HUD to your app, Iron Man style:

Iron Man Hud

It sounds great in theory, but it’s problematic in practice. Users will initially have the impulse to turn their head to look at the UI, but that won’t work, since the UI is fixed relative to the head. Instead they have to rotate their eyes toward it, which quickly becomes uncomfortable.

See Designing a HUD for a Third-Person VR Game

Additionally, many VR devices have lower resolution around the edges of the display due to the way lenses work.


What to do instead:

  • Have your UI be objects in the world (not attached to the user in any way)
  • Attach UI to the user’s hands

Tilt Brush UI

In Tilt Brush you have access to a wide variety of controls on your left hand, which you select with your right hand.

Don’t draw virtual environments on AR hardware

Current AR hardware isn’t suited to drawing environments:

  • They can’t draw fully opaque graphics, so your virtual environment will be overlaid on your physical one, which is visually confusing.
  • The field of view of current AR hardware is limited, so you never get the sense you’re actually inside a virtual environment.

I’m hopeful that there will eventually be dual-mode HMDs that support AR and VR equally well, but in the meantime, if you need a virtual environment, stick with VR devices.

Don’t jerk the user around (VR)

VR motion sickness is a real thing. The best way to induce it is to quickly (and frequently) accelerate the user (the virtual camera) around your world. It’s caused by a mismatch between what our eyes are telling us (“we’re moving!”) and what our vestibular system is telling us (“we’re not moving”). Some people do adapt to this - they get their “VR legs”. But in general you should avoid it where possible. If you review feedback on Steam VR games, you’ll discover that a number of them are unplayable by a significant proportion of the population because of this.

So if we can’t move the user around…how do we work around it? Here are some ideas:

  • Physical Movement: Allow the user to physically move around instead (any AR headset or VR headset with positional tracking supports this).
  • Teleportation: Many experiences have a laser pointer teleportation method.
  • Artificial locomotion: There are some clever mechanisms that let the user navigate a large virtual space within a small physical space. A particularly interesting example rotates the environment during the saccadic motion of the eyes (though this requires eye-tracking hardware).
  • Space Folding: Some games and experiences use Escherian geometry: you look around a corner, and there’s another space behind it that couldn’t possibly be there, arranged so the experience makes the best use of limited physical space. Unseen Diplomacy does this (youtube).
  • Move the space: Instead of moving the user through the space, let the user move the space around them. This works best if something remains static relative to the user, and the movement is directly controlled by the user.
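The laser-pointer teleport above is, at its core, a ray–plane intersection: cast a ray from the controller and land where it meets the floor. Here’s a minimal sketch assuming a flat floor at a known height (a real implementation would raycast against actual level geometry and validate the destination):

```python
def teleport_target(origin, direction, floor_y=0.0):
    """Intersect a controller ray with a horizontal floor plane.
    Returns the landing point, or None if the ray points level or upward."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:  # ray can never reach the floor
        return None
    t = (floor_y - oy) / dy
    return (ox + t * dx, floor_y, oz + t * dz)

# Controller held 1.2 m up, aimed forward and down:
print(teleport_target((0.0, 1.2, 0.0), (0.0, -0.6, 0.8)))  # (0.0, 0.0, 1.6)
```

In practice you’d also reject targets that are too far away or unreachable, and show the user a marker at the landing point before committing the teleport.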