Designing interaction elements for virtual 3D space presents a unique set of challenges, different from those of designing for traditional flat 2D screens, which have a set frame as a boundary. This means that instead of just cramming traditional UIs onto 2D panels, we need to think spatially – using the real world as our inspiration for how we interact with objects in virtual worlds.

TL;DR Do’s and Don’ts


  • Keep the main interactions and critical elements in the viewer’s line of sight in most cases so that they are easily noticed and physically comfortable.
  • Make use of the 3D space when placing UI elements within your experience.


  • Don’t take an existing 2D experience and convert it to XR with little or no thought to the user experience, performance, latency, controller inputs, onboarding, physical factors, motion, or legibility requirements necessary for a comfortable and compelling experience in XR.
  • Don’t use a traditional 2D gaming HUD in a VR experience since it can block your view of the environment and cause visibility issues.

Spatial diegetic UI and the return of skeuomorphism

First Contact by Oculus

For spatial design in 3D environments, 3D objects used as UI elements offer better affordances than 2D objects placed in 3D space: people expect spatial objects to behave like the physical objects they resemble.

The original intention of skeuomorphic design was to help people understand how to interact with elements on a new platform or technology. Even if your UI elements are not literal representations of real life, the basic principles behind the design can still carry over. For example, 3D buttons compress and make a sound when pushed.
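The compress-and-click behavior described above can be sketched as a small piece of state logic. This is a hypothetical illustration, not any particular engine’s API: the `ButtonState` type and the `press`/`release` functions are invented names, and a real implementation would drive the mesh’s position and an audio source from this state.

```typescript
// Hypothetical sketch of skeuomorphic 3D button feedback: the button
// compresses along its push axis as the finger moves in, and a click
// sound is requested once, on the frame the press activates.

interface ButtonState {
  depth: number;      // current compression in meters
  maxDepth: number;   // how far the button can be pushed in
  pressed: boolean;   // true while fully depressed
  playSound: boolean; // true for one frame when activation occurs
}

function press(state: ButtonState, fingerDepth: number): ButtonState {
  // Clamp compression so the button never sinks past its travel limit.
  const depth = Math.min(Math.max(fingerDepth, 0), state.maxDepth);
  const nowPressed = depth >= state.maxDepth;
  return {
    ...state,
    depth,
    pressed: nowPressed,
    // Play the click only on the transition into the pressed state,
    // mirroring how a physical button clicks once per push.
    playSound: nowPressed && !state.pressed,
  };
}

function release(state: ButtonState): ButtonState {
  return { ...state, depth: 0, pressed: false, playSound: false };
}
```

The key detail is that the sound fires on the state *transition*, not while the button is held, which is what makes the feedback read as a single physical click.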

UI elements don’t have to be in 3D

Similar to signs on freeways, the UI can be flat, but it must be an object within the 3D space. That doesn’t mean just slapping 2D interfaces from another platform into the 3D space, though. That would be poor design that doesn’t take advantage of the possibilities of 3D space, and it is not conducive to a fully immersive experience for the platform. History shows that lifting and shifting an interface to a new platform doesn’t work well: consider the transition from print design to desktop web, and then to mobile web.

Avoid HUDs in VR

Game HUD overlay of Final Fantasy 14 by Square Enix

In the world of game design, important information is usually displayed in a “HUD-like” interface that is dispersed around the outer edges of the 2D screen. However, with VR that screen frame no longer exists. Instead, people are wearing a head-mounted display with head tracking enabled, allowing the user to look around in any direction as they would in the real world.

Some VR developers believe this problem can easily be resolved by creating a virtual HUD similar to what you would see in Iron Man, but this design usually doesn’t work well in VR since it can give people the urge to shake it off or swat it away like an annoying fly, and it can block their view of the environment. Therefore, it is best to design information in-world.

But if you do choose to design a HUD for your VR application, Oculus has some good guidelines on what not to do.

It would also be a good idea to follow the guidelines for AR and MR, since the principles would be the same, even though the HUD is in a virtual world.

Pay attention to placement

Keep the main interactions and critical elements in the viewer’s line of sight in most cases so that they are easily noticed and physically comfortable.

Diagram from article entitled “What are survey accurate visual simulations?” by buildmedia

Consider how this affects the experience you’re creating and whether it meets the goals of the application. Sometimes, it will still be necessary to have elements outside of immediate view. In this case, haptics, spatial audio and gaze cues can be used to direct people’s attention to the elements you want them to notice.

Guide people where you want them to look 

With VR, you can’t control where people will look at any given time, but you can track where they are looking. So, if there is a specific area or element you want them to pay attention to, this can be accomplished by providing gaze cues such as arrows and animation, as well as haptics and spatial audio. For example, if their attention needs to be on the controller, a combination of spatially placed audio and haptics from the controller can be used to get people to look in that direction. 
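The first step in deciding when to fire such a cue is knowing whether the target is already in view. A minimal sketch of that check, assuming you have the head pose from your tracking system: compare the angle between the head’s forward vector and the direction to the target against half the field of view. The names `isInFieldOfView` and `chooseCue` are illustrative, not from any VR SDK.

```typescript
// Hypothetical sketch: trigger an attention cue (haptics, spatial audio,
// an arrow) only when the target element is outside the viewer's
// field of view.

type Vec3 = { x: number; y: number; z: number };

function normalize(v: Vec3): Vec3 {
  const len = Math.hypot(v.x, v.y, v.z);
  return { x: v.x / len, y: v.y / len, z: v.z / len };
}

// True when the angle between the head's forward vector and the
// direction to the target is within half the field of view.
function isInFieldOfView(
  headPos: Vec3,
  headForward: Vec3,
  target: Vec3,
  fovDegrees = 90
): boolean {
  const toTarget = normalize({
    x: target.x - headPos.x,
    y: target.y - headPos.y,
    z: target.z - headPos.z,
  });
  const f = normalize(headForward);
  const cosAngle = f.x * toTarget.x + f.y * toTarget.y + f.z * toTarget.z;
  return cosAngle >= Math.cos((fovDegrees / 2) * (Math.PI / 180));
}

// If the element is out of view, cue the person toward it rather than
// moving the UI into their face.
function chooseCue(inView: boolean): string {
  return inView ? "none" : "spatial-audio+haptics";
}
```

Note the design choice: when the target is out of view, the response is to guide the person toward it with cues, not to teleport the element into their line of sight, which would undermine the spatial consistency discussed above.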

Examples to try


Job Simulator

This game uses skeuomorphic design elements throughout, including eating a burrito to exit the game instead of using a traditional UI menu.


Beat Saber

The scoreboard and status UI elements are placed within the virtual space within your line of sight, while still remaining out of the way of the main task you’re trying to accomplish. They don’t occlude (block) any important elements, such as the boxes you’re trying to hit or the walls you’re trying to dodge. Also, pay attention to the different treatments used when playing in 360 mode versus normal mode.

Learn more

If you enjoy these articles, consider supporting me on Patreon.

I’m an Immersive Tech UX Design Professional with over 22 years of experience designing for kiosks, websites, mobile apps and desktop software for many well-known and not-so-well-known companies.

I’m not speaking on behalf of or representing any company. These are my personal thoughts, experiences and opinions.
