Another decision to make before design and development begins is the type of reality the experience will be built in, since different UX rules apply to each. For example, a training simulation could be created in either Augmented or Virtual Reality.
In Augmented Reality, the user can walk around freely, so you would want to provide information that adds insight while ensuring the interface never hinders the user’s view of the real world. With Virtual Reality, on the other hand, the user is “transported” to a different world while remaining confined to a limited physical space, so locomotion has to be handled in other ways.
Back in 1994, Paul Milgram et al. created the Reality-Virtuality Continuum in order to illustrate the extension of reality from completely real to completely virtual.
This is a simplified illustration that doesn’t get into all of the nuances of the space, but it is a good start for helping you understand the main differences between the different types of reality.
This complete spectrum of realities has been coined Extended Reality, or XR. The X can also serve as a variable for whatever comes next. I’ll break these down starting with the Real Environment — or as the Hockey Hall of Fame refers to it — Real Reality.
Real Reality (RR)
This is the real, physical environment that you’re used to experiencing, but one that we now, interestingly (or maybe sadly), have to start distinguishing from the other levels of reality.
Augmented Reality (AR)
Augmented Reality overlays information onto the real, physical world. Example applications include wayfinding, where you overlay directions to a location onto the real world.
You can also look up additional information about something in the real world, such as seeing the 3D elevation of floor plans.
You could remotely work with others to show them how to do things, like with this app from Vuforia. It kind of works like FaceTime on iOS, but it lets you mark up the environment to give more clarity. A desktop version is also available for cross-platform remote collaboration.
Mobile AR will not be in scope for these best practices since the interactions are still performed on a mobile device’s 2D screen. You can refer to the HMI guidelines for those devices if you would like to learn more.
Don’t confuse AR Glasses with Smart Glasses
There are a few products on the market that are easily confused with Augmented Reality but more accurately fall into the category of Smart Glasses. Smart Glasses let you pull up reference information on a small 2D screen inside a head-mounted display, occluding part of the world rather than augmenting it. While they still have very useful applications, they do not overlay digital information onto the real world in 3D space, so they don’t truly fall within the Augmented Reality spectrum.
AR Glasses are still a work in progress
The Tilt Five AR glasses target the tabletop gaming fanbase, adding drama to traditional sessions with special effects and 3D world maps. You can also play remotely with friends, as long as they have a pair of glasses too.
Although the marketing has mainly targeted gaming so far, Tilt Five also provides an SDK (software development kit), which opens up use cases in other industries. They have successfully completed their Kickstarter campaign, the glasses are now in production, and pre-orders are being taken.
And Apple has long been working on an AR headset, as its patent filings and acquisitions suggest. Tech enthusiasts had been hoping for an announcement sometime within the next year, but given the current climate, there may be a delay.
Mixed Reality (MR)
On the Reality-Virtuality Continuum there are various levels of augmentation, ranging from the traditional information overlays of Augmented Reality to environments that intermix the real world with life-sized digital objects. This range of augmentation is known as Mixed Reality.
Mixed Reality can enhance visualization and collaboration, whether with people in the same room or around the world. An example of this is the Spatial app, which is in its early phases and, as of this writing, is free for individuals to try.
Augmented Virtuality (AV)
And as you move further across the Mixed Reality gradient, the balance between the real physical environment and the digital one begins to shift, eventually crossing into Augmented Virtuality, where your digital reality is augmented with elements of the real world.
An example of this would be the Leap Motion controller, which tracks real-world objects, such as your hands, and creates virtual representations of them so that you can use natural gestures in-world.
Oculus has also released the beta version of their hand tracking technology on the Quest, which works similarly to Leap Motion. However, instead of having to attach an external device, the Quest uses existing inside-out tracking. They recently released their design guidelines, which you can check out on their developer portal.
Currently there aren’t very many examples of Augmented Virtuality available, but as the technology advances you’ll start to see more arise.
Virtual Reality (VR)
And finally, Virtual Reality fully transports and immerses you into a completely digital world.
A great example of this is Unreal Engine’s VR Mode, which makes you feel like you’re actually there: you can walk around and interact with the characters and objects in that world.
Increasing levels of immersion in these virtual worlds are made possible through the creative use of haptic devices and carefully placed physical props, and that immersion will only become more realistic as time goes by.
If you enjoy these articles, consider supporting me on Patreon.
I’m an Immersive Tech UX Design Professional with over 22 years of experience designing for kiosks, websites, mobile apps and desktop software for many well-known and not-so-well-known companies.
I’m not speaking on behalf of or representing any company. These are my personal thoughts, experiences and opinions.