To avoid eye strain and headaches, pay attention to the placement of text elements in virtual 3D space. Font size, depth, contrast, spacing, density, lighting, and many other factors affect the legibility of text and UI elements.
TL;DR Do’s and Don’ts
- Keep blocks of text short and simple.
- Follow real-world poster and signage print examples when determining text sizes, weights and spacing.
- Don’t place big chunks of text (large amounts of copy) in the environment.
- Don’t put large blocks of text (multiple paragraphs) closer than 1.5 meters in front of the viewer.
- Try to keep text at an optimal viewing distance of 2–3 meters from the viewer.
- 1.2 meters is the minimum recommended distance for text content, but text that close can still cause eye strain.
- Explore options for using foveated rendering to work around the limitations of the fixed-focus displays currently used in XR headsets.
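As a rough sanity check for the distance guidelines above, you can compute how tall text needs to be in world space to subtend a given visual angle at a given distance. The specific 1-degree comfort target below is an illustrative assumption, not a figure from this article:

```python
import math

def text_height_m(distance_m: float, visual_angle_deg: float) -> float:
    """World-space height needed for text at distance_m (meters)
    to subtend visual_angle_deg of the viewer's visual field."""
    return 2.0 * distance_m * math.tan(math.radians(visual_angle_deg) / 2.0)

# Hypothetical comfort target: body text around 1 degree of visual angle.
for d in (1.2, 2.0, 3.0):
    h_cm = text_height_m(d, 1.0) * 100
    print(f"At {d} m, 1 degree of arc is about {h_cm:.1f} cm tall")
```

The takeaway: the same apparent text size requires physically larger glyphs as content moves farther away, which is why pushing text back to 2–3 meters only works if the font scales up with it.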
This will change as the technology advances
In the human eye, the lens changes shape to focus on objects at different distances while the rest of the scene blurs. This helps your eyes lock onto the targeted object while reducing eye strain.
As you age, the lens hardens, making it harder to quickly focus on objects that are very near or very far. When that happens, reading something held too close strains your eyes. That's why, as people get older, they start holding reading material farther away to read it comfortably.
The current lenses in XR headsets are fixed focus, somewhat like the aged eye. But instead of merely taking longer to shift focus, they can't shift focus at all. This causes eye strain, because your eyes work harder trying to refocus on objects in digital space.
A current workaround is to use foveated rendering driven by gaze targeting or eye tracking: digitally focus the object the viewer is looking at and artificially blur the world around it. The problem is rendering latency, which can itself cause eye strain when the digital render can't keep up with the eye's natural speed of focus.
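The gaze-contingent idea can be sketched in a few lines. This is a minimal illustration, assuming a normalized gaze direction is already available from eye tracking; the function names, foveal radius, and blur mapping are all hypothetical, not any headset SDK's API:

```python
import math

def eccentricity_deg(gaze_dir, object_dir):
    """Angle in degrees between the gaze ray and the direction to an
    object, both given as normalized 3D vectors."""
    dot = sum(g * o for g, o in zip(gaze_dir, object_dir))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding error
    return math.degrees(math.acos(dot))

def blur_radius(ecc_deg, foveal_deg=5.0, max_blur_px=8.0):
    """Illustrative mapping: keep content sharp inside an assumed
    5-degree foveal region, ramp blur up linearly outside it."""
    if ecc_deg <= foveal_deg:
        return 0.0
    return min(max_blur_px, (ecc_deg - foveal_deg) * 0.5)
```

For example, an object the viewer is looking directly at gets zero blur, while one 20+ degrees off-gaze gets the maximum blur. The latency concern above is exactly this loop: gaze sample, eccentricity, blur must complete within a frame, or the blur lags behind the eye.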
Tech companies, such as CREAL, are working on ways to simulate real-world depth of field in real time by tracking your eyes and digitally focusing on that area or object much like the human eye would naturally behave.
This would reduce the risk of eye strain and loosen the current recommendations for comfortable viewing distances. Digital vision correction would also make XR eyewear more scalable and affordable by reducing the need for custom prescription lenses in each headset. The technology is still in development and not yet on the market.
The text on the touch panel menu is close to you, but since it's navigation labeling, it's short and concise and doesn't require extended periods of reading. It also uses a larger font with a heavier weight to further reduce eye strain.
At the time of writing, this sim is still in the early stages of release and has a lot of potential to become an amazing experience. You can see the roadmap on their website. That said, there is room for improvement in eye comfort.
The font is small, which means you need to move closer to read it. It also uses a thin weight, which produces line vibration at the lower resolutions of current headsets. On top of that, large quantities of text are crammed into a small area with poorly defined margins and line spacing, making the content harder to scan.
This would be better if the standard human interface guidelines for iOS and Android smartphones and tablets were combined with foundational graphic design principles regarding bodies of copy, line weights, margins, and so forth. Also keep in mind the VR guidelines for minimum font sizes and smaller blocks of text content per "screen" for easier legibility.
- The VR Book, Chapter 8, Perceptual Modalities by Jason Jerald, PhD
- Vision by Oculus
- Oculus Rift 2: the VR Headset that never was by Andrew London, Michelle Fitzsimmons, Henry St Leger
- Hands-on: CREAL is Shrinking Its Light-field Display for AR & VR Headsets by Ben Lang
- Latency Requirements for Foveated Rendering in Virtual Reality by Rachel Albert et al.
- Towards foveated rendering for gaze-tracked virtual reality by Anjul Patney et al.
- Foveated Depth-of-Field Filtering in Head-Mounted Displays by Martin Weier et al.
If you enjoy these articles, consider supporting me on Patreon.
I’m an Immersive Tech UX Design Professional with over 22 years of experience designing for kiosks, websites, mobile apps and desktop software for many well-known and not-so-well-known companies.
I’m not speaking on behalf of or representing any company. These are my personal thoughts, experiences and opinions.