How To Guide: Developing for Immersive Realities

Tuesday 10/9/18 09:00am | Posted By Todd LeMoine

Snapdragon and Qualcomm branded products are products of
Qualcomm Technologies, Inc. and/or its subsidiaries.

In our recent blog XR1 Platform Makes it Real for Developers, we talked about high-quality immersive augmented reality (AR) and virtual reality (VR) use cases for the Snapdragon® XR1 Platform. Of course, developing for any “reality” takes more than just a great platform. For developers experienced in making traditional on-screen 3D environments where movements are controlled by peripherals (e.g., gamepads), XR, and VR in particular, presents unique challenges around performance, user experience (UX), thermal and battery life management, and debugging. In this blog we’ll explore these challenges to help you migrate to the world of immersive virtual reality development.

Rendering and Performance

If rendering a virtual world on a VR device sounds daunting, you’ll be relieved to know that most of your existing 3D rendering knowledge can be put to good use. Geometry and object culling, simplification of meshes, and avoidance of overdraw through mechanisms like depth and stencil buffers, are just some of the basic 3D rendering techniques that you can apply to VR.
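To make the carry-over concrete, here is a minimal sketch of object culling, one of the techniques mentioned above. It tests object positions against a simple view cone rather than a full frustum with bounding volumes, and all names and parameters are illustrative:

```python
import math

def cull_objects(objects, cam_pos, cam_dir, fov_deg, max_dist):
    """Return the names of objects inside a simple view cone.

    objects: list of (name, (x, y, z)) tuples -- a coarse stand-in
    for real bounding-volume tests against the camera frustum.
    cam_dir is assumed to be a unit vector.
    """
    half_fov = math.radians(fov_deg) / 2.0
    visible = []
    for name, pos in objects:
        to_obj = tuple(p - c for p, c in zip(pos, cam_pos))
        dist = math.sqrt(sum(d * d for d in to_obj))
        if dist == 0 or dist > max_dist:
            continue  # at the camera, or beyond the draw distance
        # Angle between the view direction and the direction to the object
        cos_angle = sum(d * v for d, v in zip(to_obj, cam_dir)) / dist
        if math.acos(max(-1.0, min(1.0, cos_angle))) <= half_fov:
            visible.append(name)
    return visible
```

The same test runs once per eye in a stereo renderer, which is why cheap early rejection like this matters even more in VR than on a flat screen.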

The main differentiator between VR and traditional 3D rendering is the device itself. A VR headset, for example, has more responsibilities in a single package than a traditional flat screen. It must detect the user's movement (and often track their eyes), render the scene, and keep the two coordinated to provide a realistic virtual view that mirrors the level of detail and responsiveness a user would experience in the real world.

A VR device essentially creates a viewport around the user's eyes: a relatively small view that needs the right level of detail in just the right places.

VR viewport resolutions

This is best accomplished by rendering varying levels of detail to different regions of the view, a process called “foveated rendering”. Foveated rendering can be static, where the levels of detail for each area of the foveation remain fixed, or dynamic, where the level of detail changes (e.g., due to changes detected through eye tracking). See the Qualcomm foveated extension in the Khronos registry.

On some systems this optimization can produce up to a 20% boost in rendering performance, and for some APIs, this may only require tweaks to a few lines of code.
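The idea can be sketched as a lookup from eccentricity (distance from the gaze point) to shading density. The ring radii and density values below are illustrative, not values from any specific headset or the Khronos extension:

```python
def foveation_level(x, y, focus=(0.0, 0.0), inner=0.25, outer=0.6):
    """Pick a shading density for a point in the normalized view
    (-1..1 on each axis): full detail near the gaze point, coarser
    toward the periphery.

    Static foveation keeps `focus` fixed at the lens center; dynamic
    foveation feeds in the eye-tracked gaze point instead.
    """
    dx, dy = x - focus[0], y - focus[1]
    r = (dx * dx + dy * dy) ** 0.5
    if r <= inner:
        return 1.0   # full resolution at the fovea
    if r <= outer:
        return 0.5   # half density in the mid ring
    return 0.25      # quarter density in the periphery
```

In a real renderer the equivalent decision is made per tile or per fragment by the driver once the foveation parameters are configured, rather than in application code like this.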

If you’re just getting started with VR on Snapdragon® mobile platforms, check out our Snapdragon VR SDK.

UX Design

High-quality immersive VR experiences can be intense, both for the user and the system. Generally, you should aim for “bite-sized” experiences, with some developers citing five minutes as the ideal duration. Demanding scenarios are often best in small bursts, while less demanding scenarios allow the platform to thermally recover and users to relax a bit. To accommodate this, your game/app designers should plan content, storylines, and virtual environments around experiences of shorter duration.

Another major consideration is the mapping of large virtual spaces onto the available physical space. Unlike gamepad-driven titles, where movement control is confined to a handheld device, movement in VR involves the user’s locomotion within a physical space that is likely smaller than the virtual space. Movement is captured through motion sensors and tracking systems, and platforms like our Snapdragon 835 VR Dev Kit can handle 6DoF movement covering both translations and rotations. With such freedom, it’s recommended that you warn users about physical space constraints up front in your game or app.

There are also many options to handle the mapping, but a simple approach is to calculate the physical space available and then fade to black as the user approaches boundaries during navigation. You can also display on-screen elements (e.g., messages) indicating that the user has reached the boundary.
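A minimal sketch of the fade-to-black approach, assuming a rectangular play area centered at the origin (the extents and fade margin are illustrative):

```python
def boundary_fade(pos, half_extent, margin=0.5):
    """Fade factor for the view: 1.0 fully visible, 0.0 fully black.

    pos: (x, z) user position on the floor plane, in meters.
    half_extent: (hx, hz) half-sizes of the physical play area.
    margin: distance from the boundary over which the fade ramps in.
    """
    hx, hz = half_extent
    x, z = pos
    # Distance to the nearest boundary along either axis
    dist = min(hx - abs(x), hz - abs(z))
    if dist <= 0:
        return 0.0  # at or past the boundary: fully black
    return min(1.0, dist / margin)
```

The returned factor would typically be applied as a full-screen dim each frame, optionally alongside an on-screen message once it drops below some threshold.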

The final consideration involves motion sickness, which users may experience when the rendered view isn’t synchronized with their movements. Rendering performance is therefore essential in VR development to ensure the comfort of your users, and it’s recommended that you aim for rendering performance of 90 FPS or higher. Developing around the “bite-sized” experiences mentioned above can also help here.
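That 90 FPS target translates to a per-frame budget of roughly 11.1 ms, which is a useful number to check against during profiling. A small sketch (the report fields are illustrative):

```python
def frame_budget_report(frame_times_ms, target_fps=90):
    """Summarize a list of measured frame times against the budget
    implied by a target frame rate (~11.1 ms per frame at 90 FPS)."""
    budget = 1000.0 / target_fps
    missed = [t for t in frame_times_ms if t > budget]
    return {
        "budget_ms": round(budget, 2),
        "missed": len(missed),          # frames over budget
        "worst_ms": max(frame_times_ms),
    }
```

Even a few missed frames per second can be perceptible in a headset, so "missed" counts are often a more useful signal than the average frame time.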

Thermal and Battery

Battery consumption and the heat generated from rendering and compute operations are issues for all mobile and wearable devices, especially for a VR headset worn over the user’s face and eyes. It’s therefore critical that you monitor these aspects throughout the production cycle of your game/app. You can use the Snapdragon Profiler to analyze thermal data and other resources, such as GPU and CPU clock frequencies, which can be used to estimate power curves during program execution.
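One way to act on that monitoring is to scale rendering quality down as the device heats up, so the app backs off before the system throttles it. A sketch of such a heuristic; the temperature thresholds are illustrative assumptions, not platform specifications:

```python
def thermal_quality_scale(temp_c, safe_c=40.0, limit_c=48.0):
    """Map a measured device temperature to a rendering quality scale:
    1.0 (full quality) at or below safe_c, ramping linearly down to
    0.5 at or above limit_c."""
    if temp_c <= safe_c:
        return 1.0
    if temp_c >= limit_c:
        return 0.5
    # Linear ramp between the safe and limit temperatures
    t = (temp_c - safe_c) / (limit_c - safe_c)
    return 1.0 - 0.5 * t
```

The scale could drive resolution, foveation aggressiveness, or effect quality; the key point is reacting gradually rather than waiting for a hard thermal limit.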

Debugging

Debugging a game/app on a device such as a VR headset may prevent you from viewing your development PC and accessing your keyboard to invoke debugging commands. Overcoming this will require some creative planning early in your project, so here are a few ideas to get you started:

  • Add debugging information such as heads-up displays into the viewport. Note, however, that “screen space” is limited in a VR headset, so dense overlays can be hard to read; simple metrics like FPS counters and temperature should display fine.
  • Use the real-time capture functionality of the Snapdragon Profiler to record metrics while you’re immersed in the game/app for later analysis.
  • Check with the third party/OEM manufacturer of your device to see what facilities they provide for debugging.
  • Build in alternative locomotion mechanics such as a game pad for use during development.
  • Build an additional viewport of the game/app that can be viewed on your development PC, that approximates the headset’s view. You can then use your alternative game mechanics and/or have another developer use the headset while you focus on debugging.
  • Add in “cheats” such as teleportation to another virtual space that allow you to quickly navigate to different parts of the environment during development.
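The first idea above, an in-viewport HUD, can be as simple as composing a few short, high-contrast text lines each frame. The metric names and the 90 FPS warning threshold below are illustrative; real values would come from your engine and the device's sensors:

```python
def debug_hud_lines(fps, gpu_temp_c, batt_pct):
    """Build the text lines a debug overlay might draw into the
    headset viewport: frame rate, GPU temperature, battery level,
    plus a warning when the frame rate drops below target."""
    lines = [
        f"FPS: {fps:5.1f}",
        f"GPU: {gpu_temp_c:4.1f} C",
        f"BAT: {batt_pct:3d}%",
    ]
    if fps < 90:
        lines.append("WARN: below 90 FPS target")
    return lines
```

Keeping the overlay to a handful of fixed-width lines keeps it legible at headset resolutions and cheap to render.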

Developing for a New Reality

QTI has many technologies incorporated into several popular VR headsets that are engineered to provide high-quality immersive VR experiences. Although developing a virtual reality game/app comes with its own unique challenges, developers can start with tools like the Qualcomm Technologies’ VR SDK and Snapdragon Profiler, apply their previous knowledge of 3D environments along with a bit of creativity, and start developing for a new reality.