A Virtual Boost in VR Rendering Performance with Synchronous Space Warp

Wednesday 9/21/22 03:09pm | Posted by Jonathan Wicks


Co-written by Jonathan Wicks and Sam Holmes

Rendering in VR demands that hardware and applications maintain very high frame rates. A typical PCVR (PC-driven VR) setup comprises a PC connected to a head-mounted device (HMD) and a pair of hand-held controllers, all of which must operate in real time. This setup must contend with fluid controller and game movements, 6DoF animations, head movements, and two render passes (one per eye) at 90 to 120 FPS. Switch this setup to a wireless HMD, and the communications channel (e.g., Wi-Fi, 5G, etc.) must also be up to the task of real-time data transfer.

Last year we collaborated with Guy Godin, creator of Virtual Desktop, to enhance PCVR rendering performance. We added Space Warp functionality to our Adreno Motion Engine, which runs on all headsets powered by our Snapdragon XR2 and its Qualcomm Adreno GPU.

Space Warp produces missing frames just-in-time on the HMD with no PC overhead, thus reducing PC-to-HMD bandwidth and stress on the encoder. This doubles the available PC render time and the effective encoder bitrate for PCVR-to-HMD streaming.

Let’s take a closer look at how this works.

XR Frame Extrapolation
High-performance rendering happens when the application's render rate matches the display's hardware refresh rate. One method to achieve this is Asynchronous Time Warp (ATW), which corrects for subtle differences in head rotation between the render pose and the current pose. It also ensures that head movements are reflected on the display with low latency (i.e., low motion-to-photon latency), which is important for preventing cybersickness. However, traditional ATW can only correct for head rotations; it cannot account for game animations, controller motion, or 6DoF movements (e.g., translation/walking in the scene). When the render rate doesn't match the display rate, frame judder can manifest as double images, ghosting, or blurring (see below).

Example of judder in VR when the application render rate doesn’t match the display rate.
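
To make the rotation-only limitation concrete, here is a minimal sketch of the math at the heart of Time Warp. This is general background rather than Adreno Motion Engine code, and the vector/matrix types and names are illustrative: each ray of the display frame is rotated back into the pose the frame was rendered with, and the rendered image is resampled there. Translation and in-scene animation never enter this equation, which is exactly the gap Space Warp fills.

    #include <array>

    // Minimal 3x3 rotation applied to a direction vector (illustrative only).
    using Vec3 = std::array<float, 3>;
    using Mat3 = std::array<Vec3, 3>;

    Vec3 mul(const Mat3& m, const Vec3& v) {
        return { m[0][0]*v[0] + m[0][1]*v[1] + m[0][2]*v[2],
                 m[1][0]*v[0] + m[1][1]*v[1] + m[1][2]*v[2],
                 m[2][0]*v[0] + m[2][1]*v[1] + m[2][2]*v[2] };
    }

    // For each display-frame ray: rotate it into world space using the head
    // pose at scan-out time, then into the space the frame was rendered in,
    // and sample the rendered image along the resulting direction.
    Vec3 atwSampleDirection(const Mat3& renderFromWorld,   // inverse render-pose rotation
                            const Mat3& worldFromDisplay,  // head rotation at display time
                            const Vec3& displayRay) {
        return mul(renderFromWorld, mul(worldFromDisplay, displayRay));
    }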

Overcoming Judder with Space Warp
Judder can be addressed using Space Warp, a technique that fills in missing frames. Several variations of Space Warp exist; the one discussed here is Synchronous Space Warp (SSW).

These techniques produce a new frame representing what the missing frame would have looked like had it been rendered, rather than re-displaying a duplicate frame. This is similar to video frame interpolation, which produces an intermediate frame between a past frame and a future frame; in VR, however, no future frame exists yet, so Space Warp must extrapolate one from only the previously rendered frames.

Frame Extrapolation
When the display rate of the XR device is 90 FPS, less than 11 ms remains for application rendering to complete. At 120 FPS, the application rendering must complete in less than 8 ms. See below for the ideal scenario, where the rendering and display rates match:

PC frames available at HMD refresh rate.
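
The budget arithmetic is simple but unforgiving. The small sketch below just works through the numbers quoted above (frame period = 1000 ms divided by the display rate; in practice the application must finish a bit sooner to leave time for composition):

    #include <cstdio>

    // Frame budget = 1000 ms / display rate.
    constexpr double frameBudgetMs(double displayFps) { return 1000.0 / displayFps; }

    int main() {
        std::printf("90 FPS  -> %.1f ms per frame\n", frameBudgetMs(90));   // ~11.1 ms
        std::printf("120 FPS -> %.1f ms per frame\n", frameBudgetMs(120));  // ~8.3 ms
        // Rendering at half rate while Space Warp fills the gaps doubles the budget:
        std::printf("60 FPS  -> %.1f ms per frame\n", frameBudgetMs(60));   // ~16.7 ms
        return 0;
    }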

In a wireless PCVR split-rendering setup, the demands of the game may overwhelm the capabilities of the PC, causing the render and display rates to become mismatched. In this scenario, Space Warp can enable someone to play a game at a higher refresh rate than their PC could otherwise sustain.

For example, at a VR HMD display rate of 120 FPS, Space Warp lets the game render at 60 FPS, doubling the time the PC has available to complete each frame. Space Warp on the HMD then fills in every other frame by extrapolation, smoothing out the judder and providing a perceived 120 FPS. Rendering at half rate also reduces pressure on the video encoder and the network: the PC sends half as many (but larger) frames, giving the HMD twice the effective per-frame bitrate for the same bandwidth. This extrapolation process is shown below.

Extrapolated frame produced when the PC cannot meet the HMD refresh rate.
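
As a toy illustration of that cadence (the real scheduling is driven by the compositor, so treat this purely as a diagram in code), the sketch below marks which vsyncs of a 120 Hz display carry decoded PC frames and which carry SSW-extrapolated frames when the PC renders at 60 FPS:

    #include <cstdio>

    int main() {
        const int displayFps = 120, renderFps = 60;
        const int ratio = displayFps / renderFps;  // 2 -> every other frame is real
        for (int vsync = 0; vsync < 8; ++vsync) {
            if (vsync % ratio == 0)
                std::printf("vsync %d: decoded PC frame %d\n", vsync, vsync / ratio);
            else
                std::printf("vsync %d: SSW-extrapolated frame\n", vsync);
        }
        // Bandwidth view: half as many encoded frames over the same link means
        // each frame can carry roughly twice the bits at the same total bitrate.
        return 0;
    }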

Enabling Synchronous Space Warp in Virtual Desktop
Enabling Synchronous Space Warp in Virtual Desktop required the creation of a new interface in the Adreno Motion Engine that takes two color input frames and their render poses, and produces a new extrapolated frame for the target pose. The results are shown below.

Left: Rendering without SSW; Right: Smoother rendering thanks to SSW via Adreno Motion Engine
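
The interface itself isn't published in this post, so the declaration below is a hypothetical sketch of its shape based on the description above; every type and function name is an assumption, not the actual Adreno Motion Engine API:

    // Hypothetical stand-ins for the real Adreno Motion Engine handles.
    struct ColorFrame { /* GPU texture handle, resolution, format, ... */ };
    struct Pose       { float orientation[4]; float position[3]; };

    // Assumed shape of the interface described above: the two most recent
    // rendered frames plus their render poses in, one frame extrapolated
    // to the target pose out (declaration only, shown for shape).
    ColorFrame extrapolateFrame(const ColorFrame& previousFrame, const Pose& previousPose,
                                const ColorFrame& latestFrame,   const Pose& latestPose,
                                const Pose& targetPose);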

Below is an overview of how this works for PCVR-to-HMD rendering (a simplified sketch of the client loop follows the steps):

Overview PCVR to HMD rendering. Left: PCVR system. Right: VR HMD.

  1. The Virtual Desktop client application installed on the HMD reads the pose and sends it to the PC to produce the frame.
  2. The PC renders the requested frame which is then encoded and sent over Wi-Fi to the HMD.
  3. The HMD client application decodes the frame and submits it to the OpenXR runtime which composites it on the display for the user.

Note: When the HMD runs at 120 FPS, the PC has less than 8ms to render the application frame.

All these steps must be performed with the lowest possible latency to minimize the motion-to-photon latency.
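
Pulling the three steps together, here is a high-level sketch of one iteration of the client loop. Every type and function below is a placeholder for illustration, not the Virtual Desktop or OpenXR API:

    #include <optional>

    struct Pose { float orientation[4]{}; float position[3]{}; };
    struct EncodedFrame {};
    struct DecodedFrame {};

    Pose readHmdPose() { return {}; }                        // step 1: sample tracking
    void sendPoseToPc(const Pose&) {}                        // step 1: request a frame
    std::optional<EncodedFrame> receiveFrame() { return EncodedFrame{}; }  // step 2: over Wi-Fi
    DecodedFrame decode(const EncodedFrame&) { return {}; }  // step 3: hardware decoder
    void submitToOpenXr(const DecodedFrame&) {}              // step 3: composite and display

    int main() {
        // One per-vsync iteration of the loop described above.
        sendPoseToPc(readHmdPose());
        if (auto encoded = receiveFrame()) {  // a real client can miss this deadline
            submitToOpenXr(decode(*encoded));
        }
        // When no frame arrives in time, SSW steps in -- see below.
        return 0;
    }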

SSW prevents judder in Virtual Desktop as follows (a sketch of this fallback path follows the steps):

  1. When Virtual Desktop detects that a new frame wasn’t received, it provides the last two real rendered frames to the Adreno Motion Engine along with their respective poses and the target output pose.
  2. The Adreno Motion Engine uses the last two frames, the render poses, and that target pose to produce an extrapolated frame.
  3. The Virtual Desktop client application passes this frame to OpenXR to avoid judder. Here, the PC only has to render at half rate, since the Adreno Motion Engine produces every other frame. However, even though the PC can now run at half rate, the Virtual Desktop client application and the Adreno Motion Engine must still complete the entire extrapolation process within the original target frame time. At 120 FPS, the Adreno Motion Engine must produce two new frames, one for each eye, all in under 7 ms, while also leaving room for Time Warp to run.
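
A sketch of that fallback decision, reusing the hypothetical extrapolateFrame shape from earlier (names and types remain assumptions for illustration):

    struct ColorFrame {};
    struct Pose { float orientation[4]{}; float position[3]{}; };

    ColorFrame extrapolateFrame(const ColorFrame&, const Pose&,
                                const ColorFrame&, const Pose&, const Pose&);

    struct FrameHistory {          // the last two real frames and their render poses
        ColorFrame frames[2];
        Pose poses[2];
    };

    ColorFrame frameForThisVsync(bool newFrameArrived, const ColorFrame& newFrame,
                                 const FrameHistory& history, const Pose& targetPose) {
        if (newFrameArrived)
            return newFrame;  // a real PC frame made it in time: just use it
        // Missed frame: ask the Adreno Motion Engine for an extrapolated one.
        // At 120 FPS this must finish for both eyes in under 7 ms, leaving
        // headroom for Time Warp before scan-out.
        return extrapolateFrame(history.frames[0], history.poses[0],
                                history.frames[1], history.poses[1], targetPose);
    }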

SSW on the Adreno Motion Engine
The Adreno Motion Engine performs five operations to support SSW (a simplified skeleton of this pipeline follows the list):

Adreno Motion Engine SSW operations performed just-in-time, per eye, at HMD refresh rate.

  1. Reprojection (shown in the first block of the Snapdragon component above): Two frames are rendered with different head poses on the PC. To accurately determine the application motion, the head pose deltas are removed and the field of view of the rendered frames is considered. Adreno Motion Engine reprojects the two input frames to the target output pose, similar to Time Warp.
  2. Scene Change Detection: Detects when the input frames are disjoint, such that motion estimation would find no correlation between them and produce a poor extrapolated frame. This can occur in scenarios such as snap turns, where the current and previous frames may have totally different content. The Scene Change Detection block also detects fade transitions as well as menus and UIs, which are scenes that shouldn't be extrapolated.
  3. Sub-Pixel Motion Estimation: Once the two input frames are in the same target head orientation, motion estimation is performed to identify the animation deltas between the frames. The motion vectors are returned as 16-bit float values in X and Y. The sub-pixel resolution helps provide smooth transitions across gradient surfaces.
  4. Filtering and Smoothing: Before the raw motion vectors are used for extrapolation, they pass through this block which performs spatial filtering and smoothing. This helps to reduce outliers and provide smoothness to the field. Care is taken to avoid losing edge details, specifically around hands and controllers.
  5. Extrapolation: The last rendered frame and the generated motion vectors are used to extrapolate a new frame a half step into the future. This block also re-applies the target head-pose warp that was removed when generating the motion vectors.
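
Here is the promised skeleton of those five stages composed into one function. Every name below is a placeholder mirroring the list, not the Adreno Motion Engine API:

    struct Frame {};
    struct Pose { float orientation[4]{}; float position[3]{}; };
    struct MotionField { /* per-block motion vectors, fp16 X/Y */ };

    Frame reproject(const Frame&, const Pose& renderPose, const Pose& targetPose);
    bool  sceneChanged(const Frame& prev, const Frame& curr);  // snap turns, fades, UI
    MotionField estimateMotion(const Frame& prev, const Frame& curr);  // sub-pixel deltas
    MotionField filterAndSmooth(const MotionField&);  // remove outliers, preserve edges
    Frame extrapolate(const Frame& last, const MotionField&, const Pose& targetPose);

    Frame ssw(const Frame& prev, const Pose& prevPose,
              const Frame& curr, const Pose& currPose, const Pose& targetPose) {
        // 1. Remove the head-pose delta so only in-scene motion remains.
        Frame a = reproject(prev, prevPose, targetPose);
        Frame b = reproject(curr, currPose, targetPose);
        // 2. Don't extrapolate across disjoint content (snap turns, fades, menus).
        if (sceneChanged(a, b))
            return b;  // fall back to a plain time-warped copy of the last real frame
        // 3-4. Estimate sub-pixel motion, then spatially filter and smooth it.
        MotionField motion = filterAndSmooth(estimateMotion(a, b));
        // 5. Push the last frame a half step into the future and re-apply the
        //    target head-pose warp that was removed in step 1.
        return extrapolate(curr, motion, targetPose);
    }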

Depending on the exposed OpenXR HMD capabilities, the final extrapolation step can be optionally offloaded to the OpenXR compositor, further freeing up the HMD resources.

Conclusion
A lot needs to happen in a short period to hit the high frame rates necessary for a great PCVR-to-HMD experience. Ideally, the PC and Wi-Fi can keep up with these demands; when they cannot, techniques like SSW are there to fill the gap.

Adreno Motion Engine’s low latency, just-in-time frame extrapolation allows Virtual Desktop with Synchronous Space Warp to reduce video encoder and network stress, while doubling the available PC render time per frame. Since the entire workload occurs on an XR2-powered HMD, no additional PC or wireless headroom is required. This lowers the minimum PC spec required for Virtual Desktop end-users to enjoy their favorite VR game, judder-free, at the highest available framerates.

Interested in learning more about the Snapdragon XR2? Be sure to check out the Snapdragon XR2 HMD Reference Design.

For additional reading, check out our VR blogs and our Gaming Graphics blogs on QDN.


Snapdragon and Qualcomm Adreno are products of Qualcomm Technologies, Inc. and/or its subsidiaries.