Qualcomm products mentioned within this post are offered by
Qualcomm Technologies, Inc. and/or its subsidiaries.
Co-written by Jonathan Wicks and Sam Holmes
Virtual reality (VR) headsets immerse users by bringing the screen directly to their eyes as two separate displays, one for each eye.
Developers can use foveated rendering to optimize rendering performance and meet the high-resolution and framerate demands of VR. Foveated rendering is a technique that reduces pixel shading cost where high detail is not required, as illustrated in Figure 1. This can reduce the time and/or power required to render each frame.
Figure 1 - Conceptual diagram of a VR view using foveated rendering, where regions are rendered at different resolutions.
While foveated rendering is traditionally discussed in the context of eye tracking, it does not require it. The high field of view (FOV) and planar projections used in VR cause over-rendering in the periphery in order to achieve clarity in the center (see Figure 2). Additionally, the barrel distortion applied by Asynchronous Time Warp to counter lens distortion causes undersampling in the periphery. Fixed foveated rendering (FFR) is used in today's standalone VR devices to minimize this over-rendering.
Figure 2 – Left: Visualization of over-rendering in the periphery versus under rendering in the center of the field of view. Right: Visualization of the barrel distortion effect.
The increased performance from foveated rendering gives developers the flexibility to raise the overall resolution or framerate, and/or change shading complexity, providing better visual quality and a better user experience.
Support for Foveation
The Qualcomm® Adreno™ GPU found on Snapdragon® mobile platforms has supported foveated rendering since the release of our Snapdragon 821 mobile platform, first used in the Oculus Go. This functionality has evolved along with subsequent Snapdragon mobile platform releases including the Snapdragon 835 mobile platform used in the Oculus Quest, and the latest Snapdragon XR2 5G Platform (XR2), that powers the Oculus Quest 2.
We’ve made foveated rendering on Adreno available in OpenGL through the QCOM_texture_foveated extension, and in Vulkan through the VK_EXT_fragment_density_map extension. Our QCOM_texture_foveated2 extension, which builds on QCOM_texture_foveated, adds the ability to discard regions of the screen.
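As a rough sketch of what the OpenGL path looks like, the configuration below opts a texture in to foveation using QCOM_texture_foveated. It assumes a GLES context where the extension is present; the focal-point parameter values are examples only:

```c
/* Configuration sketch, assuming a GLES 3.x context with
 * QCOM_texture_foveated available. Gain and fovea values are examples. */
#include <GLES3/gl32.h>
#include <GLES2/gl2ext.h>   /* QCOM foveation tokens and PFN typedef */
#include <EGL/egl.h>

void enable_foveation(GLuint eyeTexture)
{
    PFNGLTEXTUREFOVEATIONPARAMETERSQCOMPROC glTextureFoveationParametersQCOM =
        (PFNGLTEXTUREFOVEATIONPARAMETERSQCOMPROC)
            eglGetProcAddress("glTextureFoveationParametersQCOM");

    glBindTexture(GL_TEXTURE_2D, eyeTexture);

    /* Opt the texture in to foveated rendering. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_FOVEATED_FEATURE_BITS_QCOM,
                    GL_FOVEATION_ENABLE_BIT_QCOM |
                    GL_FOVEATION_SCALED_BIN_METHOD_BIT_QCOM);

    /* One focal point at the center of the texture; the gains control how
     * quickly pixel density falls off toward the edges (example values). */
    glTextureFoveationParametersQCOM(eyeTexture,
                                     0,            /* layer */
                                     0,            /* focal point index */
                                     0.0f, 0.0f,   /* focalX, focalY (NDC) */
                                     4.0f, 4.0f,   /* gainX, gainY */
                                     0.0f);        /* foveaArea */
}
```

In Vulkan, the equivalent control is expressed as a small fragment-density-map image attached to the render pass via VK_EXT_fragment_density_map, rather than per-texture parameters.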
The key functionality that makes efficient, high-performing foveated rendering possible is the Adreno GPU’s tile-based rendering approach. Tile-based rendering divides the frame into tiles, each of which is rendered in sequence to high-speed memory on the GPU. Once a tile is completed, its final results are sent to system memory, and the GPU renders the next tile. This way, only the final frame contents need to be copied to system memory; the read and rewrite operations performed by the application to construct the frame use the local high-speed memory. Below, we discuss the evolution of tile-based foveated rendering on Adreno.
Standard Tile-based Foveated Rendering
Prior to our Snapdragon XR2 platform, standard tile-based foveated rendering was performed by rendering to a reduced resolution in high-speed GPU memory and then upscaling the results when each tile is copied to system memory, as illustrated in Figure 3:
Figure 3 - Standard tile-based foveated rendering where the frame is divided into tiles, and each tile is rendered in sequence to high-speed memory on the GPU. The Final Frame shows what is rendered to the user.
Enhanced Tile-based Foveated Rendering
On the Snapdragon XR2 platform, we have improved foveated rendering through an enhanced tile-based foveated rendering process, resulting in higher quality, lower bandwidth usage, and higher performance. This improvement was achieved by storing only the low-resolution tile data in system memory and recreating the upscaled data only at sampling time as illustrated in Figure 4:
Figure 4 – Enhanced tile-based foveated rendering where the original low-resolution tile data is stored in system memory.
This subsampled layout reduces memory bandwidth and allows for selective upscaling, resulting in better visual quality. The developer just needs to enable the QCOM_texture_foveated_subsampled_layout extension in OpenGL or the subsampled capability in the VK_EXT_fragment_density_map extension in Vulkan.
Let’s review what these enhancements mean for foveated rendering:
Upscaling the low-resolution data only during sampling improves the quality of bilinear filtering, as neighboring pixels in system memory now contain true neighbor data since the rendering has not been previously upscaled. Furthermore, since the foveated textures are typically sampled by Asynchronous Time Warp, which also performs barrel distortion, upscaling only occurs exactly where it is needed for the display.
As resolutions continue to increase, bandwidth savings are becoming more and more important. This is especially true in standalone form factors where power efficiency is critical. The Snapdragon XR2 foveation enhancements address this by writing out and reading in only the required data to and from system memory. Storing the low-resolution tile data in memory reduces the write bandwidth during rendering and also reduces the read bandwidth when the texture is sampled. In XR content, we typically see write bandwidth savings of around 30% to 40%.
At its core, foveated rendering is a performance-enhancing feature, and the Snapdragon XR2 substantially improves performance compared to the Snapdragon 835. In XR content, we typically see a 2-3x performance improvement; depending on the level of foveation, this can grow to as high as 4x. This enables VR application developers to achieve higher framerates and resolutions with Snapdragon XR2.
As eye tracking technology becomes more prevalent, further gains over FFR can be achieved by moving the focal point of the foveation to follow the fovea, where our vision has the highest resolution, and shading less in the peripheral areas where we perceive less detail.
Foveated Rendering for Engine Developers
Nowadays, many VR developers use Unity and Unreal Engine to build immersive VR environments, and Oculus provides SDKs that perform FFR for both engines. These SDKs include APIs that fill in the foveation parameters so that developers can focus on solving high-level VR problems. For example, the OVRManager class for Unity allows developers to set FFR to several pre-defined levels of detail. For more information, check out Oculus’ Unity FFR and Unreal FFR documentation.
Developers interested in learning more about the Snapdragon XR2 platform used in the Oculus Quest 2 should check out the Snapdragon XR2 HMD Reference Design. Also, be sure to check out our How To Guide: Developing for Immersive Realities for additional tips on developing VR applications.
Snapdragon and Qualcomm Adreno are products of Qualcomm Technologies, Inc. and/or its subsidiaries.