Qualcomm True HDR

Introduction

HDR displays have long been available in the PC and TV space. On the mobile side, OLED screens that appeared in 2018 began to support a higher dynamic range and a wider color gamut. To make better use of the screen’s wide color gamut and high dynamic range, Qualcomm launched True HDR gaming.

True HDR gaming is part of Snapdragon Elite Gaming, which was announced at the 2018 Snapdragon Tech Summit in Hawaii. This solution retains detail and color to achieve the best gaming experience, and it is the first end-to-end HDR solution on mobile. True HDR gaming takes full advantage of the capabilities of the GPU and Display Processing Unit (DPU) on Qualcomm Snapdragon platforms: the game engine and GPU output HDR content (RGBA1010102 format, BT.2020 gamut) directly, and color volume mapping is performed by the DPU.

This guide introduces the concepts related to HDR and presents the Qualcomm True HDR gaming solution.

High Dynamic Range (HDR)

In the following image, real-world luminance levels span [0, ∞) candela per square meter (cd/m²). Through the adaptation of rods and cones, humans can perceive luminance levels of roughly [10⁻⁶, 10⁸] cd/m².

Luminance

Real-world luminance levels and the high-level functionality of the human visual system, image taken from [1] Perceptual Design for High Dynamic Range Systems.

Based on the luminance range they support, display devices can be classified as Standard Dynamic Range (SDR) or High Dynamic Range (HDR).

  • SDR supports luminance in [0.0002, 100] cd/m²

  • HDR supports luminance in [0.00005, 1000 or larger] cd/m²

The figures below show the change of dynamic range and color gamut in the SDR and HDR pipeline.

SDR

Content delivery in SDR. Image Credit: Dolby Laboratories, Inc.

HDR

Content delivery in HDR. Image Credit: Dolby Laboratories, Inc.

Wide Color Gamut (WCG)

The human eye has three types of color sensors, each responding to a different range of wavelengths. Color consists of two parts: brightness and chromaticity. For example, white and grey have the same chromaticity but different brightness.

The International Commission on Illumination (CIE) created the CIE 1931 color space. Its horseshoe-shaped region represents the colors humans can see in response to different light wavelengths.

CIE 1931 color space

CIE 1931 color space.

A color gamut is a specific, complete subset of colors; the term often refers to the entire range of colors available on a display device. The following image shows the color gamuts of various standards.

Chromaticity

BT.709, DCI-P3, BT.2020.

  • BT.709, high definition television (HDTV) standard

  • DCI-P3, used for cinema theater presentations and as an intermediate gamut for UHD content

  • BT.2020, ultra-high-definition television (UHDTV) standard

A color gamut that covers a larger range than BT.709, such as BT.2020, is often referred to as a Wide Color Gamut (WCG).

Color volume

Brightness and color gamut define color volume. A higher dynamic range and wider color gamut mean a larger color volume.

Chromaticity

The color volume of BT.2020 color gamut and BT.709 color gamut. Image credit: Sony.

Display technique

Display technology has evolved to increase bit widths, color gamut, and screen brightness.

  • Maximum bit width support includes 6, 8, 10, and 12 bits.

  • Maximum luminance (brightness) support includes 100 nit, 500 nit, and 1000 nit.

  • Color gamut support includes BT.709, DCI-P3, and BT.2020.

Figure 7 The evolution of bit depth, brightness, and color gamut supported by display devices.


Light-to-display overview

Figure 8 Environment photons to display photons.


When a photo is taken with a camera and then viewed, the following happens:

  • The camera converts light to a video signal using an Optical to Electrical Transfer Function (OETF).

  • The display converts the video signal back to light using the reverse function, an Electrical to Optical Transfer Function (EOTF).

For SDR, a gamma curve is used. For HDR, Perceptual Quantization (SMPTE ST 2084) or Hybrid Log-Gamma (BBC/NHK) is used. For ST 2084, see the following:

Figure 9 SMPTE ST 2084 (PQ).


Figure 10 SMPTE ST 2084 PQ EOTF.


HDR10

HDR10 is an open standard announced on August 27, 2015. The standard is composed of the following specifications:

  • Bit depth: 10 bits

  • Color representation: ITU-R BT.2020

    • Chromaticity

  • Electro-Optical Transfer Function (EOTF): SMPTE ST 2084

    • Map non-linear signal value into display light

  • Static metadata: SMPTE ST 2086

    • SMPTE ST 2086 is “Mastering Display Color Volume” metadata, used to describe the capability of the display used to master the content. It includes:

      • CIE(x,y) chromaticity coordinates for Color primaries

      • CIE(x,y) chromaticity coordinates for White point

      • Min and max luminance

      • Others such as MaxFALL (Maximum Frame Average Light Level) and MaxCLL (Maximum Content Light Level) static values
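The fields above can be grouped into a single structure for bookkeeping. This is only an illustrative sketch; the type and field names are hypothetical, not part of SMPTE ST 2086 (the standard defines the values, not a C layout).

```c
/* Illustrative grouping of HDR10 static metadata: SMPTE ST 2086 mastering
 * display fields plus the MaxFALL/MaxCLL content light levels.
 * All names in this struct are hypothetical. */
typedef struct
{
    float primary_r_xy[2];   /* CIE (x, y) of the red primary   */
    float primary_g_xy[2];   /* CIE (x, y) of the green primary */
    float primary_b_xy[2];   /* CIE (x, y) of the blue primary  */
    float white_point_xy[2]; /* CIE (x, y) of the white point   */
    float max_luminance;     /* mastering display max, cd/m2    */
    float min_luminance;     /* mastering display min, cd/m2    */
    float max_fall;          /* Maximum Frame Average Light Level, cd/m2 */
    float max_cll;           /* Maximum Content Light Level, cd/m2       */
} Hdr10StaticMetadata;
```

For a BT.2020 mastering display, the chromaticity values would be the same ones used in the metadata code later in this guide (red at (0.708, 0.292), white point at (0.3127, 0.3290), and so on).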

Color volume mapping

Color volume mapping is the process of mapping content that was created on a mastering display down to a playback display. It includes tone mapping for luminance and gamut mapping for chromaticity.

  • Tone mapping: The peak luminance of a playback HDR display is often lower than the peak luminance of the reference display used to master HDR content, so tone mapping is necessary to convert high-luminance HDR content to a dynamic range supported by the playback display.

  • Gamut mapping: The primary colors of a playback HDR display are often less saturated than those of the reference HDR display, so the wider content gamut must be mapped to the corresponding gamut of the playback display.

Note

When applying color volume mapping, the luminance and chromaticity attributes of both the reference display and the content must be passed to the display. These attributes are represented by the static metadata fields that are defined in the SMPTE ST 2086 standard.
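As a minimal illustration of gamut mapping, linear-light BT.2020 RGB can be converted to linear BT.709 RGB with a 3×3 matrix followed by clamping. The coefficients below are the commonly published BT.2020-to-BT.709 conversion values (see ITU-R BT.2087); the hard clamp is the simplest, not the best-looking, way to handle out-of-gamut colors.

```c
/* Convert a linear-light BT.2020 RGB color to linear BT.709 RGB.
 * Out-of-gamut results are clamped to [0,1] (simple gamut clipping). */
static double clamp01(double v)
{
    return v < 0.0 ? 0.0 : (v > 1.0 ? 1.0 : v);
}

void bt2020_to_bt709(const double in[3], double out[3])
{
    /* BT.2020 -> BT.709 matrix (each row sums to ~1, so white maps to white). */
    static const double M[3][3] = {
        {  1.6605, -0.5876, -0.0728 },
        { -0.1246,  1.1329, -0.0083 },
        { -0.0182, -0.1006,  1.1187 },
    };
    for (int i = 0; i < 3; i++)
        out[i] = clamp01(M[i][0] * in[0] + M[i][1] * in[1] + M[i][2] * in[2]);
}
```

A fully saturated BT.2020 red (1, 0, 0) lands outside BT.709 and is clipped back to (1, 0, 0) in BT.709 terms, which is exactly the saturation loss that smarter gamut mapping tries to soften.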

Traditional Game HDR

Compared with an SDR image, a traditional HDR image has more convincing visual effects.

Figure 11 SDR image (left) and traditional HDR image (right), images taken from Unreal.


Figure 12 Traditional HDR pipeline.


A traditional game HDR pipeline takes the following steps:

  1. RenderedScene: A scene is rendered to render targets with an HDR format, such as RGBA16, R11G11B10, or R10G10B10A2.

  2. Postprocessing: Bloom, vignette, grain jitter, motion blur, etc.

  3. ColorGrading: Contrast, saturation and gamma, gain, etc.

  4. ToneMapping: Maps the dynamic range data to standard range data (RGBA8888/sRGB). There are many traditional mapping curves:

Figure 13 Reinhard ToneMapper, Crytek ToneMapper, Filmic ToneMapper and ACESFilm.
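Two of these curves can be sketched as follows: the classic Reinhard operator and Krzysztof Narkowicz's widely used filmic fit of ACES (an approximation of the full RRT+ODT, not the real transforms).

```c
static float clamp01f(float v)
{
    return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
}

/* Reinhard: simple compression that maps [0, inf) into [0, 1). */
float tonemap_reinhard(float x)
{
    return x / (1.0f + x);
}

/* ACES filmic curve fit by Krzysztof Narkowicz (approximation of RRT+ODT). */
float tonemap_aces_fit(float x)
{
    const float a = 2.51f, b = 0.03f, c = 2.43f, d = 0.59f, e = 0.14f;
    return clamp01f((x * (a * x + b)) / (x * (c * x + d) + e));
}
```

Both take a linear scene value per channel (or per luminance) and return a [0, 1] value ready for sRGB encoding; the filmic fit adds a toe and shoulder that Reinhard lacks.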

True HDR

A True HDR image has higher contrast, and more detail is preserved in highlights and shadowed areas. Qualcomm True HDR gaming is compatible with the HDR10 standard; in fact, the display processing unit (DPU) expects HDR10 data as input.

Figure 14 Traditional HDR image (top), True HDR image (bottom).


True HDR pipeline

HDR game content is produced on a reference display with a higher dynamic range and a wider gamut. The rendered scene is mapped to the dynamic range and color space supported by True HDR in the postprocessing phase, and the display is notified via metadata.

Figure 15 Workflow of the True HDR pipeline for games.


The following provides a detailed overview of the True HDR pipeline and how it functions:

  • HDR Content: Game content produced on a reference display with large color volume (higher dynamic range and wider color gamut).

  • Scene-referred image: Takes HDR content as input and adopts a realistic rendering approach such as physically based rendering (PBR) to generate a good HDR scene-referred image with proper exposure, rich highlights, and shadows.

  • Postprocessing, ColorGrading: Same as traditional game HDR.

  • GamutMap and Tonemapper: Maps the dynamic range and color space of the RenderedScene to the dynamic range and color space supported by True HDR.

  • UIToneMapper: Maps UI render data to HDR.

  • Adding a scale factor: Adds a scale factor to the UI to match with the main rendered scene.

  • Composite: Blends the scene HDR and UI HDR inputs. If the game relies on a lot of transparent UI, it is easier to render the entire UI into a render target and then blend it with the back buffer than to render the UI directly to the back buffer.

  • ST-2084-PQ(OETF): True HDR requires the back buffer to contain ST-2084-PQ-encoded data. Therefore, the back buffer is encoded with ST-2084 PQ, and the encoded value is mapped to the range [0.0, 1.0].

  • EOTF/Tonemapping/GamutMapping: Handled by Qualcomm DPU.

    • ST-2084-PQ(EOTF): Inverse of ST-2084-PQ(OETF).

    • Tonemapping: Maps output of EOTF to the actual color space used by the screen and the actual dynamic range.

    • GamutMapping: Maps one color space to another; the usual method is linear stretch or compression. The color space required by the HDR10 standard is BT.2020, but actual screens cannot fully cover it, so the Qualcomm DPU performs the color gamut conversion.
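The UI scale factor and Composite steps above can be sketched per pixel. This is an assumption-laden sketch: paper_white_nits is a hypothetical tuning parameter for how bright "UI white" should appear, and both inputs are taken to be linear-light, with scene values in nits.

```c
/* Minimal per-pixel sketch of scaling SDR UI into the HDR scene and
 * "over"-blending it. All names are illustrative, not part of True HDR. */
typedef struct { float r, g, b, a; } Rgba;

Rgba composite_ui(Rgba scene, Rgba ui, float paper_white_nits)
{
    /* Scale factor so that UI white (1.0 in linear UI space) lands at
     * the chosen paper-white luminance in the scene's units. */
    float s = paper_white_nits;
    Rgba out;
    out.r = ui.r * s * ui.a + scene.r * (1.0f - ui.a);
    out.g = ui.g * s * ui.a + scene.g * (1.0f - ui.a);
    out.b = ui.b * s * ui.a + scene.b * (1.0f - ui.a);
    out.a = 1.0f;
    return out;
}
```

In a real renderer this step runs in a shader, and the composited result is then PQ-encoded into the R10G10B10A2 back buffer as described above.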

Academy Color Encoding System (ACES) tonemapping

Even on HDR screens, the supported dynamic range is limited, while scene-referred rendering can generate images with a very high dynamic range. The SceneToneMapper maps the input HDR scene (up to 10000 nits) to a True HDR scene (1000 nits), and the UIToneMapper maps the UI data to True HDR data.

The Academy Color Encoding System (ACES) is often used in games.

ACES pipeline

There are two color gamuts in ACES: AP0 and AP1.

Both AP0 and AP1 contain BT.709 and BT.2020, which means that ACES can handle various types of tone mapping and gamut mapping.

Figure 16 Color gamut comparison among BT.2020, BT.709, and AP0/AP1 used by ACES.


Figure 17 ACES pipeline.


For games, the image produced by scene rendering is a scene-referred image. The ACES pipeline has only three components:

  • Color grading: Grading in ACES Color Space.

  • Reference Rendering Transform (RRT): Converts scene-referred colorimetry to display-referred. The color space is converted to the ACES-defined AP0 color space, and the dynamic range is converted to 0–10000 nits, which allows rendering to any output device.

Figure 18 RRT curve.


  • Output Device Transform (ODT): Converts the color space into the ACES-defined AP1 color space and maps the dynamic range to the range supported by the device. AP1 is then converted to the device's color space before output. There are many ODTs, such as ODT_48nits, ODT_1000nits, and ODT_2000nits.

Figure 19 RRT+ODT_1000 nits (left) and RRT+ODT_48 nits (right)


True HDR code setup

To enable True HDR in OpenGL ES, the following extensions must be supported:

  • EGL_EXT_gl_colorspace_display_p3

  • EGL_EXT_gl_colorspace_bt2020_pq

  • EGL_EXT_surface_SMPTE2086_metadata

Note

Vulkan Swapchain/WSI for Android only supports VK_COLOR_SPACE_DISPLAY_P3_NONLINEAR_EXT. For additional information on how to enable this for Vulkan, check out the Enhancing graphics with wide color content guide from the Android Developers Documentation.
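Before relying on these extensions, check that the EGL implementation actually exposes them. The helper below is a small sketch that tokenizes the string returned by eglQueryString(display, EGL_EXTENSIONS); the function name is illustrative.

```c
#include <string.h>

/* Returns 1 if `name` appears as a full space-separated token in an
 * extension string (as returned by eglQueryString(dpy, EGL_EXTENSIONS)).
 * A plain strstr() is not enough, because one extension name can be a
 * prefix of another. */
int has_extension(const char *exts, const char *name)
{
    if (exts == NULL)
        return 0;
    size_t len = strlen(name);
    const char *p = exts;
    while ((p = strstr(p, name)) != NULL)
    {
        int start_ok = (p == exts) || (p[-1] == ' ');
        int end_ok = (p[len] == '\0') || (p[len] == ' ');
        if (start_ok && end_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

If EGL_EXT_gl_colorspace_bt2020_pq or EGL_EXT_surface_SMPTE2086_metadata is missing, the application should fall back to an SDR swapchain.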

Set EGLSurface format

Set the EGLSurface format to R10G10B10A2:

EGLConfig EGLConfigList[1];
EGLint eglNumConfigs;
EGLint ConfigAttributes[] = {
    EGL_RED_SIZE, 10,
    EGL_GREEN_SIZE, 10,
    EGL_BLUE_SIZE, 10,
    EGL_ALPHA_SIZE, 2,
    EGL_COLOR_COMPONENT_TYPE_EXT, EGL_COLOR_COMPONENT_TYPE_FIXED_EXT,
    EGL_NONE};

eglChooseConfig(eglDisplay, ConfigAttributes, EGLConfigList, 1, &eglNumConfigs);

Set color space

Set the color space of the EGL window surface to EGL_GL_COLORSPACE_BT2020_PQ_EXT:

EGLint attribs[] = {EGL_GL_COLORSPACE_KHR, EGL_GL_COLORSPACE_BT2020_PQ_EXT, EGL_NONE};
EGLSurface eglSurface = eglCreateWindowSurface(eglDisplay, eglConfigParam, InWindow, attribs);

Set metadata

Set the metadata attributes of eglSurface.

EGLint SurfaceAttribs[] = {
    EGL_SMPTE2086_DISPLAY_PRIMARY_RX_EXT,
    EGL_SMPTE2086_DISPLAY_PRIMARY_RY_EXT,
    EGL_SMPTE2086_DISPLAY_PRIMARY_GX_EXT,
    EGL_SMPTE2086_DISPLAY_PRIMARY_GY_EXT,
    EGL_SMPTE2086_DISPLAY_PRIMARY_BX_EXT,
    EGL_SMPTE2086_DISPLAY_PRIMARY_BY_EXT,
    EGL_SMPTE2086_WHITE_POINT_X_EXT,
    EGL_SMPTE2086_WHITE_POINT_Y_EXT,
    EGL_SMPTE2086_MAX_LUMINANCE_EXT,
    EGL_SMPTE2086_MIN_LUMINANCE_EXT
};

static const DisplayChromacities DisplayChromacityList[] = {
    {{0.70800f, 0.29200f, 0.17000f, 0.79700f, 0.13100f, 0.04600f, 0.31270f, 0.32900f}} // DG_Rec2020
};

// Set the eight chromaticity values (primaries and white point);
// the min/max luminance attributes are set separately.
for (uint32_t i = 0; i < 8; i++)
{
    eglSurfaceAttrib(PImplData->eglDisplay, eglSurface, SurfaceAttribs[i],
                     EGLint(DisplayChromacityList[0].ChromaVals[i] * EGL_METADATA_SCALING_EXT));
}

Get the luminance of display on Android

MaxAverageLuminance:
https://developer.android.com/reference/android/view/Display.HdrCapabilities.html#getDesiredMaxAverageLuminance()

MaxLuminance:
https://developer.android.com/reference/android/view/Display.HdrCapabilities.html#getDesiredMaxLuminance()

MinLuminance:
https://developer.android.com/reference/android/view/Display.HdrCapabilities.html#getDesiredMinLuminance()