Using Shaders for Realistic Graphics on Mobile

Tuesday 8/3/21 08:41am | Posted By Carlos A. Dominguez Caballero

Realism in games is achieved by implementing effects like lighting and shadows, physics-based particle systems and water effects, and special effects like motion blur, bump mapping, and distortion. By perfecting these effects on today’s GPU hardware, you can make your game so realistic that players will have to do a double-take to determine whether they’re looking at computer graphics or the real thing. Quite simply, these effects can seem like on-screen magic!

All of these effects are made possible by shaders. These small programs that run on the GPU provide developers with a programmable rendering pipeline to implement dynamic rendering effects at runtime.

Shaders 101
Writing a shader involves the use of a high-level shading language. Common examples include the OpenGL Shading Language (GLSL) and the DirectX High-Level Shader Language (HLSL), both of which abstract the hardware into high-level 2D and 3D rendering constructs. Once implemented, shaders are typically compiled into a low-level shading language such as ARB Assembly Language or DirectX Shader Assembly Language for optimization and execution on a given platform.

Shaders were first introduced in 1988 by Pixar as part of its RenderMan Interface Specification and, as programmable hardware matured, displaced fixed-function rendering pipelines. Since then, GPUs have evolved to include hardware-accelerated programmable shader processing capabilities that support a growing number of shader types, including:

  • Pixel Shader (aka Fragment Shader): computes the color of each pixel to render based on the specific effect (e.g., lighting, bump mapping, etc.) that the developer is trying to achieve.
  • Vertex Shader: transforms each vertex from model space into clip space for rasterization and, ultimately, projection on-screen (see the sketch after this list).
  • Geometry Shader: generates new primitives (e.g., triangles). This can be used for all sorts of procedural effects such as procedural vegetation, surface damage, particle effects, etc.
  • Tessellation Shader: subdivides meshes into finer-grained meshes. This can be used to dynamically adjust the level of detail.
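
To make the vertex stage concrete, here is a minimal vertex shader sketch in GLSL ES. It is not from any particular SDK sample; the aPosition/aTexCoord attributes and the uMVP and uParticleColor uniforms are hypothetical names an application might supply:

#version 300 es
// Hypothetical per-vertex inputs supplied by the application
in vec3 aPosition;   // vertex position in model space
in vec2 aTexCoord;   // texture coordinate for this vertex
// Hypothetical uniforms set by the application each frame
uniform mat4 uMVP;            // combined model-view-projection matrix
uniform vec4 uParticleColor;  // color of the particle being drawn
// Outputs passed on to the next stage (the pixel shader)
out vec2 TexCoords;
out vec4 ParticleColor;
void main()
{
  // Transform the vertex from model space into clip space
  gl_Position = uMVP * vec4(aPosition, 1.0);
  // Forward data for the pixel shader to interpolate and consume
  TexCoords = aTexCoord;
  ParticleColor = uParticleColor;
}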

Shaders form the basis of today’s dynamic, programmable rendering pipelines and are typically chained together such that the output of one shader becomes the input to another; in the sketch above, for instance, the TexCoords and ParticleColor outputs are consumed as inputs by a matching pixel shader (like the one shown later in this post). These pipelines allow multiple effects and types of processing to occur prior to final on-screen rendering.

Over time, the variety of shaders and their associated instruction sets expanded, as did the number of shaders supported by GPUs. With pixel and vertex shaders being the most common, the unified shader model was introduced, allowing the same processing units to execute either type of shader. By unifying the capabilities and instruction sets of the different shader types, the GPU can rebalance work across its units whenever shaders of one type would otherwise sit idle.

Shaders often look like standard C programs. For example, they typically feature data structures for storing data like vectors or coordinates, a main function that is invoked by the GPU, and declarations that bind platform resources such as textures and uniforms. Below is a small example of a pixel shader, written in GLSL ES, taken from the Burning Flame using GPU example project on QDN:

#version 300 es
precision mediump float;  // ES fragment shaders require a default float precision
in vec2 TexCoords;
in vec4 ParticleColor;
out vec4 color;
uniform sampler2D sprite;
void main()
{
 #if 1
  // Sample the sprite texture and tint it with the particle's color
  color = texture(sprite, TexCoords) * ParticleColor;
 #else
  // Skip the texture and output the particle's color directly
  color = ParticleColor;
 #endif
}

This is a great hello world example for shaders because it shows a couple of one-liners for setting the color of a pixel. In the first case, the shader sets the color of a pixel by pulling it from a specific texture coordinate and modifying the value based on some particle color. In the second case, it simply assigns the particle’s color to the pixel’s color.

Shaders on Adreno
Shaders continue to evolve and play a key role in today’s cutting-edge mobile GPUs, including the Qualcomm® Adreno™ GPU found on our Snapdragon® mobile platforms.

Starting with the Adreno 2xx series, we’ve included shader support in Adreno GPUs using a unified shader architecture encompassing pixel and vertex shaders. By processing both vertices and pixels in groups of four as vectors, the Adreno GPU can balance pixel and vertex shader loads across its ALUs, as shown in Figure 1.

Figure 1 – Conceptual illustration depicting how the Adreno’s unified shader architecture can balance both vertex and pixel shaders across its ALUs

Starting with Adreno 3xx, vertex shaders, pixel shaders, and general-purpose GPU processing can also share the same ALU and fetch resources. For example, vertex shaders have direct access to the GPU’s texture cache, making it relatively straightforward to implement vertex texture algorithms (e.g., displacement mapping), as sketched below.
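
As a rough illustration of vertex texture fetch, the following GLSL ES sketch displaces each vertex along its normal using a height map sampled in the vertex stage; the attribute and uniform names (uHeightMap, uDisplaceScale, etc.) are hypothetical:

#version 300 es
in vec3 aPosition;   // vertex position in model space
in vec3 aNormal;     // vertex normal in model space
in vec2 aTexCoord;
uniform mat4 uMVP;             // model-view-projection matrix
uniform sampler2D uHeightMap;  // height map fetched in the vertex stage
uniform float uDisplaceScale;  // how far to push vertices outward
out vec2 TexCoords;
void main()
{
  // Fetch a height value in the vertex shader (explicit LOD, since
  // screen-space derivatives are not available in this stage)
  float height = textureLod(uHeightMap, aTexCoord, 0.0).r;
  // Displace the vertex along its normal before projecting it
  vec3 displaced = aPosition + aNormal * height * uDisplaceScale;
  gl_Position = uMVP * vec4(displaced, 1.0);
  TexCoords = aTexCoord;
}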

Adreno 3xx’s multi-threaded architecture means that if a shader stalls (e.g., on a texture fetch), execution can be handed off to another thread. Adreno 3xx also supports scalar operations, which means that pixel shaders running on Adreno with 16-bit floating-point values can provide up to a 2x power and performance improvement over mobile GPUs based on a vector architecture. The best part is that this is handled automatically by the Adreno GPU, leaving developers free to focus on writing shader logic.
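
In GLSL ES, developers opt into 16-bit processing through precision qualifiers: declaring mediump allows (but does not require) the driver to evaluate float math at 16 bits. A minimal sketch:

#version 300 es
// mediump permits the GPU to run float math at 16-bit precision
// where supported, trading precision for power and throughput
precision mediump float;
in vec2 TexCoords;
out vec4 color;
uniform sampler2D sprite;
void main()
{
  // On a scalar fp16 architecture, this arithmetic can execute
  // as 16-bit operations
  color = texture(sprite, TexCoords) * 0.5;
}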

Our latest revisions of Snapdragon, including the Snapdragon 888 5G Mobile Platform and Snapdragon 865 5G Mobile Platform, support variable rate shading (VRS). VRS allows developers to control the granularity at which a pixel shader is executed, coloring one or more pixels per invocation. Using VRS, programmers can control how many pixels to shade for a given area of a scene based on its level of detail, thus helping to reduce GPU workload and power consumption.
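
How VRS is requested depends on the graphics API. As one hedged example, a Vulkan driver exposing the GL_EXT_fragment_shading_rate GLSL extension lets a vertex shader request a coarser rate per primitive; the sketch below (with hypothetical input names) asks for 2x2-pixel shading blocks:

#version 450
#extension GL_EXT_fragment_shading_rate : require
layout(location = 0) in vec3 aPosition;                        // hypothetical vertex input
layout(set = 0, binding = 0) uniform Matrices { mat4 uMVP; };  // hypothetical uniform block
void main()
{
  gl_Position = uMVP * vec4(aPosition, 1.0);
  // Request that this primitive be shaded in 2x2-pixel blocks,
  // quartering pixel-shader invocations for low-detail geometry
  gl_PrimitiveShadingRateEXT = gl_ShadingRateFlag2HorizontalPixelsEXT |
                               gl_ShadingRateFlag2VerticalPixelsEXT;
}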

Mobile developers building games and graphics apps for Snapdragon have a rich selection of SDKs to choose from because the Adreno GPU supports OpenGL ES, Vulkan, and DirectX. In addition, many mobile game developers use third-party game engines such as Unity and Unreal Engine. Both engines expose mechanisms for creating shaders within their ecosystems through additional abstractions over the shading language, which help with portability across platforms.

Ready for More Realism?
Developers interested in learning more should download the Adreno SDK for demos and samples of common shaders and shading techniques, and visit the Game Developer Guides for more in-depth recommendations and guidance on creating best-in-class experiences on Snapdragon platforms. The guides also include sections with more shader-specific information.

Be sure to check out the SDK’s /Developer/Assets/Samples/Shaders directory, which includes a number of shaders written in both GLSL and HLSL. Also, the Burning Flame using GPU example project on QDN incorporates both a vertex and a pixel shader.


Snapdragon and Qualcomm Adreno are products of Qualcomm Technologies, Inc. and/or its subsidiaries.