In our recent blog post, The Architecture of Snapdragon Spaces, we discussed the features and benefits of our Snapdragon Spaces XR Developer Platform for developing headworn AR experiences. We also touched on an underlying foundation of Snapdragon Spaces: OpenXR.
Our SDKs for Unity and Unreal, along with our Snapdragon Spaces Services runtime APK (included in our SDK package), abstract away most of the details of OpenXR so that you can focus on developing the business logic of your headworn AR experiences. However, it’s always good to understand what goes on behind the scenes. In this blog post, we delve a bit deeper into OpenXR and how Snapdragon Spaces works with Unity and Unreal for your AR apps.
The OpenXR API and Lifecycle
OpenXR is an industry-standard C API specification from the Khronos Group, adopted as an expedient way to bring new hardware into the XR ecosystem. The specification lets companies and organizations create implementations, called OpenXR runtimes, for specific hardware. In our case, the Snapdragon Spaces runtime was designed to work across compatible hardware offerings that support headworn AR displays. A runtime can also expose additional functionality through extensions to OpenXR. As new functionality becomes available in the hardware, new extensions are added so developers can take advantage of those features.
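Each extension is advertised by name, and apps discover them through a two-call enumeration idiom. A minimal sketch in C, with error handling omitted and a fixed-size buffer assumed just to keep the example simple:

```c
#include <openxr/openxr.h>
#include <stdbool.h>
#include <string.h>

/* Ask the active runtime which extensions it supports and look for one by
   name. The cap of 128 entries is an assumption for this sketch. */
bool runtime_supports_extension(const char *extensionName) {
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL);

    XrExtensionProperties properties[128];
    if (count > 128) count = 128;
    for (uint32_t i = 0; i < count; ++i) {
        properties[i].type = XR_TYPE_EXTENSION_PROPERTIES;
        properties[i].next = NULL;
    }
    xrEnumerateInstanceExtensionProperties(NULL, count, &count, properties);

    for (uint32_t i = 0; i < count; ++i) {
        if (strcmp(properties[i].extensionName, extensionName) == 0) {
            return true;  /* runtime advertises this extension */
        }
    }
    return false;
}
```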
OpenXR’s API design provides and supports many programmatic constructs and concepts, including the following (the sketch after this list shows how several of them surface in the C API):
- Instance: An object that enables communication between an app and an OpenXR runtime.
- Session: Representation of an AR application session that covers the lifecycle of an AR app – from creation through to final cleanup.
- Action: Movements and processes that users can perform (e.g., opening a menu).
- Interaction Profile: Physical input sources used to perform Actions. Interaction Profiles are mapped to Actions using Interaction Profile Bindings.
- Spaces: Frames of reference to map and track the real world (e.g., to map the location of virtual objects).
- View Configuration: The set of views (viewports) to which images are rendered. Headworn units typically have two views (one per eye, sometimes utilizing foveated rendering), while a typical smartphone has a single display.
- Rendering: Supports swap chain rendering and multiple layers of composition, using the graphics API bound when the Session was created (e.g., Vulkan or OpenGL).
- API Layers: Hooks that can be inserted between the application and API (e.g., for logging, debugging, and validation).
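To make these more concrete, here is a minimal sketch of how several of these constructs surface as handle types in the C API (the variable names are illustrative):

```c
#include <openxr/openxr.h>

/* How several of the constructs above surface as C handle types. */
XrInstance  instance    = XR_NULL_HANDLE;  /* app <-> runtime connection      */
XrSession   session     = XR_NULL_HANDLE;  /* the AR session's lifecycle      */
XrActionSet actionSet   = XR_NULL_HANDLE;  /* groups related Actions          */
XrAction    menuAction  = XR_NULL_HANDLE;  /* e.g., "open a menu"             */
XrSpace     worldSpace  = XR_NULL_HANDLE;  /* frame of reference for tracking */
XrSwapchain swapchain   = XR_NULL_HANDLE;  /* images the app renders into     */
XrPath      profilePath = XR_NULL_PATH;    /* names an Interaction Profile    */
```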
The following diagram from the OpenXR Reference Guide shows a programmatic view of these constructs and concepts during the lifecycle of an AR app:
Source: Khronos Group. https://www.khronos.org/files/openxr-10-reference-guide.pdf
The purple region on the left shows that the app uses the API to query for available extensions and layers, after which an Instance is created to communicate with the OpenXR runtime and open a Session.
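A minimal sketch of that step in C, assuming the desired extension names have already been chosen (the app name is illustrative, and error handling is omitted):

```c
#include <openxr/openxr.h>
#include <string.h>

/* Create an Instance, enabling a caller-supplied list of extensions. */
XrInstance create_instance(const char *const *extensions,
                           uint32_t extensionCount) {
    XrInstanceCreateInfo createInfo = { XR_TYPE_INSTANCE_CREATE_INFO };
    strncpy(createInfo.applicationInfo.applicationName, "MyHeadwornArApp",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    createInfo.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;
    createInfo.enabledExtensionCount = extensionCount;
    createInfo.enabledExtensionNames = extensions;

    XrInstance instance = XR_NULL_HANDLE;
    if (xrCreateInstance(&createInfo, &instance) != XR_SUCCESS) {
        /* no runtime available, or a requested extension is unsupported */
    }
    return instance;
}
```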
The blue region in the middle shows the main functionality that takes place while the Session is alive. During initialization, a System is requested (e.g., a handheld and/or headworn device), and the app queries for available rendering blend modes, View Configurations, and Interaction Profiles that are subsequently used to create the Session. Once the Session is up and running, Spaces are queried and created, Actions are configured, and swap chains are set up for rendering.
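Sketched in C, that setup looks roughly like this (the graphics binding that must be chained onto the session create info is engine and API specific, so it is elided here):

```c
#include <openxr/openxr.h>
#include <stddef.h>

/* Request a System, count the stereo views, then create the Session. */
XrSession create_session(XrInstance instance) {
    XrSystemGetInfo systemInfo = { XR_TYPE_SYSTEM_GET_INFO };
    systemInfo.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;
    XrSystemId systemId = XR_NULL_SYSTEM_ID;
    xrGetSystem(instance, &systemInfo, &systemId);

    uint32_t viewCount = 0;  /* typically 2 for headworn stereo displays */
    xrEnumerateViewConfigurationViews(instance, systemId,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, 0, &viewCount, NULL);

    XrSessionCreateInfo sessionInfo = { XR_TYPE_SESSION_CREATE_INFO };
    sessionInfo.systemId = systemId;
    /* sessionInfo.next must chain a graphics binding (e.g.,
       XrGraphicsBindingOpenGLESAndroidKHR) before this call succeeds. */
    XrSession session = XR_NULL_HANDLE;
    xrCreateSession(instance, &sessionInfo, &session);
    return session;
}
```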
The Session then begins its main loop, as shown in the light green region on the right. The dark green box on the far right shows a drill-down into the various APIs invoked during the main loop. Similar to a typical game loop, the main phases involve gathering input (Actions), performing updates, and rendering the final frames accordingly.
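A pared-down sketch of that loop in C (swapchain handling and drawing are elided, the blend mode is just an example, and xrBeginSession is assumed to have been called once the Session reached its ready state):

```c
#include <openxr/openxr.h>
#include <stdbool.h>

/* Poll events, sync Actions (input), then bracket rendering with
   xrWaitFrame/xrBeginFrame/xrEndFrame. */
void main_loop(XrInstance instance, XrSession session, XrActionSet actionSet) {
    bool running = true;
    while (running) {
        XrEventDataBuffer event = { XR_TYPE_EVENT_DATA_BUFFER };
        while (xrPollEvent(instance, &event) == XR_SUCCESS) {
            if (event.type == XR_TYPE_EVENT_DATA_SESSION_STATE_CHANGED &&
                ((XrEventDataSessionStateChanged *)&event)->state ==
                    XR_SESSION_STATE_EXITING) {
                running = false;  /* the runtime asked us to shut down */
            }
            event.type = XR_TYPE_EVENT_DATA_BUFFER;  /* reset for next poll */
        }

        /* Gather input: sync the active Action Set. */
        XrActiveActionSet activeSet = { actionSet, XR_NULL_PATH };
        XrActionsSyncInfo syncInfo = { XR_TYPE_ACTIONS_SYNC_INFO };
        syncInfo.countActiveActionSets = 1;
        syncInfo.activeActionSets = &activeSet;
        xrSyncActions(session, &syncInfo);

        /* Update and render one frame. */
        XrFrameState frameState = { XR_TYPE_FRAME_STATE };
        xrWaitFrame(session, NULL, &frameState);
        xrBeginFrame(session, NULL);
        /* ... locate views, acquire swapchain images, draw ... */
        XrFrameEndInfo endInfo = { XR_TYPE_FRAME_END_INFO };
        endInfo.displayTime = frameState.predictedDisplayTime;
        endInfo.environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_ADDITIVE;
        xrEndFrame(session, &endInfo);  /* zero layers submitted in sketch */
    }
}
```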
During this loop, the app watches for idle Sessions (e.g., when the app becomes unfocused or invisible) and is expected to reduce or avoid performing resource-intensive tasks like gathering input and rendering. The app also watches for events indicating that the Session is to end (e.g., the user wants to close the app), at which point the Session is destroyed.
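A sketch of how an app might react to those session state changes, with error handling omitted:

```c
#include <openxr/openxr.h>
#include <stdbool.h>

/* Start frames when ready, throttle when idle, tear down on stop/exit.
   Returns false when the main loop should end. */
bool handle_session_event(XrSession session, const XrEventDataBuffer *event) {
    if (event->type != XR_TYPE_EVENT_DATA_SESSION_STATE_CHANGED) {
        return true;
    }
    const XrEventDataSessionStateChanged *changed =
        (const XrEventDataSessionStateChanged *)event;
    switch (changed->state) {
    case XR_SESSION_STATE_READY: {
        XrSessionBeginInfo beginInfo = { XR_TYPE_SESSION_BEGIN_INFO };
        beginInfo.primaryViewConfigurationType =
            XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO;
        xrBeginSession(session, &beginInfo);  /* start submitting frames */
        break;
    }
    case XR_SESSION_STATE_IDLE:
        /* Unfocused/invisible: skip input gathering and rendering. */
        break;
    case XR_SESSION_STATE_STOPPING:
        xrEndSession(session);                /* stop the frame loop */
        break;
    case XR_SESSION_STATE_EXITING:
        xrDestroySession(session);            /* final cleanup */
        return false;                         /* leave the main loop */
    default:
        break;
    }
    return true;
}
```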
For additional information about OpenXR, check out Khronos Group’s OpenXR Specification.
How Snapdragon Spaces Works with the SDKs and at Runtime
Our Snapdragon Spaces SDK is a provider for Unity’s and Unreal’s OpenXR plugins, which are therefore required dependencies. The OpenXR plugins in those engines establish base functionality like rendering and head tracking, and provide interfaces for talking to the OpenXR runtime. In Unity, AR Foundation’s manager components tap into the Snapdragon Spaces subsystems (e.g., anchors, plane detection, image detection, etc.), while in Unreal there is no further dependency. Once configured, this lets you work in a content-driven framework without having to worry about specific OpenXR API calls.
At runtime (i.e., when your app runs on the user’s device), the following lifecycle takes place:
- The user installs one or more OpenXR services on their device (e.g., Snapdragon Spaces Services, Oculus OpenXR Services, etc.). Note that Snapdragon Spaces Services (available to users through the Google Play Store) must be installed for Snapdragon Spaces apps to work.
- The app requests the desired OpenXR service runtime to communicate with. Snapdragon Spaces apps specifically request Snapdragon Spaces Services.
- The app asks for an OpenXR Instance with a specific set of features supported by that OpenXR runtime.
- An Instance is created to connect to the service, and a pointer is returned to the app.
- The app then uses that pointer to start up the Instance and communicate with it throughout the lifecycle of the app, as sketched below.
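As a quick illustration, an app can confirm which runtime it was connected to by querying the Instance (a minimal sketch):

```c
#include <openxr/openxr.h>
#include <stdio.h>

/* Report the runtime behind the Instance handle returned to the app. */
void print_runtime_info(XrInstance instance) {
    XrInstanceProperties properties = { XR_TYPE_INSTANCE_PROPERTIES };
    if (xrGetInstanceProperties(instance, &properties) == XR_SUCCESS) {
        printf("OpenXR runtime: %s (version %u.%u.%u)\n",
               properties.runtimeName,
               XR_VERSION_MAJOR(properties.runtimeVersion),
               XR_VERSION_MINOR(properties.runtimeVersion),
               XR_VERSION_PATCH(properties.runtimeVersion));
    }
}
```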
Rendering is handled by the game engine’s base OpenXR layer, as is the mapping of an Interaction Profile to the app’s input code. The app can then request handles for additional features, like controller support, hand tracking, etc., as sketched below.
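Optional features like these are exposed through extension entry points, which the app resolves at runtime. A minimal sketch using XR_EXT_hand_tracking as the example extension:

```c
#include <openxr/openxr.h>

/* Extension entry points are not exported directly; the app resolves them
   through xrGetInstanceProcAddr. XR_EXT_hand_tracking must also have been
   enabled when the Instance was created. */
static PFN_xrCreateHandTrackerEXT pfnCreateHandTracker = NULL;

void load_hand_tracking_entry_point(XrInstance instance) {
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction *)&pfnCreateHandTracker);
    /* A NULL result means the extension is unsupported or not enabled. */
}
```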
Unity and Unreal, in conjunction with the Snapdragon Spaces SDK, both generate a game binary that runs on the target device. Those binaries include the code needed to work with Snapdragon Spaces Services, so you don’t have to worry about invoking specific OpenXR API calls.
Try out Snapdragon Spaces Today!
Snapdragon Spaces gives you what you need to work with OpenXR-compatible devices – both at design time and at runtime – allowing you to work at a higher level than directly with the OpenXR API. However, it’s always good to understand the implementation details so you know how things work behind the scenes.
When you’re ready to develop your next headworn AR app, head to the Snapdragon Spaces Developer Portal and check out our documentation, then register to download the Snapdragon Spaces SDK for Unity or Unreal. And if you don’t have Unity or Unreal, both offer free tiers that you can download and set up right away. Once installed, just follow our Unity Setup Guide or Unreal Engine Setup Guide to integrate the respective version of our Snapdragon Spaces SDK. Then grab yourself some OpenXR-compatible hardware like our Snapdragon Spaces Hardware Development Kit and make headworn AR a reality!
Be sure to also subscribe to our Snapdragon Spaces newsletter to stay up to date on the latest news and updates with our platform.
For additional information about AR development, be sure to check out the following resources:
- OpenXR for Snapdragon Spaces
- The Constructs of Augmented Reality – A Developer's Guide (eBook)
- Building Blocks and Skills to Break into AR Development
- An Ultra Leap into a Whole New World of Hand Tracking
- Creating a new Reality with Spatial Computing
Snapdragon Spaces is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.