Why Snapdragon Spaces is a big deal for AR Development

Tuesday 11/8/22 08:23am | Posted by Brian Vogelsang

We believe that lightweight headworn AR glasses are poised to become the next evolution of the smartphone. They offer even more immersive experiences than 2D screens and will transition users from looking down at their phones, back to looking around at their surroundings, while retaining that digital view.

In The Architecture of Snapdragon Spaces, we covered how our Snapdragon Spaces XR Developer Platform was designed to facilitate the development of these headworn AR experiences. But it’s not until you get into the small details that you realize why Snapdragon Spaces is a BIG deal for developers.

To start with, Snapdragon Spaces is an end-to-end platform with a rich hardware development kit (HDK) and software development kit (SDK) for building AR experiences on compatible headworn displays powered by Android. It works with our Snapdragon Spaces Services APK, which provides the AR runtime on Android devices. We’ve done the hard work of putting together these resources so you can focus on starting your development.

The SDK is available for Unity and Unreal Engine – arguably two of today’s most popular game engines. These tools combine content-driven workflow with developer-oriented programming capabilities. If you’re coming from game development and other multimedia backgrounds, you might already have Unity or Unreal experience. If not, you can probably get up to speed quickly. Either way, you’ll be working in modern, proven frameworks, with the confidence that their capabilities can deliver high-performance AR apps.

The tools are also OpenXR compliant, which means that API calls should work across compatible devices with minimal porting. Unity and Unreal’s OpenXR plugins establish base functionality like rendering and head tracking, and also provide interfaces to communicate with OpenXR runtimes including the subsystems provided by Snapdragon Spaces. This means you can build experiences that target OpenXR-compliant devices, while hardware manufacturers can build compatible devices with capabilities that fulfill the OpenXR specification.
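
As a rough illustration, here is a minimal Unity sketch that verifies the OpenXR loader is active before relying on AR subsystems. It assumes the XR Plug-in Management and OpenXR packages are installed, and the component name is our own, not part of any SDK:

```csharp
using UnityEngine;
using UnityEngine.XR.Management;
using UnityEngine.XR.OpenXR;

public class OpenXRCheck : MonoBehaviour
{
    void Start()
    {
        // Look up the XR loader selected by XR Plug-in Management.
        var settings = XRGeneralSettings.Instance;
        if (settings != null && settings.Manager != null &&
            settings.Manager.activeLoader is OpenXRLoader)
        {
            Debug.Log($"OpenXR runtime active: {OpenXRRuntime.name} {OpenXRRuntime.version}");
        }
        else
        {
            Debug.LogWarning("OpenXR loader is not active; AR subsystems may be unavailable.");
        }
    }
}
```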

Let’s look at some of the key AR features that Snapdragon Spaces supports, why they are important for building AR experiences, and where to find information on integrating them.

Positional Tracking
Arguably the most fundamental feature of any AR application is positional tracking. It precisely maps the environment and estimates the position and orientation of the user’s viewing device within 3D space. Snapdragon Spaces captures this as six-degrees-of-freedom (6DoF) data from headworn AR devices. Developers use it to track the end user’s position relative to the world and to render AR content in the scene relative to the user’s head position and orientation.

Check out our Unity and Unreal examples which show how to set up 6DoF tracking for headworn AR devices.
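
To give a rough sense of what consuming this data looks like, here is a minimal sketch that reads the head pose each frame through Unity’s generic XR input API (the component name is ours, not part of the SDK):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class HeadPoseReader : MonoBehaviour
{
    void Update()
    {
        // Query the 6DoF pose of the head-mounted display.
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
            head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            // Place this object one meter in front of the user's current gaze.
            transform.SetPositionAndRotation(position + rotation * Vector3.forward, rotation);
        }
    }
}
```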

Hand Tracking
Hand tracking tracks the position and orientation of a user’s hands and finger joints in 3D space. It can be used as input data to manipulate digital objects, interact with 3D GUIs, or animate digital representations of the user (e.g., realistic avatars, on-screen hands, etc.).

This provides a whole new level of input and feedback, not possible with traditional devices that rely on screen gestures or physical controls. Users can now see virtual representations of their limbs that update and behave like their real-world counterparts as they interact with objects. Capturing users’ movements and mannerisms can also bring a sense of presence to your AR experiences.

Our Snapdragon Spaces Unity package provides an AR Foundation-like interface. Hand tracking is performed using ML-based models and the positional tracking cameras. Check out our Unity hand tracking example and Unreal hand tracking example to see how it all works.

Additional hand tracking features for Unity (e.g., rigged hand meshing and interaction components) are available in our QCHT Unity Core package, included in the Unity SDK download (in the Hand Tracking folder inside the Unity Package folder). Check out the interaction methods, interaction components, and extended hand tracking sample for examples of distal and proximal interactions.
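
The QCHT components have their own API, but as a generic illustration of consuming hand-joint data, here is a sketch that detects a simple pinch using Unity’s XR Hands package (com.unity.xr.hands) as a stand-in; the class name and distance threshold are our own choices:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class PinchDetector : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem, if one is available.
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            m_Subsystem = subsystems[0];
        }

        XRHand hand = m_Subsystem.rightHand;
        if (!hand.isTracked) return;

        // Measure the distance between thumb tip and index tip to detect a pinch.
        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumb) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose index))
        {
            bool pinching = Vector3.Distance(thumb.position, index.position) < 0.02f;
            if (pinching) Debug.Log("Pinch detected");
        }
    }
}
```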

Image Recognition and Tracking
Image recognition identifies visual features or markers captured by a camera. Image recognition and tracking are often used to trigger and display digital content in relation to real-world objects (e.g., display an information popup when the user looks at a product label). On headworn AR displays, this can provide a world of information as the user scans their surroundings.

Image Targets are reference images stored in a database as markers. When the tracking system recognizes one, it identifies the flat region in the world where the image appears, which the app can then augment or otherwise act upon.

Check out our Unity and Unreal examples to see how to set this up.
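
Because the Unity package exposes an AR Foundation-like interface, reacting to recognized Image Targets looks roughly like the standard ARTrackedImageManager pattern. This sketch assumes a reference image library is already assigned in the Inspector; the handler class is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARTrackedImageManager))]
public class ImageTargetHandler : MonoBehaviour
{
    ARTrackedImageManager m_Manager;

    void OnEnable()
    {
        m_Manager = GetComponent<ARTrackedImageManager>();
        m_Manager.trackedImagesChanged += OnChanged;
    }

    void OnDisable() => m_Manager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // e.g., spawn an info popup at the recognized label's pose.
            Debug.Log($"Recognized '{image.referenceImage.name}' at {image.transform.position}");
        }
    }
}
```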

Plane Detection
Plane detection is a form of spatial mapping that detects flat surface regions and defines their boundaries (e.g., walls, tabletops, etc.). This foundational feature paves the way for more advanced constructs such as digital twins (3D mesh representations of physical environments or objects).

The Snapdragon Spaces SDK’s Hit Testing feature complements this by casting rays into the scanned area and returning the points where they intersect detected surfaces, so apps can identify interactions with that geometry.

Check out our Unity and Unreal examples on how to detect horizontal and vertical planes as well as convex hulls.
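
As a sketch of how plane detection and hit testing combine, here is a minimal AR Foundation-style example that casts a ray along the user’s gaze and snaps a marker to the nearest detected plane (the component and field names are our own):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class GazePlacement : MonoBehaviour
{
    public ARRaycastManager raycastManager;  // assumed to be assigned in the Inspector
    public Transform marker;                 // hypothetical marker object to place
    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        // Cast a ray from the user's head along their gaze direction.
        var gaze = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (raycastManager.Raycast(gaze, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            // The first hit is the closest detected surface along the ray.
            marker.SetPositionAndRotation(s_Hits[0].pose.position, s_Hits[0].pose.rotation);
        }
    }
}
```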

Local Anchors
An anchor is metadata used to position, track, and persist digital content. Anchors can be attached (i.e., locked or pinned) to digital assets in space, associating them with real-world clusters of recognized geometry. Local anchors are currently restricted to on-device storage within a single app instance and, by default, are cleared as soon as you close the app.

Local anchor information can now be stored in a local save file, so anchors can be recalled when the scene is opened again. This creates the illusion that objects persist in the environment over time. Not surprisingly, anchors depend on accurate positional tracking for their placement and orientation.

Check out our Unity and Unreal examples to see how to create and place local anchors along with the recently added persistence features.
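
For a feel of the API shape, here is a minimal AR Foundation-style sketch that pins content in place with a local anchor. It assumes an ARAnchorManager is present in the scene, the prefab field is hypothetical, and the save-file persistence shown in the samples is not reproduced here:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorSpawner : MonoBehaviour
{
    public GameObject contentPrefab;  // hypothetical content to pin in the world

    public ARAnchor PlaceAnchor(Pose pose)
    {
        // Instantiate the content at the desired real-world pose, then
        // adding an ARAnchor component locks it to that tracked point.
        GameObject go = Instantiate(contentPrefab, pose.position, pose.rotation);
        return go.AddComponent<ARAnchor>();
    }
}
```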

Learn more about Snapdragon Spaces
Head to the Snapdragon Spaces Developer Portal and check out the documentation, and register to download the Snapdragon Spaces SDK for Unity or Unreal. You’ll also need compatible hardware like our Snapdragon Spaces Hardware Development Kit.

If you don’t have Unity or Unreal, both offer free tiers that you can download to explore. Once installed, follow our Unity Setup Guide or Unreal Engine Setup Guide to integrate our SDK.

Snapdragon Spaces and Snapdragon Spaces XR Developer Platform are products of Qualcomm Technologies, Inc. and/or its subsidiaries.