Snapdragon and Qualcomm branded products are products of
Qualcomm Technologies, Inc. and/or its subsidiaries.
Augmented World Expo (AWE), an annual conference showcasing the latest AR, VR, and wearable technology, recently wrapped up its 2023 US event in Santa Clara, California.
At this year’s event, AWE’s biggest yet, the excitement around the growth of XR was palpable. For developers, two key themes stood out:
- Natural Interactions: There is a drive toward capturing natural and familiar inputs like gestures, so you can build spatial interactions that are easier for users to learn and adapt to.
- Slimmer/lighter Devices: We are seeing smaller, ergonomic, and more power-efficient headworn devices so you can design longer in-app experiences and conquer new use cases and operating environments.
The Snapdragon Spaces team was out in full force again with a booth, demos, several talks, a Build-an-App workshop, and a couple of exciting announcements.
At last year’s AWE, we launched our Snapdragon Spaces™ XR Developer Platform. This year we announced a major new feature – Snapdragon Spaces Dual Render Fusion. To kick off the announcement, we had two dynamic talks outlining the evolution of Snapdragon Spaces, which very much aligned with the above two themes.
Let’s take a closer look at those talks and what they mean for XR developers.
Accelerating the XR Ecosystem: The Future is Open
Hugo Swart, VP & GM of XR and Metaverse at Qualcomm, keynoted Accelerating the XR Ecosystem: The Future is Open. There he gave an update on the XR ecosystem for developers and made the official announcement of Dual Render Fusion.
Hugo shared Qualcomm’s vision of openness, where the metaverse is accessible to all XR developers to build apps and content that work across devices and realities. With Snapdragon Spaces, the goal is to build content once and have it accessible and interoperable across platforms. More importantly, the future is open for all developers to define XR.
Hugo then focused on the progress towards all-in-one devices – small, sleek, yet immersive headworn glasses that incorporate compute, power, and display in the same unit.
To achieve even smaller form factors with minimal power consumption, Hugo noted that heavier processing must be distributed – to compute devices in proximity to the user and to the cloud. The Snapdragon Spaces HDK and the Sightful AR laptop are examples of XR devices capable of distributed processing.
Hugo also highlighted Qualcomm’s increasing financial and technical support for the XR developer ecosystem. Over 80 companies are now in the Snapdragon Spaces Pathfinder Program, and the $100M Snapdragon Metaverse Fund has added three new companies. In addition, several operators are now embracing Snapdragon Spaces, such as KDDI, which uses it for virtual art exhibits at a museum in Japan.
The highlight of Hugo’s talk was the announcement of Dual Render Fusion, which allows a Snapdragon Spaces app to render to a smartphone and a connected headworn display simultaneously. The smartphone acts as a physical controller and renders user interfaces and conventional 3D graphics, while the headworn XR display provides a spatial XR view.
The presentation was further elevated when Stephan Hamberger, Vice President of Digital Products at Red Bull, came on stage. Stephan discussed how Red Bull has collaborated with Qualcomm to create new ways for viewers to experience Erzbergrodeo, an event in the FIM Hard Enduro World Championship. With Snapdragon Spaces and Dual Render Fusion, the audience’s experience is transformed from passive viewing to active participation.
And to satisfy those hungry for more, Kalpana Berman, Chief Product Officer at Kittch, discussed how their collaboration with Qualcomm has allowed them to create hands-free cooking tutorials using Snapdragon Spaces and Dual Render Fusion. Now users can follow recipes, watch tutorials, and set timers in AR, while their hands remain free for the task at hand.
In addition, mixed.world demonstrated Virtual Places in our AWE 2023 booth, showcasing how Dual Render Fusion enhances conventional 2D map navigation with a 3D spatial view.
AR’s Inflection Point
Steve Lukas, Director of Product Management, XR at Qualcomm, presented AR’s Inflection Point, diving deeper into Dual Render Fusion development.
Replacing mobile phones with all-in-one AR glasses requires that developers meet users where they are today: on their smartphones. Dual Render Fusion helps developers introduce users to spatial concepts while taking full advantage of existing smartphone interactions.
Dual Render Fusion was applauded by several early adopters, including Adam Sanche, Senior Developer at Scope AR, who said, “All the familiar interactions that you would have on your phone are still there, and you get the added benefit of being able to see 3D content or whatever you want, in the context of your actual surrounding space.”
Inga Petryaevskaya, CEO and Founder of ShapesXR, stated: “The fact that I use the phone that I use, like this, all the time makes it so clear and natural for me. Now I can see how consumer adoption can happen.”
Layer XR onto your Mobile App Today!
Ready to start developing with Snapdragon Spaces and Dual Render Fusion? Get started in three easy steps:
- Create a new account on the Snapdragon Spaces Developer Portal – it’s quick to sign up and free.
- Head over to the Snapdragon Spaces documentation and download the latest Snapdragon Spaces packages.
- Check out the Dual Render Fusion topic to learn more about setup.