Given the amazing capabilities and advances in today’s smartphones, have you ever wondered what the next big evolution of these devices will look like? hubraum, a tech incubator run by Deutsche Telekom, recently teamed up with Qualcomm Technologies, Inc. (QTI), Nreal, Unity, and others to answer this question through their Mixed Reality Demo Day event. The key theme throughout the event was the spatial Internet, or the spatialization of the mobile Internet: the next evolution in computing. In this evolution, users will transition from looking down at the Internet through 2D screens to being immersed in the Internet via extended reality (XR) headsets. Through this immersive medium, the screen disappears and the world becomes your desktop.
This concept, also referred to as spatial computing, is made possible by the convergence of three key technologies: XR, on-device AI, and 5G. Unity’s statement at the event elegantly sums up the significance of this: “We’re moving towards a future where we’ll pull our phones out less frequently because our AR glasses will do more for us. We predict that spatial computing will be at the forefront of how we live and work in the future.”
Let’s look at how XR and spatial computing are poised to revolutionize how we access the Internet. Then we’ll explore some technologies from QTI that can help you build immersive solutions.
Several factors are driving spatial computing. Remote meetings have taken center stage as the pandemic forced us to find alternate methods of collaboration. While virtual meetings are convenient, our growing reliance on them has created the desire to replicate the close, personal interactions of in-person gatherings.
On the device side, powerful mobile processors now make it possible to pack XR experiences into smaller devices than ever before (e.g., the Nreal Light AR glasses, Lenovo ThinkReality A3, etc.). Features like eye tracking and hand tracking can drive the eye and hand movements of virtual avatars, or control positioning and interaction with spatial AR user interfaces. Meanwhile, advances in 3D audio and 3D user interface design round out the immersive experience. Finally, high-bandwidth, low-latency communications over 5G provide ultra-fast cloud connectivity to drive these experiences.
Many consider XR to be THE killer app for 5G, given its need for high throughput with immersive 3D content and low motion-to-photon latency for responsive interactions and reduced cybersickness. The demands of XR far exceed those of typical streaming apps (e.g., movie streaming) due to the processing load of high-resolution XR graphics, which involves perception transmission, processing, rendering, encoding, warping, etc. And all of this must happen in near real-time. Numerous examples at hubraum’s event showed how developers fulfill these requirements and our vision of Boundless XR. Below are a few notable use cases:
- Holographic avatars: with the need for more personal, realistic, and collaborative remote meetings, virtual avatars are now being used to represent meeting participants and even personal assistants. These technologies can capture aspects of unspoken communication such as facial expressions and micro gestures. At the same time, voice-controlled AI-driven assistants with similar realistic features demonstrate where smart audio solutions may be headed (e.g., evolving from voice assistant speakers to full-blown holographic virtual assistants). In addition to avatars, virtual holographic objects are also possible. For example, participants can remotely collaborate on a new product design by viewing it as a virtual object from different angles in a meeting.
- e-Commerce: interactive and immersive digital storefronts are now possible thanks to XR. Users can look at products through AR glasses to obtain more information or examine virtual products from different angles.
- Personal object recognition: AR glasses with image capture and recognition technology can record real objects and later identify them when they enter the user’s field of view. For example, a user can record what a fastener of a specific size and thread count looks like, then quickly identify matching fasteners when assembling products.
- Immersive books: textbooks and kids' storybooks can virtually come to life as characters and scenes immerse readers in AR.
- Enhanced TV: TV viewed through AR glasses now offers the capability to display additional information along with the TV program. For example, additional stats about a sporting match can be displayed on each side of the TV in AR.
- Live Gaming: remote players can compete against each other in real-life games like basketball while wearing AR glasses. In conjunction with computer vision technology, the glasses are designed to automatically detect when each player scores and display the game’s stats.
- Immersive Location Tracking: AR, in conjunction with GPS and other sensors, can display virtual menus for locations of interest (e.g., restaurants, etc.) that fall within the user’s field of view. The user can then interact with these menus to gain more information such as directions, ratings, etc.
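The location-tracking use case above boils down to a simple geometric test: compute the compass bearing from the user to each point of interest, then check whether that bearing falls within the glasses’ horizontal field of view around the user’s heading. Here is a minimal sketch of that test in Python; the 52° default FOV and the function names are illustrative assumptions, not part of any QTI or Nreal API.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def in_field_of_view(user_heading_deg, poi_bearing_deg, fov_deg=52.0):
    """True if a POI's bearing lies within the horizontal FOV centered on the heading."""
    # Wrap the angular difference into [-180, 180) so 359° and 1° are 2° apart.
    diff = (poi_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

In a real application this check would run per frame against nearby POIs, with the heading fused from the magnetometer, IMU, and visual tracking rather than GPS alone.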
XR Tools for Developers
QTI is no stranger to XR, with over a decade of XR research stretching back to 2009.
Since then, we’ve launched several XR platforms and XR-specific development kits like the Snapdragon® XR2 HMD Reference Design for the Snapdragon XR2 5G Platform, which helps power XR experiences in products like the Oculus Quest 2. Developers can also build lightweight and power-efficient AR glasses tethered to 5G smartphones over USB-C, such as those powered by our flagship Snapdragon 888 5G Mobile Platform.
On the content creation side, it’s no secret that we have been collaborating with Unity for several years to drive mobile game and XR development. Unity is now pushing the boundaries of XR for other verticals as well. As content remains a huge aspect of XR development, content creators should check out the Unity MARS XR framework.
Finally, developers building XR applications for Snapdragon platforms can take advantage of our rich tools and SDKs, including:
- Qualcomm® 3D Audio Tools: tools to record, edit, and produce spatial audio.
- 3D Audio Plugin for Unity: binaural spatial audio plugin for Unity.
- Snapdragon Power Optimization SDK: provides a rich API for balancing power versus performance.
- Snapdragon Profiler: allows developers to analyze CPU, GPU, DSP, memory, power, thermal, and network data.
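To give a feel for what binaural spatial audio tools like those above compute under the hood, here is a sketch of two classic cues used to place a sound at a horizontal angle around the listener: the interaural time difference (Woodworth's spherical-head approximation) and constant-power left/right gains. This is a generic textbook technique for illustration only; it does not represent the API or algorithms of the Qualcomm 3D Audio Tools or the 3D Audio Plugin for Unity, and the head-radius constant is an assumed average.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C
HEAD_RADIUS = 0.0875    # m, assumed average human head radius

def itd_seconds(azimuth_deg):
    """Interaural time difference (Woodworth model) for a source at the
    given azimuth: 0 = straight ahead, +90 = hard right, -90 = hard left."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def constant_power_gains(azimuth_deg):
    """(left, right) channel gains for constant-power panning across the
    frontal plane, azimuth in [-90, +90]. Total power stays at 1."""
    pan = math.radians((azimuth_deg + 90.0) / 2.0)  # map [-90, 90] -> [0, 90] deg
    return math.cos(pan), math.sin(pan)
```

A renderer would apply the ITD as a fractional-sample delay on one channel and scale each channel by its gain; full binaural engines go further, convolving with head-related transfer functions (HRTFs) to capture spectral cues as well.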
We’re entering an age where the Internet is no longer just a resource viewed through 2D screens. Spatial computing technologies are paving the way for immersive interactions with the Internet, accessible anywhere and experienced in new ways. Stay tuned, as we’re constantly evolving our XR technologies and are planning some exciting XR announcements soon.
For additional information about XR and spatial computing, check out Snapdragon XR1 Platform Makes It Real for Developers, Tips for Enhancing XR Experiences, and How To Guide: Developing for Immersive Realities. Also, be sure to check out the Nreal Light Developer Kit and the Oculus Developer site.
If you have an interesting spatial computing/XR project that uses technologies from QTI, let us know about it, and we may feature it on QDN!
Snapdragon and Qualcomm 3D Audio are products of Qualcomm Technologies, Inc. and/or its subsidiaries.