With all the work you’ve put into making your extended reality (XR) projects in Unity as immersive as you can, don’t you want audio that keeps up? After all, your users’ eyes are basking in six degrees of freedom (6DoF); shouldn’t their ears get the same thrill? The game looks different when they turn their head or move their eyes, and it should sound different, too.
We’ve just released the Qualcomm® 3D Audio Plugin for Unity, a binaural, spatial audio plugin you can use in creating XR (including AR and VR) experiences and games. The plugin makes it as easy as possible for you to add lifelike audio to lifelike video playing in VR headsets, so that sound objects follow your users the way they do in the real world. And, if those headsets are powered by the Qualcomm Snapdragon™ 835 or Snapdragon 845 mobile processor, you can offload work from the CPU to the DSP for battery savings.
How can you build cooler XR with the Qualcomm 3D Audio Plugin for Unity?
Whether you’re starting a new XR project or you want to convert hundreds of monaural sounds to spatial ones in an existing project, the plugin makes its audio features available to you right in the Unity interface.
- Up to 64 simultaneous, spatialized sound objects - Suppose you’re taking your user through a jungle. Insects chirp, birds and animals call, leaves rustle and vegetation crunches underfoot. Each of these elements in your XR experience can be its own sound object, with its own position and its own sound.
- Two simultaneous, Ambisonic soundfields - Ambisonics is a way of using first-order spherical harmonics to encode and decode any number of sounds in such a way that the entire soundscape appears to follow your head around, not only from side to side but also up and down and front to back. In the jungle example, the plugin provides Ambisonic soundfields that make for a realistic, surrounding ambience alongside the spatialized sound objects you’ve added. In general, building a realistic ambience from Ambisonics costs less in both runtime performance and development time than building it from many separate spatialized sound objects.
- High-quality sound and low motion-to-sound latency - Any delay between what users see and what they hear is the bane of XR developers. The plugin is designed to reduce latency between audio and video. It also offers direct rendering designed to deliver a high signal-to-noise ratio and lifelike directionality.
- Customizable shoebox reverb - Sounds reverberate differently off different surfaces (wood, stone, metal). The plugin lets you allow for changes in echo and sound reflection when moving, say, from a narrow, concrete tunnel into a wide-open landscape.
- Digital signal processor (DSP) support - On devices running the Snapdragon 835 or Snapdragon 845 mobile platform, you have the option of offloading reverb to the DSP to reduce power consumption and save CPU cycles for other operations.
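To make the jungle example above concrete, here is a minimal sketch using Unity's standard audio API (not the plugin's own, which is not documented here): assuming the 3D Audio Plugin has been selected as the project's spatializer plugin in the audio project settings, marking an `AudioSource` as spatialized routes its rendering through that plugin. The class name and clip field are hypothetical.

```csharp
using UnityEngine;

// Illustrative sketch: attach a looping insect-chirp clip to a scene
// object so its sound is rendered at that object's position by the
// spatializer plugin selected in the project's audio settings.
public class JungleSoundObject : MonoBehaviour
{
    public AudioClip chirpClip;  // a mono clip, as spatializers expect

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = chirpClip;
        source.spatialize = true;    // hand rendering to the spatializer plugin
        source.spatialBlend = 1.0f;  // fully 3D: position drives panning and attenuation
        source.loop = true;
        source.Play();
    }
}
```

Each such component makes one of the up-to-64 spatialized sound objects; placing several on insects, birds and foliage builds the jungle scene described above.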
Those features add up to a credible audio experience that matches the visual experience in your XR games and apps.
Developing with the 3D Audio Plugin
The plugin works with Unity versions 2017.1.0f3 and 2017.2.0f3 on Android, Windows 7 (32-bit or 64-bit) and Windows 10 (64-bit).
A big part of Unity’s success, of course, has been its model of allowing you to develop your game once on a Windows, macOS or Linux workstation, then deploy it to platforms as varied as PS4, Xbox 360, Android and more. In keeping with that model, you can maintain a single Unity project when you use the 3D Audio Plugin; there’s no need to maintain separate versions of your project with and without the plugin. On unsupported platforms you can disable the plugin; it consumes a negligible amount of resources when disabled.
In a large project with hundreds of sounds, adding features like spatialized audio, reverb and Ambisonic soundfields would require a lot of manual conversion work. You could automate that work, but automation would entail yet another development effort. The 3D Audio Plugin converts any number of audio sources in an existing project, even if the project was not intended for use in VR or with spatialized audio. You use the plugin to create a Q3DAudioGlobalSettings object, which can automatically detect audio sources that have mono audio clips, then convert them to 3D Audio sound objects.
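As a rough illustration of the detection step that the Q3DAudioGlobalSettings object automates (this is not the plugin's actual code), the following editor-style sketch walks every `AudioSource` in the scene and flags the ones whose clips are mono, i.e. the candidates for conversion to 3D Audio sound objects:

```csharp
using UnityEngine;

// Hypothetical sketch of the mono-source detection the plugin automates:
// find every AudioSource whose clip has a single channel.
public class MonoSourceFinder : MonoBehaviour
{
    void Start()
    {
        foreach (AudioSource source in FindObjectsOfType<AudioSource>())
        {
            // AudioClip.channels is 1 for mono clips
            if (source.clip != null && source.clip.channels == 1)
            {
                Debug.Log("Mono source eligible for conversion: " + source.name);
            }
        }
    }
}
```

In practice you would not write this yourself; the point of Q3DAudioGlobalSettings is that the scan and the conversion happen for you, across hundreds of sounds at once.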
At build time you have the option of running on the Arm CPU or offloading to the DSP. That means you don’t need to take our word for the battery savings from the DSP; you can build your project both ways and measure for yourself which runs better.
Have you started developing with the Qualcomm 3D Audio Tools suite? If so, you’ll be glad to know that we’ve created the plugin in conjunction with version 2.0 of that suite, and that we align our audio software releases to VR headset and chipset releases.
The way you design sound in your projects is an important part of making credible, lifelike, 6DoF experiences and games, so spatialized audio can be a big differentiator in your XR development.