Google launches Resonance Audio, its new spatial audio SDK

As augmented reality slowly proliferates with the promise of bringing computer interaction into three-dimensional space, platform giants like Google are tasked with bringing every other sense into that space as well.

Today, Google is taking some of the tech from its VR Audio SDK and building it into a more comprehensive spatial audio product called Resonance Audio that works across mobile and desktop platforms.

At its core, Google wants the SDK to replicate “how real sound waves interact with human ears and with the environment.” It does this by accounting for how physical objects and environments distort the sound we hear in real life, and replicating those distortions in virtual scenarios.

If you’re a virtual character walking around holding a boombox, how does the sound differ when you’re playing some tunes in an open field versus walking down a stairwell? What happens when an object comes between you and the source of the sound? These are the scenarios Resonance Audio attempts to address, and it lets developers model them in as much depth as they need.
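One common way engines approximate that occluding-object scenario is to low-pass filter the source, so it sounds muffled when something blocks the direct path. The sketch below is purely illustrative of that idea, not the SDK's actual occlusion model; the function name and the 0-to-1 `occlusion` parameter are assumptions for the example.

```python
def apply_occlusion(samples: list[float], occlusion: float) -> list[float]:
    """Crude occlusion model: a one-pole low-pass filter.

    occlusion = 0.0 leaves the signal untouched; values toward 1.0
    progressively strip high frequencies, making the source sound
    muffled, as if heard through a wall. (Illustrative only.)
    """
    a = 1.0 - occlusion  # smoothing coefficient: lower = duller sound
    out, y = [], 0.0
    for x in samples:
        y += a * (x - y)  # one-pole low-pass update
        out.append(y)
    return out

# A rapidly alternating (high-frequency) signal passes through untouched
# when unoccluded, but is heavily damped when something blocks the path.
clear = apply_occlusion([1.0, -1.0, 1.0], 0.0)
muffled = apply_occlusion([1.0, -1.0] * 4, 0.9)
```

The single `occlusion` knob is the kind of per-source control an engine can drive each frame from a cheap raycast between listener and source.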

Resonance lets developers not only specify the sources of sound within a scene but also shape how that audio projects directionally, so you don’t hear the same sound when you walk around behind a digital character as you do when you’re right in front of their face.
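Directional sources like this are conventionally described with a cardioid-family directivity pattern: a blend between an omnidirectional and a figure-eight response, raised to a sharpness exponent. Here is a minimal Python sketch of that standard formula; the function name and parameters are illustrative, not the SDK's API.

```python
import math

def directivity_gain(alpha: float, sharpness: float, theta: float) -> float:
    """Gain of a cardioid-family directivity pattern.

    alpha: 0.0 = omnidirectional, 0.5 = cardioid, 1.0 = figure-eight.
    sharpness: exponent that narrows the pattern.
    theta: angle (radians) between the source's forward axis and the listener.
    """
    return abs(alpha + (1.0 - alpha) * math.cos(theta)) ** sharpness

# A listener in front of a cardioid source hears it at full gain,
# while a listener directly behind it hears almost nothing.
front = directivity_gain(0.5, 1.0, 0.0)
behind = directivity_gain(0.5, 1.0, math.pi)
```

An engine evaluates a curve like this per frame from the source-to-listener angle, then scales the source's volume by the result.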

As game developers know, situations like the above may be simple to grasp but grow much more complicated when dozens of these audio interactions happen simultaneously. Because CPU resources are usually devoted to visuals first, audio often gets starved of processing power, which can lead to products shipping with only the most basic sound. Resonance aims to fix that with tricks such as pre-computing how certain sounds reverberate in different environments, so those interactions don’t have to be rendered on the fly.
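The pre-computation idea amounts to baking a room's acoustic response into an impulse response once, offline, so the runtime only pays for a convolution of the dry signal with that response. A minimal sketch of that runtime step, with a hypothetical hand-made impulse response standing in for a baked one:

```python
def convolve(dry: list[float], impulse_response: list[float]) -> list[float]:
    """Apply a pre-baked room impulse response to a dry signal.

    The impulse response is computed once per room ahead of time, so at
    runtime the engine only performs this convolution instead of
    simulating every reflection frame by frame. (Naive O(n*m) version
    for clarity; real engines use FFT-based convolution.)
    """
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h
    return out

# Toy impulse response: the direct sound plus one quieter echo
# arriving two samples later.
ir = [1.0, 0.0, 0.5]
wet = convolve([1.0, 0.0, 0.0, 0.0], ir)
```

Because the expensive room simulation happens at bake time, dozens of sources can share one room response without competing with the renderer for CPU.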

Resonance works with game engines like Unity and Unreal and has plug-ins available for a number of other editing suites, so it should fit in rather snugly with existing workflows.

Google seems to be steadily turning its interest in foundational VR and AR tech into tools with wide application to traditional game development. Last week, Google showed off Poly, a home for 3D assets and environments. Resonance Audio delivers a spatial sound SDK that promises to simplify how developers craft what we hear.