Adobe’s Project SonicScape lets VR film editors see sound

Adobe is hosting its MAX conference this week — and, as usual, it’s using the event to show off some of its latest experiments. The company even reserves a special keynote for its wildest demos, but it apparently has so much to show this year that it decided to preview one prototype early: Project SonicScape.

The problem it addresses is straightforward: how can VR/AR editors best edit sound in a 3D environment? The solution is as elegant as it is simple: visualize the audio right in the 3D space. “Project SonicScape takes the guesswork out of the immersive content editing experience by visualizing where the audio is, its frequency and intensity, and therefore making it that much easier to bring immersive content to life,” an Adobe spokesperson told us.

Given that this is very much a visual project, your time is probably best spent watching Adobe’s video explainer below. It’s worth noting, though, that the company has made heavy investments in 360-degree video editing. Among other things, it acquired Mettle’s SkyBox tools earlier this year and made Mettle co-founder Chris Bobotis its “Director of Immersive.”