Immersive Sound In VR

Hi everybody, my name is Linus and I work as Audio Director at Resolution Games. More practically speaking, it means that I am responsible for all the music and sounds in our games and trailers. I handle sound design, music composition and audio implementation in Unity, from an early draft on the piano to a fully implemented orchestrated score in-game. I have been with Resolution since the beginning and even worked on Solitaire Jester and Bait!. A lot has happened in the VR industry since 2015.


Previously, I worked in game and film audio, but I was surprised that VR entailed so many unique challenges. These included figuring out how to make a VR game at all, since many established game conventions, such as locomotion (moving around in a game), building UI, and working with animations and cut scenes, were suddenly obsolete for this medium.

I definitely had the feeling, and so did my colleagues, that we were at the forefront of a new frontier. We were game makers/settlers in a new world without conventions. That first year was a rebirth of everything we thought we knew about making games. From my perspective as a sound designer and composer, but also as a former film music student, the discussions reminded me somewhat of the early days of the film industry. Thoughts such as, “if there is music there must be a radio, a stereo, or a band visible in the picture to make sense. How would the viewer otherwise know where the music is coming from?” And indeed, in Solitaire Jester we have a radio with positioned audio playing the music diegetically, “in the world”. That sort of thinking colored our early experimentation in the VR space. We believed that all sounds needed a visual anchor to make sense.

The main difference between an ordinary game and a VR game is that you are actually there in the world, in person. You aren’t looking at the world through a first- or third-person view on a screen, so the old tricks suddenly weren’t good enough.

We then moved on to our second game, Bait!. This title demanded different soundscapes depending on the environment. I quickly discovered that the basic ambience in the bottom layer, what we usually call a room tone, did not feel great in plain stereo. It is not uncommon to use a spatially static room tone, in stereo or surround, in flat screen games and movies. After some experimentation I found a neat but powerful solution: split a stereo ambience into two mono tracks and position them to the left and right of the camera/head, following the head’s position but not its rotation. When you rotate your head, the ambiences stay in place, yet they are always at the same distance from the head. I also never let the ambiences play entirely in 3D; I always leave a few percent of the mix in 2D.
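To make that concrete, here is a minimal sketch of how such an ambience anchor could look as a Unity script. The component name, fields and offsets are my own illustrative choices rather than our production code: it creates two looping mono emitters that follow the head’s position but ignore its rotation, and leaves a little of the mix in 2D via spatialBlend.

```csharp
using UnityEngine;

// Illustrative sketch: two mono ambience emitters anchored to the left and
// right of the listener's head. They follow the head's position but not its
// rotation, so turning your head shifts the perceived stereo image.
public class AmbienceAnchor : MonoBehaviour
{
    public Transform head;            // the VR camera / HMD transform
    public AudioClip leftAmbience;    // left half of the split stereo ambience
    public AudioClip rightAmbience;   // right half of the split stereo ambience
    public float sideOffset = 2f;     // distance from the head on each side, in meters
    [Range(0f, 1f)]
    public float spatialBlend = 0.9f; // mostly 3D, with a few percent left in 2D

    AudioSource left;
    AudioSource right;

    void Start()
    {
        left = CreateEmitter(leftAmbience);
        right = CreateEmitter(rightAmbience);
    }

    AudioSource CreateEmitter(AudioClip clip)
    {
        var go = new GameObject("AmbienceEmitter");
        var source = go.AddComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;
        source.spatialBlend = spatialBlend; // 0 = fully 2D, 1 = fully 3D
        source.Play();
        return source;
    }

    void LateUpdate()
    {
        // Follow the head's position only; world axes, not head rotation,
        // decide what counts as "left" and "right", so the emitters stay
        // put when you turn your head.
        left.transform.position = head.position + Vector3.left * sideOffset;
        right.transform.position = head.position + Vector3.right * sideOffset;
    }
}
```

Attach it to any object, point the head field at your camera rig’s head transform and assign the two mono clips; the exact offset and 2D/3D blend are worth tuning per scene.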

There are usually a few other positioned sounds in the world, like a bird squawking, leaves rustling in a nearby tree, a dynamic wind, or the waves from the sea. This approach lets the ambiences and your game environment feel alive and trigger an ever so slight pressure change on your eardrum, mimicking how our ears experience the real world when we turn our head. Of course, some other games use this approach too, but the immersive impact and sense of presence in VR can be breathtaking for a first-time user and are essential for building great experiences.


I use a similar, but slightly different, approach for music. In flat screen games or film, the music is static in stereo or surround, much like the ambiences. When using non-diegetic music in VR games (music that is not part of the fictional world), I place the stereo audio source/emitter a few meters in front of the player and usually a few meters up in the air. Then I use a similar trick of putting the audio source/emitter at, say, 70% 2D and 30% 3D. Voila: non-diegetic music that still gives a sense of presence and immersion. It packs a punch as a 2D mix, but also creates that slight pressure change on the eardrum that heightens the sense of being in the world. In the beginning of my time at Resolution, I tried to use the same approach for the music as for the ambiences. I would break the stereo file down into two mono tracks and position them on each side of the head with overlapping attenuation curves, but it did not pan out that well; the music lost some of its weight and punch in the game.
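A rough sketch of the same idea for non-diegetic music might look like this in Unity. Again, the component name, the offsets and the exact 70/30 split are placeholders to illustrate the setup, not our actual implementation; this version simply anchors the emitter a few meters ahead of and above the head’s starting position.

```csharp
using UnityEngine;

// Illustrative sketch: a non-diegetic music emitter placed a few meters in
// front of and above the listener, mixed mostly in 2D with a touch of 3D.
[RequireComponent(typeof(AudioSource))]
public class MusicEmitter : MonoBehaviour
{
    public Transform head;            // the VR camera / HMD transform
    public float forwardOffset = 4f;  // meters in front of the player
    public float heightOffset = 3f;   // meters up in the air
    [Range(0f, 1f)]
    public float spatialBlend = 0.3f; // 70% 2D, 30% 3D

    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.spatialBlend = spatialBlend; // 0 = fully 2D, 1 = fully 3D
        source.loop = true;

        // Anchor the emitter ahead of and above the head's starting position,
        // ignoring head pitch so the music doesn't end up at floor level.
        Vector3 forwardFlat = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        transform.position = head.position
                           + forwardFlat * forwardOffset
                           + Vector3.up * heightOffset;

        source.Play();
    }
}
```

Keeping the blend mostly in 2D is what preserves the weight and punch of the stereo mix, while the remaining 3D portion adds the subtle positional cue.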

Those are some of the lessons I learned when I first started implementing sound in VR games. I’ll be back soon to talk about animation, or the lack of it, in VR games, the challenges of real-time player-triggered movements, velocity-sensitive impact sounds, and speed-triggered whooshes.

If you’d like to listen to some of the songs I’ve created for our titles, you can find them on Spotify or on your favorite streaming platform. If you’d like to work with me, you can find our available jobs here.
