New techniques for capturing and delivering audio will be part of even more immersive experiences, writes Amelia Kallman.

Remember the last time you were at a concert, having paid a week’s wages to experience your favourite music live?

While the performers may have been directly in front of you, chances are you heard the vocals and instruments as a single mix pushed out from stacks of speakers on either side of the stage, leaving a disconnect between you and the artists.

Now, innovations in audio technology are working to close this gap between where a sound appears to come from, what we see and what we hear.


From 360-degree film and virtual reality, to live events, home cinemas and headphones, we are witnessing the start of an audio revolution that is changing the future of how we hear and listen.

With the quality of visual and interactive aspects of virtual reality and 360-degree film improving, realistic 3D soundscapes are quickly becoming a new requirement.

While the eyes can only see up to 200 degrees at a time, the ears listen in 3D, distinguishing sounds directionally from all angles.

Synchronising the viewing and audio experiences in a spatialised way makes virtual environments that much more immersive. 3D audio allows sound designers and composers to create fully immersive environments by placing and moving sound anywhere around a 360-degree sphere.

Google is helping to democratise development in this area, having recently released a new audio software development kit, Resonance Audio, specifically to make 3D audio design for 360-degree video, VR and AR easier to achieve. Spatial audio has traditionally been difficult to render in real time, but Google says Resonance Audio is lightweight enough to deliver high-fidelity 3D audio even on mobile phones.

“All art is about escapism and new technologies are changing our perception of time and space” – Roman Rappak

The secret is ambisonics, a full-sphere surround sound technique, combined with ‘head-related transfer functions’ (HRTFs): filters that map how a sound arriving from a given direction reaches each ear.

Replicating the way the brain perceives sound in space means you can program distance, height and orientation, so that volume and the differences between the left and right ears change as the listener moves.
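As a rough sketch of that idea, the Python below encodes a mono signal into first-order ambisonic B-format for a chosen direction, then decodes it to stereo with two simple ‘virtual microphones’. The function names and the crude cardioid decode are illustrative assumptions; a real binaural renderer would convolve the sound field with measured HRTFs and update it with head tracking.

```python
import numpy as np

def encode_first_order(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order ambisonic B-format (W, X, Y, Z).

    Traditional convention: W = S / sqrt(2); azimuth is measured anticlockwise
    from straight ahead, elevation upwards from the horizon.
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    w = mono / np.sqrt(2.0)             # omnidirectional component
    x = mono * np.cos(az) * np.cos(el)  # front-back
    y = mono * np.sin(az) * np.cos(el)  # left-right
    z = mono * np.sin(el)               # up-down
    return w, x, y, z

def decode_to_stereo(w, y):
    """Crude stereo decode: two virtual cardioid microphones at +/-90 degrees.

    The X (front-back) component does not contribute for mics aimed straight
    left and right, so only W and Y are needed here.
    """
    left = 0.5 * (np.sqrt(2.0) * w + y)    # cardioid aimed left
    right = 0.5 * (np.sqrt(2.0) * w - y)   # cardioid aimed right
    return np.stack([left, right], axis=-1)

# Example: a 1 kHz tone placed 45 degrees to the listener's left, slightly above.
sr = 48000
t = np.arange(sr) / sr
tone = 0.2 * np.sin(2 * np.pi * 1000 * t)
w, x, y, z = encode_first_order(tone, azimuth_deg=45, elevation_deg=20)
stereo = decode_to_stereo(w, y)
```

Head tracking then amounts to rotating the encoded sound field (or re-encoding with the source direction expressed relative to the listener’s head), which is why what each ear hears shifts as you turn.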

Similar simulations are now available for real-world environments as well. French manufacturer L-Acoustics has been specialising in the design and manufacture of professional sound systems since 1984. Today their aim is ‘immersive hyperrealism’. Their multichannel solution, L-ISA, is purpose-built to link cutting-edge sound technology with immersive art, and offers engineers an entire ecosystem of tools to manage sound environments in real time.

L-ISA’s latest product, Island, brings professional-grade acoustics to home cinemas and private environments. An oval piece of furniture with a large bed-like lounge in the centre, the structure rotates to face any direction.

Hidden within the elegantly designed casing is a 24 or 48-channel audio platform that envelops listeners and ‘goes beyond what a live concert can achieve’.

‘Ocean’ is its integrated sound solution for rooms dedicated to entertainment. One of the most striking features of Ocean is that the installation is practically invisible and completely unobtrusive: the speakers can be recessed and colour-matched into the walls, with no visible cabling.

Nuraphone is an earphone start-up that personalises listening. Every person hears differently, so instead of a one-size-fits-all headphone, Nuraphone uses a self-learning engine to measure your hearing and tune music and audio to your ears.
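Nuraphone does not publish its tuning method, but the general idea of fitting playback to a hearing profile can be illustrated with a simple per-band equaliser. In the hedged Python sketch below, the band edges, the example sensitivity profile and the inverse-gain rule are all assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical hearing profile: relative sensitivity per band (1.0 = reference).
BAND_EDGES_HZ = [20, 250, 1000, 4000, 8000, 16000]
SENSITIVITY = [1.0, 0.9, 0.8, 0.6, 0.7]   # e.g. this listener hears 4-8 kHz less well

def personalise(audio, sample_rate):
    """Boost each frequency band in inverse proportion to the listener's sensitivity.

    A crude stand-in for 'tuning music to a person's ear': bands the listener
    perceives as quieter receive more gain.
    """
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    for lo, hi, s in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:], SENSITIVITY):
        band = (freqs >= lo) & (freqs < hi)
        spectrum[band] *= 1.0 / s          # compensate lower sensitivity with gain
    return np.fft.irfft(spectrum, n=len(audio))

# Example: run one second of noise through the personalised curve.
sr = 48000
noise = np.random.default_rng(0).normal(0, 0.1, sr)
tuned = personalise(noise, sr)
```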

It also includes haptic technology so you can feel the music as you would at a live concert. It does this by splitting the signal, sending the melodic sounds to an in-ear speaker and the bass to an over-ear tactile speaker that delivers it through your skin.
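At the signal level, that split is essentially a crossover: low frequencies go to the tactile driver and everything above goes to the in-ear speaker. A minimal sketch follows, in which the 250 Hz crossover point and the filter order are assumptions chosen for illustration rather than Nuraphone’s actual values.

```python
import numpy as np
from scipy.signal import butter, sosfilt

CROSSOVER_HZ = 250   # assumed split point between bass and melodic content

def split_bass_and_melody(audio, sample_rate, order=4):
    """Split a signal into a low band (tactile driver) and a high band (in-ear speaker)."""
    low_sos = butter(order, CROSSOVER_HZ, btype='lowpass', fs=sample_rate, output='sos')
    high_sos = butter(order, CROSSOVER_HZ, btype='highpass', fs=sample_rate, output='sos')
    return sosfilt(low_sos, audio), sosfilt(high_sos, audio)

# Example: split one second of noise recorded at 48 kHz.
sr = 48000
signal = np.random.default_rng(1).normal(0, 0.1, sr)
bass, melody = split_bass_and_melody(signal, sr)
```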

These innovations in audio technology provide new opportunities for artists. The Miro Shot collective, the world’s first open-source band, incorporates 3D audio into their multi-sensory, cross-reality live shows. Veejayed in real time, the shows transport audiences from seeing the band play a multimedia rock concert to new worlds in virtual, mixed and sensory realities.

Lead singer Roman Rappak believes the impact of new technologies on music will mean experiences become more interactive.

“All art is about escapism and new technologies are changing our perception of time and space. We can exist in multiple contexts, across different timelines. Our show is an attempt at understanding this.”

Artist and musician Warsnare has just debuted his live show ‘Warchestra’ at the Albany Theatre.

The 360-degree surround-sound and visual performance was made in collaboration with audio specialists Call & Response and funded by Arts Council England. The show used 29 loudspeakers arranged in a dome roughly 10 metres in diameter, allowing sounds to be placed and moved around the sphere and adding a whole new compositional layer to his artistic expression.
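Moving a sound around an array like that comes down to computing a gain for every loudspeaker from the source’s direction. The sketch below uses simple pairwise constant-power panning across a horizontal ring of speakers; the layout and panning law are illustrative assumptions, and full-sphere systems typically use techniques such as VBAP or ambisonic decoding over the whole dome instead.

```python
import numpy as np

def ring_gains(source_az_deg, speaker_az_deg):
    """Constant-power pan a source between the two nearest speakers in a ring.

    speaker_az_deg: azimuths of the speakers in degrees (anticlockwise).
    Returns one gain per (sorted) speaker position; all but two are zero.
    """
    speakers = np.sort(np.asarray(speaker_az_deg, dtype=float) % 360.0)
    src = source_az_deg % 360.0
    gains = np.zeros(len(speakers))

    # Find the adjacent pair of speakers that brackets the source direction.
    idx = np.searchsorted(speakers, src) % len(speakers)
    lo, hi = (idx - 1) % len(speakers), idx
    span = (speakers[hi] - speakers[lo]) % 360.0
    if span == 0.0:
        span = 360.0
    frac = ((src - speakers[lo]) % 360.0) / span

    # Constant-power (sine/cosine) law keeps loudness steady as the source moves.
    gains[lo] = np.cos(frac * np.pi / 2)
    gains[hi] = np.sin(frac * np.pi / 2)
    return gains

# Example: sweep a source around an eight-speaker ring (one speaker every 45 degrees).
ring = np.arange(0, 360, 45)
for az in (0, 30, 90, 200):
    print(az, np.round(ring_gains(az, ring), 2))
```

Sweeping the source azimuth over time is what ‘moving a sound around the sphere’ boils down to at the signal level: each source becomes a continuously changing set of loudspeaker gains.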

Whether you are an artist, an engineer or just an average listener, one thing is for certain: the future is about to sound better than ever before.

Amelia Kallman is a futurist, speaker, journalist, author, nightclub owner and director of the first virtual reality burlesque show.