Capturing and crafting immersive content requires specialist tools. Dick Hobbs looks at some of the latest technology for creating VR, AR and 360-degree video. 

VR – virtual reality – and AR – augmented reality – tend to get lumped together, even though they are very different things.

AR can be a confusing term, because in broadcast and related media we tend to talk about it as a means of augmenting video, like having the state of the polls pop up from a desk in an election night special, or players appearing on a virtual pitch in a sports studio.

But AR is also used to refer to a digital overlay on a live picture to offer guidance or more information – step by step turns as you walk around an airport, for example.

NAB hosted a dedicated area on the show floor for the latest developments in VR, and there were a lot of new offerings in 360-degree cameras.

Samsung VR camera


Kandao had a platform based on six cameras. Variations were available for high resolution (8K) and high frame rate (up to 120 fps). Setup was minimised by using a single Ethernet port for both signal and power.

Samsung has a range of VR cameras, including a sophisticated new device with 17 cameras: eight pairs plus one on the top pointing vertically upward. The creative tools are designed to complement its Gear headset.

The Yi Halo is quite a substantial device; the Vuze+ from Human Eyes is a very neat package while delivering competent 3D surround performance.

Yi Halo VR camera


In all cases, though, demonstration material was available on the booths, but there was little clear idea of the direction 360-degree video is likely to take. Most stands talked about training and tourism as likely applications. A couple specifically mentioned firefighter training, along with medical applications.

On one booth I met a producer/director with real experience in creating content. John Iverson of VR Productions was honest and open about prospects.

“In large part it’s going to be educational,” he said.

“We are already seeing it used for PTSD [post-traumatic stress disorder] patients and autistic children. For the latter, it helps teach them day to day tasks like laundry and using an ATM.

Human Eyes Vuze VR camera


“But I think the most significant applications in the long term are the places where you can have an impact,” he continued. “Colorado State University has done a project to train a paraplegic’s brain to control an exoskeleton, giving movement where there was none.

“If you are going to use VR, you have to create situations which need to be immersive. If you can’t do that, then do it on film,” was his conclusion.

Miguel Angel Doncel of post specialists SGO said his company had only launched Mistika VR, its top-end stitching and finishing tool for virtual reality, last year, but it already has more than 1000 active licences.

“VR is an amazing community,” he said. “There are a lot of projects in other markets – a lot of things that cannot be done with other formats.”

Before leaving VR, a word about audio. One of the surprises as I walked the NAB halls was discovering that the Nokia Ozo – famously discontinued barely a year after establishing itself as a market leader – is still going strong. Or at least its audio development team is still active, developing immersive audio applications under the Ozo brand.

“Audio is critical for immersive video,” Emmanuelle Garnaud-Gamache of French research body B<>com said. “At the moment the quality of VR video is not good – but the sound is worse.”


Drawing on work on object-based audio as part of the EU Orpheus project, B<>com has developed a plug-in for audio post tools, giving sound designers a high order ambisonics panner to accurately place individual sound sources. It was demonstrating a remix of an existing soundtrack, which created a very convincing dynamic 3D sound field, accurately tracking the field of view and adding to the visuals.
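B&lt;&gt;com's plug-in itself is proprietary, but the core of any ambisonics panner is the standard B-format encoding equations, which turn a mono source plus a direction into a set of channel gains. A minimal first-order sketch (an illustration of the general technique, not B&lt;&gt;com's implementation):

```python
import math

def encode_first_order(sample: float, azimuth: float, elevation: float):
    """Encode a mono sample into first-order ambisonic (B-format) channels.

    azimuth and elevation are in radians. Because the whole sound field is
    stored as these channels, it can later be rotated to track the
    listener's field of view before decoding to speakers or headphones.
    """
    w = sample * (1.0 / math.sqrt(2.0))                   # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return w, x, y, z

# A source directly ahead (azimuth 0, elevation 0) excites only W and X.
w, x, y, z = encode_first_order(1.0, 0.0, 0.0)
print(round(w, 3), round(x, 3), round(y, 3), round(z, 3))
```

Higher-order ambisonics, as used in the B&lt;&gt;com demonstration, adds further spherical-harmonic channels to the same scheme, sharpening the spatial resolution of each placed source.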

Winner of two IBC Awards last year was Groupe Média TFO in Toronto for its pioneering work combining the Unreal graphics engine from Epic Games with broadcast applications. This seems to be a common choice of platform for today’s augmented reality solutions.

Brainstorm’s InfinitySet 3 had recently been showcased around the world, as it was used to create a dynamic night sky for the opening ceremony of the Pyeongchang Olympics. These hyper-realistic images had to be generated in real time in 4K to match the action on the night.

Brainstorm augments the night sky in Pyeongchang


Just as Brainstorm has embraced the Unreal engine, so too has Ross Video, which extended its augmented reality capabilities, not least to include realistic reflections of live video in the virtual elements.

Avid is now moving into this market, and was demonstrating a new package on a corner of its vast booth.

The company was emphasising integration in its solution, with both a virtual environment rendered in the Unreal engine and graphical overlays and augmentation using established Avid graphics products, all managed from a single controller. It will also be fully integrated into the Avid production system, so the graphics could be driven from an iNews system, for example.

Perhaps the most critical element in creating convincing augmented reality is that the real and virtual elements must stay locked together, whatever the camera does. Avid elected to develop its own tracking, adding a very sci-fi array of pulsing infra-red emitters to the back of the camera, which are picked up by spotter cameras in the lighting grid.
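Whatever the tracking hardware, the reason a tracked pose keeps real and virtual elements locked is simple geometry: each frame, the virtual object is re-projected through the camera's reported pose, so it lands at the right pixel even as the camera moves. A deliberately simplified pinhole sketch (pan only; real trackers like Avid's also supply tilt, roll and lens data), not Avid's implementation:

```python
import math

def project_point(point, cam_pos, pan_rad, focal_px, cx, cy):
    """Project a world-space point through a pinhole camera that has
    panned by pan_rad about the vertical axis.

    Returns pixel coordinates (u, v). Feeding this a fresh tracked pose
    every frame is what keeps a virtual graphic locked to the real set.
    """
    # Move the point into camera space: translate, then undo the pan.
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    xc = math.cos(-pan_rad) * dx + math.sin(-pan_rad) * dz
    zc = -math.sin(-pan_rad) * dx + math.cos(-pan_rad) * dz
    yc = dy
    # Pinhole projection onto the image plane.
    u = cx + focal_px * xc / zc
    v = cy - focal_px * yc / zc
    return u, v

# A point 5 m straight ahead of an un-panned camera lands at image centre.
print(project_point((0, 0, 5), (0, 0, 0), 0.0, 1000, 960, 540))
```

Any error between the reported pose and the true one shows up immediately as the graphic sliding against the background, which is why tracking accuracy dominates how convincing the result looks.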

Avid’s augmented reality demonstration featured its infra-red LED camera tracking


Arguably the most impressive demonstration of the prospects for augmented reality was at Ncam. Starting as a camera tracker – technology which is still at the heart of its capabilities – it has now added a couple of very clever functions.

First is the ability to sense depth as well as the six degrees of freedom of camera movement (X, Y and Z position; pan, tilt and roll). Depth perception means that augmented reality can be multi-layered, allowing presenters to walk around virtual objects and interact with them.
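How Ncam senses depth is its own technology, but the per-pixel occlusion decision that depth enables is straightforward to sketch: wherever the real scene is closer to the camera than the virtual element, the camera pixel wins. An illustrative sketch, not Ncam's algorithm:

```python
def composite(real_px, virt_px, real_depth, virt_depth):
    """Per-pixel occlusion for depth-aware AR.

    Keeps the real camera pixel wherever the real scene is closer than the
    virtual element, so a presenter standing in front of a virtual object
    correctly occludes it.
    """
    return [
        r if dr < dv else v
        for r, v, dr, dv in zip(real_px, virt_px, real_depth, virt_depth)
    ]

# Four pixels: a presenter (2 m away) covers the left of a virtual prop
# (3 m away); open studio (10 m) lies behind it on the right.
out = composite(["cam", "cam", "cam", "cam"],
                ["gfx", "gfx", "gfx", "gfx"],
                [2.0, 2.0, 10.0, 10.0],
                [3.0, 3.0, 3.0, 3.0])
print(out)  # ['cam', 'cam', 'gfx', 'gfx']
```

Without per-pixel depth, the virtual element can only sit wholly in front of or wholly behind the talent, which is why depth sensing is what makes walk-around interaction possible.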

The second innovation is called Real Light, and it captures the lighting on the real part of the scene and matches it on the virtual graphics in real time. Lighting changes, reflections and shadows are all matched, making the augmented environment more convincing.