Virtual production offers filmmakers a real-time way of making movies and TV, with actors, lighting and VFX all shot live in-camera. With a host of new facilities launching in recent weeks and continued refinements to the technology, could it become integral to the way all films are made?

It’s fair to say that virtual production (VP) has come a long way since it was pioneered for on-set VFX previsualisation on films like Avatar and Hugo. Nowadays, the technology can be found not just in film production but also in high-end television and advertising, and as part of live event broadcasts.


Virtual production is evolving and becoming an industry standard used on projects of all sizes

The big change of recent times is down to a number of factors.

One is the incorporation of dedicated tools within game engines from Epic Games and Unity Technologies that are geared towards virtual production, in-camera VFX and real-time previsualisation as ends in themselves, rather than studios having to adapt gaming technology to fit. Another is the continued advance in processing power, especially in the graphics cards (GPUs) used for rendering very high-resolution content.

The other big development has been the use of LED walls, notably exploited on the Disney+ streaming series The Mandalorian and season three of HBO’s Westworld. Here, the output of the real-time engine, often a whole CG world, is displayed live on LED walls in the studio and matched to a camera tracking system, enabling ‘final-pixel’ imagery to be captured entirely in-camera.
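The role of the tracking data can be made concrete. The wall cannot simply be rendered head-on: the image must be projected from the physical lens’s point of view, which is the classic off-axis (asymmetric) frustum calculation. Below is a minimal Python sketch of that calculation, following Kooima’s generalised perspective projection, with a hypothetical wall size and tracked camera position; in a real volume the engine recomputes this every frame from the tracking feed.

```python
import numpy as np

def off_axis_frustum(eye, wall_bl, wall_br, wall_tl, near=0.1, far=1000.0):
    """Asymmetric frustum bounds (l, r, b, t, n, f) for a flat wall seen from 'eye'."""
    vr = wall_br - wall_bl; vr /= np.linalg.norm(vr)   # wall right axis
    vu = wall_tl - wall_bl; vu /= np.linalg.norm(vu)   # wall up axis
    vn = np.cross(vr, vu)                              # wall normal, towards the eye
    va, vb, vc = wall_bl - eye, wall_br - eye, wall_tl - eye
    d = -np.dot(vn, va)                                # eye-to-wall-plane distance
    l = np.dot(vr, va) * near / d                      # frustum extents scaled
    r = np.dot(vr, vb) * near / d                      # to the near plane
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return l, r, b, t, near, far                       # feed to a projection matrix

# Hypothetical: a 10m x 5.5m wall at z=0, tracked camera 4m back at lens height.
eye = np.array([0.0, 1.6, 4.0])
print(off_axis_frustum(eye,
                       wall_bl=np.array([-5.0, 0.0, 0.0]),
                       wall_br=np.array([ 5.0, 0.0, 0.0]),
                       wall_tl=np.array([-5.0, 5.5, 0.0])))
```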

Sky Studios, the creative home of Sky Originals, is using all of the above, according to its head of virtual production, Neil Graham.

“Game engines have been core to our adoption as they enable so much efficient work to be done in both pre-production (including pitch-viz, previz and tech-viz) and production phases, but we have been adopting new technology from pre-production through to LED capture with camera tracking,” he says. “It’s really a matter of finding, testing and using the right technology for each job, starting with the creative needs of the project.”

“We are using third-party shoot facilities at present,” he adds. “LED walls are being used to bring some location shots onto a controlled sound stage – locations that may otherwise not have been possible to travel a crew to. There are shots that are visually improved by shooting against an LED wall, such as car shots or motorbike shots and sequences with reflections in windows, metalwork, or helmets that can be captured in camera as opposed to being completed in post.”

New facilities
In June this year, Warner Bros. Studios Leavesden announced the launch of three sound stages, including V Stage, a new virtual production stage.

Warner Bros described V Stage as one of Europe’s largest virtual production stages, offering 24,000 sq ft of total space. Inside is a 7,100 sq ft wraparound virtual production environment built from a matrix of more than 2,600 LED panels integrated with a powerful, state-of-the-art processing system. Bespoke to the design is a dynamic ceiling offering a further 5,544 sq ft of LED panels in eight sections that lift and tilt independently of one another, providing a new level of creative scope.

“The launch of V Stage brings a completely new dimension to filmmaking at WBSL, providing an exciting environment in which to deliver a new level of creativity,” said Emily Stillman, SVP, Studio Operations, Warner Bros. Studios Leavesden.

Meanwhile, MARS Volume opened its doors in Ruislip, west London, on 2 August. As well as being a dedicated commercial facility for virtual production, it will also act as a centre of excellence for research and development into VP technologies and workflows. It has been designed and built by media technologists and visual pioneers Bild Studios.

The facility offers real-time VFX production tools and technologies within a 1,400m² space, with the option of a 270-degree in-camera LED screen of up to 38.5m x 5.5m and a 176m² out-of-camera ceiling LED screen.

MARS and Bild co-founder David Bajt said: “Virtual Production is an expansion of the traditional filmmaking playbook, enabling studios to pursue greater creative experimentation while controlling the time and cost of production… the opening of MARS represents the next logical step forward for creative leaders in film, high-end episodic TV, gaming, advertising, music, fashion and other creative industries.”

Also in the UK, a £1m virtual production studio is set to open its doors in Wakefield, Yorkshire later this year.

As reported by TVB Europe, The Centre for Virtual Production is set to be the first of its kind in Yorkshire, and will use extended reality technology to offer film and TV producers the ability to build fully immersive video sets.

Established VFX houses are also investing in the technology. “We’ve been exploring the Unreal gaming engine and LED walls/projection and their use in production in order to be able to give expert advice on shooting and VFX to our clients as part of our offering,” says Jean-Claude Deguara, VFX Supervisor & Joint Head of 3D at Milk. “They’ve given us the potential to get high-quality results very quickly in creating environments for previsualisations.”

“We came to the conclusion that a VFX studio of our size should be able to create content and assets that are usable with volumetric LED walls/stages and then reusable throughout the shot production process, and even in different software packages,” he continues. “We’ve created tools to be able to move assets and plug-ins effortlessly in and out of Unreal, but our focus is on creating the content and environments for virtual productions rather than driving the shoot on the volumetric screens.”
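Milk does not detail its in-house tools, but the round-tripping problem Deguara describes is commonly approached with a neutral interchange format such as USD, which Unreal and the major DCC packages can all read. Here is a minimal, hypothetical sketch of authoring an environment asset for that kind of hand-off using the open-source USD Python API; the scene layout and file name are invented for the example.

```python
from pxr import Usd, UsdGeom

# Author a simple environment asset that any USD-aware tool can import.
stage = Usd.Stage.CreateNew("desert_environment.usda")
root = UsdGeom.Xform.Define(stage, "/Environment")         # top-level transform
dune = UsdGeom.Mesh.Define(stage, "/Environment/Dune01")   # placeholder geometry
dune.CreatePointsAttr([(0, 0, 0), (10, 0, 0), (5, 2, 5)])  # a single triangle
dune.CreateFaceVertexCountsAttr([3])
dune.CreateFaceVertexIndicesAttr([0, 1, 2])
stage.SetDefaultPrim(root.GetPrim())
stage.GetRootLayer().Save()                                # ready for engine import
```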


Unreal Engine is being incorporated into third-party software, such as the disguise xR (Extended Reality) virtual production platform 

Global VFX studio Digital Domain has been using virtual production to varying degrees for a while now. “We are always curious about new hardware and software, but right now we’re focused on finding better ways to integrate what we already have into our pipelines,” says Scott Meadows, Digital Domain’s Head of Visualisation. “Game engines have countless uses for virtual production and a host of other uses in VFX, but they have their own ecosystem that pushes them more towards a traditional VFX pipeline. We’re working on ways to make that integration more far-reaching and comprehensive.”

“For us, it comes down to the ability to track motion capture data with timecodes, along with asset changes, then connect that data with our DCC software,” he adds. “That ensures we can iterate and track all the changes that occur on set. And, ultimately, that’s what makes a successful project.”
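As a rough illustration of the bookkeeping Meadows describes, the sketch below logs hypothetical on-set events against SMPTE timecode so they can later be queried from DCC-side tools; the event kinds and fields are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Timecode:
    h: int
    m: int
    s: int
    f: int
    fps: int = 24

    def to_frames(self) -> int:
        # Flatten HH:MM:SS:FF to an absolute frame count for easy comparison.
        return ((self.h * 60 + self.m) * 60 + self.s) * self.fps + self.f

@dataclass
class TakeLog:
    events: list = field(default_factory=list)

    def record(self, tc: Timecode, kind: str, payload: dict) -> None:
        self.events.append((tc.to_frames(), kind, payload))

    def between(self, start: Timecode, end: Timecode) -> list:
        # Everything that happened on set within a timecode window.
        a, b = start.to_frames(), end.to_frames()
        return [e for e in self.events if a <= e[0] <= b]

log = TakeLog()
log.record(Timecode(10, 2, 31, 12), "asset_change", {"prop": "helmet_v3"})
log.record(Timecode(10, 2, 33, 0), "mocap_start", {"performer": "A"})
print(log.between(Timecode(10, 2, 30, 0), Timecode(10, 2, 35, 0)))
```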

Virtual stages

Seen as an aid to sustainable production (there is less travel and less set building) and as economical in terms of cost and time, virtual production is fast becoming an accepted way to add visual effects. It even has its own terminology: ‘brain bar’ has become a commonly used term for both the line of workstations controlling the virtual production and the team running them.

Output for LED walls and on-set lighting is being controlled in concert via industry-standard protocols such as DMX. Remote collaboration, given a boost by pandemic restrictions, lets multiple artists log in to the virtual production to control the lighting and add or edit components of the CG environment on the walls in real time.
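To make the DMX point concrete: on hybrid stages, DMX control of this kind is typically carried over IP by a transport such as Art-Net. Below is a minimal sketch of pushing a lighting state to a node so a fixture tracks the wall content; the universe, channel mapping and node address are hypothetical, and the packet layout follows the published ArtDMX format.

```python
import socket
import struct

def artdmx_packet(universe: int, channels: bytes) -> bytes:
    """Build an ArtDMX packet carrying up to 512 DMX channel levels."""
    header = b"Art-Net\x00"
    opcode = struct.pack("<H", 0x5000)        # OpDmx, low byte first per spec
    protver = struct.pack(">H", 14)           # protocol version, high byte first
    seq, phys = 0, 0                          # sequence disabled, physical port 0
    uni = struct.pack("<H", universe)         # SubUni then Net
    length = struct.pack(">H", len(channels))
    return header + opcode + protver + bytes([seq, phys]) + uni + length + channels

# Hypothetical: match a key light's intensity/RGB to the wall (channels 1-4).
dmx = bytes([255, 180, 120, 90]) + bytes(508)           # pad to a full universe
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(0, dmx), ("192.168.1.50", 6454))  # hypothetical node IP
```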

Game engines are being amped up with virtual production in mind. As well as the Virtual Film Tools created by Digital Monarch Media (now part of Unity Technologies), MiddleVR’s plug-in brings multi-display and cluster rendering to Unity, opening it up for use with LED walls.

The latest release of Unreal Engine (4.27) offers improvements to Unreal’s Remote Control and brain bar support for multiple users, motion-blur correction for travelling shots that accounts for a physical camera moving against a moving background, and higher in-camera resolution for the nDisplay rendering network, which outputs the Unreal Engine scene on multiple synchronised displays such as the LED screens of a wall or ‘CAVE’ volume. It also features optimisations for shooting with multiple cameras simultaneously, plus OpenColorIO colour management support in nDisplay to enable accurate colour calibration between content created in Unreal Engine and what the real-world camera sees on the LED volume.
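To illustrate what that calibration support means in practice: OpenColorIO lets a pipeline express “engine working space in, wall display space out” as named transforms shared by every tool on the show. Below is a minimal sketch using OCIO’s Python bindings; the config path and colour-space names are hypothetical and depend entirely on the production’s own config.

```python
import PyOpenColorIO as OCIO

# Load the show's colour config and build a working-space -> display transform.
config = OCIO.Config.CreateFromFile("studio_config.ocio")   # hypothetical path
processor = config.getProcessor("ACEScg", "Rec.709 - Display")
cpu = processor.getDefaultCPUProcessor()

mid_grey = [0.18, 0.18, 0.18]     # a scene-linear value rendered by the engine
print(cpu.applyRGB(mid_grey))     # the value the LED processor should be fed
```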


The proliferation of stages and more ambitious set-ups is giving writers, directors and production designers a bigger canvas to play with.

Unreal Engine is being incorporated into third-party software, such as the disguise xR (Extended Reality) virtual production platform, which enables video content to be displayed on the LED walls surrounding actors or performers in real time. Over 170 xR stages have been built around the world to support the demand for immersive real-time productions, and the company has recently updated the xR stage at its Los Angeles office. The stage doubles as an R&D centre and support resource for a growing wave of film, episodic TV and broadcast productions relying on scalable, photorealistic real-time content, according to the company.

This new dedicated space is equipped with a full cluster rendering setup with GPUs (disguise rx render nodes) driving uncompressed real-time scenes from Unreal Engine and other render engines. It features an LED wall and Black Marble floor from Roe Visual, Red Komodo and Blackmagic URSA cameras, Stype RedSpy, Ncam and Mo-Sys StarTracker camera tracking systems and BlackTrax tracking, disguise vx media servers as well as lighting from Litegear and Arri.

In Germany, Arri worked with Dark Ways, Studio Babelsberg, Faber AV and Framestore to set up Dark Bay, one of Europe’s largest permanently installed LED studios for virtual production.

The 7m-high LED volume, with a 55m circumference, comprises 1,470 Roe LED panels surrounding a revolving stage. The revolve removes some of the common physical limitations of shooting in an LED volume and can reduce or remove the time needed for a set rebuild: complete sets can be rotated 360 degrees in just three minutes, and entrances and exits can be simulated through a trapdoor built into the stage. A modular LED ceiling, enhanced by Arri SkyPanels, and an adjustable LED gate support any scenic set-up with interactive lighting and corresponding reflections. Seven permanently installed sprinkler nozzles form a rain rig for in-camera rain effects. The virtual studio is controlled via the brain bar, which comprises multiple high-performance workstations powering environments created in Unreal Engine, with tracking supplied by Vicon Motion Systems and TrackMen.

“It’s new technology so there are no ‘standards’ per se, everyone is exploring and working through all the permutations and issues,” Jean-Claude Deguara, Milk VFX

Broadcast and live production services giant NEP also moved into real-time virtual production for film and TV with the recent launch of NEP Virtual Studios, following the acquisition of expert VP companies Prysm Collective, Lux Machina and Halon Entertainment.

It’s not just films. WPP recently used drones to scan and capture a four-mile section of forest in the Scottish Highlands as a 15-billion-point mesh. It then used the power of the Microsoft Azure cloud and Nvidia’s Omniverse multi-GPU real-time platform to translate this into an incredibly detailed fly-through simulation of the physical landscape on an LED wall for use in automotive advertising.

Challenges and barriers

Don’t ditch the green screen yet, though, as there are a few hurdles to greater adoption. “How wonderful to be able to shoot in a desert without leaving the UK, particularly in the current situation with Covid! But [virtual production] is not without its limitations,” observes Deguara. “It’s still very early days and it’s new technology so there are no ‘standards’ per se, everyone is exploring and working through all the permutations and issues.”

“A seamless tech pipeline is very important – low-budget productions can’t afford time to experiment or do R&D; they need to know the tech is going to work and know that they can capture exactly what they need in the short time that they have,” agrees Graham. “Large amounts of data need to be processed for LED screens to power full Unreal scenes. Connectivity and data-handling is a considerable job in and of itself with virtual production.

“The pipeline also needs to be proved [to be] a more economical option than the equivalent options – if VP is more expensive than green screen, projection or shooting on location from end to end, then it will be hard for lower budget productions to adopt. The industry needs to see side-by-side comparisons of what a VP solution captured in camera looks like versus a green screen with VFX in post for example.”


Meadows: ’Like any technology, the hardware and software used for virtual production will continue to improve.’

Meadows agrees that refinement is required. “The virtual production technology we are currently using is great and it allows us to do some amazing things, but we would like to see improvement in making it more user-friendly on set,” he says. “It can be difficult for data to move across multiple machines. One of the benefits of using a big vendor like Digital Domain is that we have experts that can help bridge any gaps that come up, but for many users, the lack of integration makes it inaccessible. A new form of metadata that improves upon timecode would be a solid improvement. And, while it’s not unique to virtual production, better hardware with less latency would help improve the on-set experience as well.”

Graham says there is high interest in virtual production from creative teams and department heads. “A lot of the tech is new or being quickly moulded for TV and film production, and although the pipeline is established, it is often being used by creatives for the first time. There are, therefore, frequent technical challenges every time that creative ambition demands more from the virtual production pipeline. As it’s still in its relative infancy, we need to plan time for testing/fixing to be done in advance to make shoot days as efficient as possible.”

“Like any technology, the hardware and software used for virtual production will continue to improve, which means more productions will be able to adopt it,” Scott Meadows, Digital Domain

As well as the need for costs to come down, both for hardware and for operators and practitioners, the availability of crew, LED panels, and Unreal artists and developers is also a challenge to greater adoption. “LED stage demand is outstripping supply,” Graham says. “It can be a very expensive business putting an LED stage together, and as the rental of studio space is currently at a premium, LED shoots can often be too costly for a production to consider.”

“Training the next generation of craftspeople, as well as giving current practitioners experience, takes time,” he adds. “Earlier this year we partnered with ScreenSkills, DCMS and the wider industry to develop and set new standards on how we can better train the VP practitioners of tomorrow.”

Virtual future

“It’s still early days for most of us, but I can see [virtual production] becoming a regular staple of the VFX world, steadily growing more commonplace over the next year or two,” Deguara says.

“As the tech evolves, and as creative teams get more experience, it should allow faster, cheaper and more ambitious usage of the pipeline,” says Graham. “I can see creatives using game engines to visualise entire films or episodes before too long, and with the proliferation of stages and VP experience, more and more ambitious set-ups, giving writers, directors and production designers a much bigger canvas to play with.”

“Like any technology, the hardware and software used for virtual production will continue to improve, which means more productions will be able to adopt it; the industry as a whole will become better educated in the process and will continue to push the boundaries of what it can do and how it can be used,” agrees Meadows. “You’ll see the creation of new virtual art departments, teams learning to scout in VR using virtual cameras and several new visualisation tools that will soon become standard on VFX-heavy films. It wasn’t that long ago that previs was seen as a luxury, but now it’s just expected. Virtual production will evolve in a similar way and become an industry standard used on projects of all sizes.”