Virtual production technology has reached a new level of sophistication and many productions are now being written with a Volume in mind. However, it is not right for all stories and some tech gremlins remain, writes Adrian Pennington.
LED volumes for the production of in-camera visual effects (ICVFX) are now a maturing technology, according to a post-IBC show report by Futuresource Consulting.
The report also concludes that virtual production (VP) and extended reality (XR) have gone past the “hype phase” of their development, with the performance of systems now being refined and the technology becoming accessible to more users.
A key factor in this is increased ease of use, with installations no longer reliant on a technical team with specialised skills to operate the equipment. Manufacturers are now producing VP systems that are less challenging to run, with regular software and firmware upgrades enhancing the production of content on LED volumes.
ARRI Stage London is seeing this trend first hand. Filmmakers who have used an LED wall are returning with an understanding of what it can do and, significantly, are now working from scripts written for virtual production.
“Productions know they can save money using virtual production,” said Rob Payton, production specialist, ARRI Solutions. “The big shift in mindset is filmmakers are now thinking of the technology as a tool to create things they were previously unable to. The best results we see now stem from scripts written for the volume.”
The Chemical Brothers’ promo for their latest single ‘Live Again’ is a prime example. Produced by Outsider with directing duo dom&nic, the promo follows a dancer emerging from her trailer into a series of environments—with several scene transitions taking place live, within one continuous shot.
After the initial rush into VP and a period of “learning by doing”, confidence in the technology has grown. Instead of feeling confined in a tech-led space, Payton believes filmmakers now appreciate the LED volume for its creative freedom and production efficiency.
“There were some misconceptions that the whole look would be controlled by a VP supervisor and VFX,” he said. “But once cinematographers have experienced the workflow, they realise they retain complete creative control over the imagery.”
ICVFX: Virtual Production - Test and learn
Whilst some DPs might be on their second round of VP work, most filmmakers need to ‘try before they buy’ into the technology. It is why new VP spaces operated by Sony at Pinewood, by Anna Valley in West London and by kit hire firm CVP in Fitzrovia are primarily for training purposes. They give producers and DPs an idea of what to expect and let them test out scenarios.
“Systems are more interoperable with each other than they were a couple of years ago but issues do remain,” said Callum Buckley, Technical Consultant, CVP. “We can provide guidance to help users understand whether product ‘A’ works well with product ‘B’ and we can advise on whether their budget is being spent wisely.”
“We see it as a fantastic tool but it is not the be-all and end-all for a production,” said Anthony Gannon, COO of Garden Studios, which has hosted over 85 productions at its Volume stages, ranging from adverts to multi-cam drama. “It’s not as expensive as it was and the market has become more competitive.”
The rate card for a day on Garden’s Volume stage is around £10K which also includes the studio’s full service. It will advise on whether a project is actually more suitable for a conventional studio. “It’s important to understand what is realistically achievable,” said Gannon. “VP has to be fit for purpose.”
ICVFX: Virtual Production - Fit for purpose
Volume work is understood to be hugely beneficial for shooting scenes featuring moving cars where the virtual backgrounds and additional physical lighting can deliver realistic reflections on the car surface or interior.
Examples include the neon-drenched driving sequences in Marlowe (2022) and the equally neon-drenched car work in Blonde (2022), albeit shot in black and white. Mank, another black and white shoot, also employed virtual production for its driving scenes. However, the bulk of each of these films was shot on regular sound stages or on location.
If the project is VFX heavy, like the Chemical Brothers’ video, a Volume can work wonders by efficiently corralling all the VFX into one space where the director, DP and acting talent can see what they are producing, live.
With 75% of its action taking place in mid-air, the Apple TV+ mini-series Hijack made extensive use of VP. The drama was shot on four Volume stages in the UK, all operated by Lux Machina and featuring a combination of LED configurations. The show could arguably not have been made with the same degree of authenticity without VP.
By the same token, extensive use of a Volume brings its own aesthetic and, for all its undoubted realism advantage over green screen, it can’t yet replicate real-world situations as accurately as being there.
Some critics have noted the relative lack of scale in the lightsaber duels of recent Star Wars TV productions and chalk that up to limitations of the Wall.
Star Wars: Andor is an exception. Perhaps the best live-action Star Wars spin-off since The Mandalorian, it made a virtue of shooting conventionally rather than using ILM’s StageCraft volume.
“We definitely discussed [using virtual production] but decided it did not lend itself to what we were trying to do,” John Gilroy explained. “Virtual production frees you up in some ways and it limits you in others. In the production design and look we wanted to go more realistic and therefore to shoot in a more old-fashioned way.”
Cues to the look of Andor came from Star Wars: Rogue One, which Gilroy edited for director Gareth Edwards. The same lived-in look using real-world locations was prioritised by Edwards for his latest sci-fi feature The Creator.
“There was a little bit of Volume work at Pinewood but very low,” Edwards admitted [as quoted in this Frame Voyager video]. “And if you do the maths, if you keep the crew small enough, the theory was that with the cost of building a [single, conventional] set which is typically like $200,000 - you can fly everyone anywhere in the world for that kind of money.”
Numerous VFX houses were contracted to work on The Creator, including ILM, Unit, Folks VFX and Outpost VFX, using a method that flipped recent VFX workflows on their head. Edwards shot the material first on location, then retrofitted backgrounds, sci-fi craft and AI robots, a more organic and less expensive method of production.
“We’re not saying that there wasn’t work done with the backgrounds, but VFX are not having to fully digitally recreate every scene,” he said. “This allowed more effort to be put into making the robots of this world feel even more real because locations, props and characters are already in the shot. Often there was no additional work or relatively minimal labour needed in finalising a character or environment.”
Production wasn’t all plane sailing on Hijack. Some issues with lighting in the Volume needed finessing in post, according to VFX Supervisor Steve Begg.
“Being a TV show, it was shot in a mad hurry (i.e. with no testing time) and although everyone was initially quite happy with the results, after closer scrutiny we saw all sorts of issues that needed fixing. Lots!”
Scenes featuring a Eurofighter cockpit in the 270-degree Volume on a motion base proved most problematic.
“I’d anticipated we’d have problems with the reflections in the visors so we had them high-rez scanned in order to get a good track and replace everything in them with CG versions of the cockpit, the pilots’ arms and sky,” Begg explained. “The moment the reflections were sorted the shots really started to come together, with added high-speed vapor on top of the Volume sky along with a high-frequency shake. I stopped them doing these in-camera as I had a feeling we’d be fixing these shots, big time. If we had, they would have been a bigger nightmare to track.”
That said, Hijack is an ambitious series that would probably not have been attempted a couple of years ago and was made by filmmakers pushing the tech to its limits.
ICVFX: Virtual Production - Lighting Evolution
Achieving accurate colour, especially for skin tones, is a problem that bedevils cinematographers working in Volumes but manufacturers are coming up with solutions to tackle it.
Most LED displays follow the standard RGB format, but new panels are being released that incorporate a fourth, white LED emitter. ROE has one such product called RGBW, which it claims offers greater colour accuracy and minimises the amount of colour correction required in post.
A rival technology was launched by INFiLED at IBC. Branded ‘Infinite Colors’ the innovation is said to improve a variety of LED applications by allowing full variations in tone, saturation, and colour appearance in white light and custom colours.
LED lighting manufacturers are also moving in this direction. The MIMIK tiles from Kinoflo comprise two additional white emitters as well as RGB and are typically used as a reflective surface, with the key difference that you can also play back video content through them for real-life reflections. CVP has a set at its Great Titchfield test facility.
“Traditional LED panels won’t light skin tones well; their Colour Rendering Index is basically zero in the pinks, whereas with Kinoflo you can light costume and skin tone correctly because they afford a much more balanced colour profile,” said Buckley. “The difference is staggering when you see it in the flesh.”
Another downside of a Volume is that it can’t yet mimic the sun as a light source, even if the rendered image looks the part.
“It’s not hard light, it’s not a million miles away. It’s not a point source,” said ROE Visual R&D Manager Tucker Downs in a tutorial on colour science. “And if you’re thinking about how a streetlight casts light down, you have a shadow, well, you can’t recreate that in virtual production. A Volume can only really provide the global illumination and indirect illumination.
“Could we get a fully spectrally managed colour pipeline and virtual production to do spectral rendering with distributed rendering technologies? I think these are the things that you can look forward to and we should push for in the next ten years.”
ICVFX: Virtual Production - GenAI
AI-driven software Cuebric enables filmmakers to dream up camera-ready media in seconds, for playback in a Volume.
“Five percent to 20% of budgets today goes into reshoots because even when you’re working with green screen or virtual production, reshoots are often the only way to achieve the director’s intent,” said Pinar Seyhan Demirdag, Co-Founder of Cuebric developer Seyhan Lee.
The company calculates that, for a medium-size picture, reshoots cost a production $375,000 on average. For bigger-budget shows, that rises to an astonishing $24 million.
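As a back-of-envelope check of the figures quoted above (our illustration, not a calculation from Seyhan Lee), the $375,000 average can be read against the 5% to 20% budget share to infer the size of picture those numbers imply:

```python
# Quoted figures: reshoots average $375,000 and consume 5%-20% of the budget
# on a medium-size picture. The implied budget range below is our own
# back-of-envelope inference, not a figure stated in the article.
avg_reshoot_cost = 375_000
share_low, share_high = 0.05, 0.20

# If reshoots eat 20% of the budget, the budget is small; at 5%, it is larger.
implied_budget_low = avg_reshoot_cost / share_high   # reshoots at 20% of budget
implied_budget_high = avg_reshoot_cost / share_low   # reshoots at 5% of budget

print(f"Implied budget range: ${implied_budget_low:,.0f} to ${implied_budget_high:,.0f}")
# i.e. roughly a $1.9M to $7.5M production
```

That range sits comfortably within what the industry would call a medium-size picture, which suggests the quoted numbers are internally consistent.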
“If we were to save the industry even a fraction of that using AI it could help funnel those funds back into creativity and cut out unnecessary labour,” Demirdag said.
Further out, generative adversarial networks (GANs) could be used to achieve higher quality footage. French post-production giant Technicolor is testing this and is also using ChatGPT for concept ideation.
Technicolor’s labs are testing machine learning to standardise more repetitive tasks like rotoscoping, and exploring how AI tools like ChatGPT, Cuebric and Nvidia’s Omniverse might apply to game engines and virtual production.
“We’re using Stable Diffusion and Midjourney quite a bit for content creation, generation and ideation moments,” explained Mariana Acosta, SVP, Global Virtual Production and On-set Services, Technicolor. “We’ve created a few TV spots that actually use Runway to generate backgrounds, including one for Salesforce which very much played into the AI aesthetic.”