Virtual production is one of the most talked about areas in film and television. Steve Jarratt discusses its potential with DNEG’s Steve Griffith and VFX supervisor Kevin Baillie.
Virtual production can be defined in various ways, depending on the technology used and the project in question, but in general it’s the combination of physical and digital assets in a real-time filmmaking environment. It ranges from motion capture stages with real-time feedback of digital doubles and virtual worlds, to the latest advances in LED staging, with camera moves synced to computer-generated sets.
And while virtual production is currently the big buzzword in content creation, Steve Griffith, Executive Producer of the virtual production division at DNEG, is keen to stress that it’s just a visual effects process; another tool in the filmmakers’ arsenal. “In-camera compositing is really the best thing to call it,” he says.
Griffith explains that virtual production can offer a range of benefits, alongside obvious things like cost reduction and minimising carbon footprints. “The way that we’ve been trying to describe this to filmmakers is that we’re now just additional tools for your production designer, your director, your DP and your visual effects department.
“So for a DP, we become a lighting tool. For a production designer, we become an extension of the set pieces that they create. And we can, in some ways, compress the visual effects schedule, and provide marketing and editorial with all of that material much further up ahead.”
Indeed, Griffith believes that marketing is one of the biggest beneficiaries of virtual production technology. Studios want to release trailers and begin building interest, and the sooner the marketing departments can start putting material out there, the better – especially for visual effects-intensive movies with long post-production schedules.
The use of virtual production tools can eliminate the temp shots that VFX vendors have to rush through, and avoid the pain of showing execs incomplete green screen tests. “The more you can fill that edit, and level it up with content that’s more accurate to what your film is going to be… that is a huge event.
“We’re having a hard time putting real dollar signs to what that means, but I think the intangible benefits will start to be seen more and more, in terms of getting exposure of your film, and getting real decisions to be made about your films earlier on.”
Unreal Engine power
The advent of game engines in the generation of real-time visuals has been transformational in virtual production; initially as a means of viewing and interacting with mocap performances, and latterly for the creation of real-time digital sets. But the current challenge is to have those visuals fully ray-traced, which avoids the need for texture baking while offering the ability to change lighting on the fly.
“The implications across virtual production and VFX are huge,” says Griffith, “and everyone’s clamouring to get that pipeline going. It’s hard, because in some ways, it’s still not quite there yet. Organic elements like trees and creatures and crowds and things like that are still a ways off, but we’re so close. We’ve been wanting this real-time solution for so long in our industry, and it’s exciting to be at that stage where we’re testing it out.”
For Griffith, the next steps involve a combination of more powerful GPUs for real-time ray tracing, and the ability to run live simulations – cloth, hair, fluid dynamics. “Once we can start doing those in-engine, and the hardware supports it, that to me is when we win. I still think there’ll probably be some compositing elements to all this, but then the compositing tools will change to fit the medium too.
“And then I think that there’s work to be done on how we capture data. Because in the traditional visual effects process, you’re rendering out shots, you have all these versions… Whereas in a real-time workflow, we’re going to need to totally tweak our process in terms of how things are approved, and how we move through the iterative process.”
“ILM did a great job of promoting and marketing its [StageCraft] technology. But in some cases, it was over-marketed. People think it’s this magic button.” - Steve Griffith
While virtual production is largely defined by the technologies that power it, Griffith suggests that some of the current pressure points are less about the performance of the technology and more about the physical aspects of the volume itself.
Take the LED panels used in virtual production sets, where the obvious improvements are brighter screens and higher pixel density. But Griffith argues, “You never really end up turning the screens all the way up, because of the sensitivity of the camera sensor. And the camera’s always going to be away from the LED screen to a certain extent.
“So, everyone keeps saying, ‘Let’s go find a tighter pixel pitch.’ But now you’re having to output more resolution to the screens, and the issue isn’t that we want to put cameras closer to the screen; the issue is we want to have cameras further away, because humans are a certain size, cars are a certain size. There’s always going to be the limitation of how large the volume can be, the limitations of where you can move your camera, how much you can integrate the environments, and sometimes how long it takes to build these environments.”
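As a back-of-envelope illustration of Griffith’s point (not from the article): tightening the pixel pitch multiplies the resolution a volume has to drive, while the practical constraint runs the other way – keeping the camera far enough back. The figures and the “pitch in millimetres ≈ minimum distance in metres” rule of thumb below are assumptions for illustration only.

```python
def wall_resolution(width_m: float, height_m: float, pitch_mm: float) -> tuple[int, int]:
    """Pixels needed to drive an LED wall of the given size at a given pixel pitch."""
    px_per_m = 1000.0 / pitch_mm  # one pixel every `pitch_mm` millimetres
    return round(width_m * px_per_m), round(height_m * px_per_m)


def min_camera_distance_m(pitch_mm: float) -> float:
    """Common rule-of-thumb minimum camera distance before individual LEDs resolve."""
    return pitch_mm  # approximation: distance in metres ~ pitch in millimetres


# A hypothetical 20m x 6m wall: a tighter pitch roughly doubles the pixels
# to render in each dimension, without letting the camera get meaningfully closer.
print(wall_resolution(20, 6, 2.8))  # (7143, 2143)
print(wall_resolution(20, 6, 1.5))  # (13333, 4000)
```

Halving the pitch quadruples the total pixel count the engine must output in real time, which is why Griffith argues that finer screens don’t by themselves solve the problem of volume size and camera placement.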
Ultimately, Griffith sees the worlds of virtual production and VFX beginning to coalesce, with video game technologies and tools merging with the techniques that visual effects have been using for years.
“The biggest difference between the industries is that one was non-photorealistic and one was photorealistic. But when the technology isn’t the limitation, then storytellers are going to be able to say: either you’re driving the story, or I’m telling the story, but the medium doesn’t really matter any more. Currently we have this nine-month production schedule, because it has to go linear from assets to rendering and compositing… At some point, that will all be one step. And you just tell your story through a particular camera or cameras.”
Bridging the gap between physical and virtual
“For me, the definition of virtual production is anything that uses real-time technology in a filmmaking process,” says Kevin Baillie, VFX supervisor and second unit director. “So it’s not just LED walls, it’s not just SimulCam… It’s all of these things. And I think Unreal and Unity, the game engines really improving in terms of quality, have allowed for real-time technology to bridge the gap between [digital and] the live action departments that have over a century of culture and experience and knowledge baked into them.”
Baillie believes that advances in virtual production have been following two separate arcs: quality and accessibility. For the latter, he remarks how previously a studio would need to mount a massive effort in terms of headcount and technology development to even start thinking about doing virtual production. “Now, the commercially available tools are so good and so robust that you can have a team of one running a virtual production setup. You can start lean from a team size perspective, and then ramp up as need be.”
For the quality aspect, Baillie refers to his recent work on Disney’s upcoming Pinocchio, for which they shot and cut together almost the entire movie in Unreal before a single frame of live action was shot, or a single set was built. The use of ray tracing enabled them to see the shadows and reflections in an interactive way and act instantly on that feedback.
“Instead of having that video game-looking [imagery], like we were forced into having before this technology, it’s now like what would happen in the real world. And that’s useful from a technical perspective, because if you design a set, and you want the light to interact with it in a certain way, and it looks that way in Unreal, you can say, yeah, it’s going to work more or less that way when I build it for real. So, it’s useful for cinematographers, production designers, and so on.”
This new-found degree of realism also helps from a creative standpoint too, with assets that artists can relate to visually and emotionally. “That level of creative engagement is something that really only comes with quality,” says Baillie. “And that has enabled departments – cinematography, production, design, even costume design – to be invited into the process, because now they can look at the screen and really relate to what they see.
“So, I think it’s more than just pretty pixels; it turns it from a useful novelty tool into a meaningful source of information and interaction point for people that have never really known about or understood or interacted with the digital process before.”
“One of the biggest bottlenecks in virtual production right now is the people to do it.” - Kevin Baillie
As the field of virtual production improves in speed and quality, Baillie foresees a time when you could be tweaking the edit right up until the soundtrack is finalised, which would be both transformative and liberating for the right kind of filmmaker.
“I think quality, speed, flexibility, those are all things that, as they improve, you’ll see the world get closer and closer to the point where a lot of filmmaking started, where it’s the director and a very small team collaborating together on a project. And the filmmaker is getting closer and closer to being able to touch the final pixels on the screen, directly interacting with them. Which, in the world of traditional visual effects for the last 30-40 years, has very much not been the case. It’s been an obscured process, hidden behind a big black curtain of visual effects companies. And I think that is changing and will continue to change quite rapidly.”
Naturally, as hardware improves, so the software becomes more capable, producing visuals with ever-increasing fidelity. “And the better the technology gets, the more real-time is actually going to be real-time, and the better things are going to look,” says Baillie. “It all goes hand in hand with itself. And another thing that is going to transform virtual production even further is the implementation of artificial intelligence in two parts of the process.
“First, it’s going to come in terms of things like AI de-noising that helps to make a fast, low quality render look better and help with interaction. But as we continue to delve further into it, these sorts of experimental AI tools that take a good video game-looking thing and make it look like a photograph, or that transform a rough visual or an input of text into a beautiful image – those are going to start being integrated into the process as well. And I think that will result in things that we can’t even imagine right now.
“Ultimately, I think that the closer that we can get to shooting actors on a holodeck – for lack of a better word – while retaining flexibility in post… that’s the dream.”
Education, education, education
While the future of virtual production will always be tied to advances in hardware and software, with bigger and brighter LED screens, more capable GPUs and AI-driven processes, in the short term the biggest advance will be in educating filmmakers in how to operate these systems, and film executives in how best to utilise the benefits on offer.
One of the biggest hurdles Steve Griffith and his team have had to overcome is the necessary shift in mindset when employing a virtual production pipeline. “The entire group of filmmakers, the core decision-making group, which is your line producer, your executive producer, your DP, director, production designer and your VFX team, that group needs to completely buy in. Because weeks before you shoot, you’re making decisions.
“[Currently] that’s not how movies are made: we figure it out as we go, then we see a first cut, then we make changes, and then if we have enough money, we reshoot entire scenes. Whereas with virtual production, you’re saying, ‘These are the environments I want; I’m going to be doing it on this stage, with these walls; I’m committed to these decisions, and that’s what we’re going to do.’ It’s premeditated. And so if you don’t have a production team, or a filmmaking team that’s bought into that methodology, then it doesn’t work very well. And people have really terrible experiences with virtual production, because they just haven’t committed to what needs to be done.”
Griffith suggests that the biggest bottleneck is in education: training people in how to use the technology, and ensuring that the people who want it understand how best to employ it. “A lot of what we’re doing is evangelism and education,” he says. “ILM did a great job of promoting and marketing its [StageCraft] technology. But in some cases, it was over-marketed. People think it’s this magic button. And so we’re trying to clarify things with our clients. I think, in a year or two from now, when there are DPs and production designers and directors that understand it better, that’s when we’ll really have an effective use of the technology. And hopefully then the actual machine and software side has caught up and given us some more power.”
Kevin Baillie agrees: “I think the future of virtual production is more or less an extrapolation of what’s been happening over the last three or four years, where improved access to it will be really transformative for commercials, small productions, up to the very largest ones. And part of the reason why it will be transformative is because I think one of the biggest bottlenecks in virtual production right now is the people to do it. It’s so new, it feels like computer graphics did in the late ’80s, early ’90s, where everything that’s done is this new ‘Aha!’ moment, this new exciting discovery.”
He refers to Light & Magic, the Disney+ documentary about the history and evolution of ILM. “You look at that, and it was very clear none of those people knew what the hell they were doing! They were all just figuring it out and having a great time as they went along, breaking new ground. And I think we’re in the same place with virtual production right now, where on every project – certainly the ones I’ve used virtual production on – it’s been, ‘Alright, let’s figure out how to do this the best way possible from the ground up.’”
Baillie remarks that the pace of technological advancement means that very little is re-used from project to project, with each new year offering fresh opportunities for improvement. And, as a consequence, no-one can rightly claim to be the ultimate authority on how to do virtual production.
“We’re all just still figuring it out,” he adds. “And there are so few people who have done it before and who are true experts in the field. We need more access to create the talent pool in order to really fulfil its potential. So I think access and talent pool are things that will probably make a bigger difference than anything else in the future.”