It’s rare that something can truly be said to represent a revolutionary approach to filmmaking. Attendees of a jaw-dropping Big Screen session at IBC2018 got to see for themselves how game technology is fundamentally changing the way films are made.

Virtual production, championed over the past ten years by the likes of James Cameron and Robert Zemeckis, is a truly seismic movement.

The Jungle Book: First film to deploy a game engine

It puts the tools of VFX and pre-viz artists in the hands of directors and DPs, allowing them to visualise and interact with CG sets and elements during shooting.

The next great leap forward is interactivity, using game technology for rapid development and a far greater ability for decisions on set to change the production environment in real-time.

Game on
Habib Zargarpour, CCO and co-owner of Digital Monarch Media, responsible for VFX on films such as Blade Runner 2049 and The Perfect Storm, is one of the pioneers of this new advance: he was the first to deploy a game engine on a film, with The Jungle Book (2016).

“The biggest change [since Hugo or Avatar] has been in the area of real-time engines bringing in higher visual quality and lighting, and increased complexity possible with things like facial tracking,” he says.

The big leap has come with the increase in power of game engines and their use in VFX software development. “It wasn’t until VR came about that the VFX facilities suddenly started to look into them. Now, when I approach a project, at least there is some awareness.”

All Together Now
Adam Myhill, Head of Cinematics at Unity Technologies, was on stage with Zargarpour at IBC, co-presenting the session ‘Revolutionising the World of Film in Real Time’.

“On a real film set, the director can touch everything, saying ‘move the light’, telling the actors where to stand, and so on, but in CG, it’s been so laborious, so disciplined – you’ve got the lighting department and animation and everyone in these silos,” says Myhill. “It all gets rendered out – it was 90 hours a frame for Avatar and crazy numbers like that. Then it all comes together, but people only get to see it so far into production.

“With a real-time engine, you see everything from day one,” he continues. “It’s not this sequential pipeline – everyone is working at the same time. The director has every department at their fingertips. They can move an actor or the camera or make the sun brighter – anything they want to do, it’s all right there.”
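
To make that contrast concrete, here is a minimal Python sketch of the idea (not Unity code; the Scene, render_frame and realtime_loop names are all hypothetical): in a real-time loop, any edit a director makes is applied and visible on the very next frame, rather than submitted to a farm and checked later.

```python
import time
from dataclasses import dataclass

@dataclass
class Scene:
    sun_brightness: float = 1.0
    actor_position: tuple = (0.0, 0.0, 0.0)

def render_frame(scene: Scene) -> str:
    # Stand-in for the renderer: in an engine this would draw the frame.
    return f"frame(sun={scene.sun_brightness:.1f}, actor={scene.actor_position})"

def realtime_loop(scene: Scene, edits_by_frame: dict, frames: int = 3) -> None:
    """Apply any pending director edits, then render immediately, every frame."""
    for frame in range(frames):
        for edit in edits_by_frame.pop(frame, []):
            edit(scene)                        # e.g. "make the sun brighter"
        print(frame, render_frame(scene))      # visible right away, no overnight render
        time.sleep(1 / 24)                     # nominal 24 fps frame budget

# Usage: the director brightens the sun at frame 1 and sees it at frame 1.
edits = {1: [lambda s: setattr(s, "sun_brightness", 2.0)]}
realtime_loop(Scene(), edits)
```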

Unity’s toolset includes Cinemachine, a modular set of camera tools that Myhill invented. “On a film set, the camera operator has a relationship with the subject,” he says. “The camera follows the subject but there’s a little bit of lag, because the camera doesn’t know exactly what [the subject] is going to do. That [provides] the texture and the believability of a live, real camera.

“In CG, [generally] there’s no relationship between the camera and subject. People are just keyframing the camera, and it’s looking at the subject, but if that thing moves, the camera doesn’t know. There’s no relationship. With Cinemachine the camera knows what we’re trying to do. So, if that thing moves you still get the shot. You get back the relationship between the camera and the subject.”
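
That camera-and-subject relationship boils down to a damped follow. The snippet below is an illustrative Python sketch of the idea, not Cinemachine’s actual API: each frame the camera eases part of the way toward the subject, so it lags believably and still gets the shot when the subject changes its path.

```python
def damped_follow(camera_pos, subject_pos, damping, dt):
    """Move the camera a fraction of the way toward the subject each frame.

    Larger `damping` means more lag, which gives the "texture" of a live operator.
    """
    # Fraction of the remaining distance to close this frame (exponential ease).
    blend = (1.0 - pow(0.01, dt / damping)) if damping > 0 else 1.0
    return tuple(c + (s - c) * blend for c, s in zip(camera_pos, subject_pos))

# Usage: the subject changes direction mid-shot; the camera still tracks it,
# trailing slightly behind, instead of playing back a now-wrong keyframe.
camera = (0.0, 0.0, 0.0)
subject_path = [(1, 0, 0), (2, 0, 0), (2, 0, 1), (2, 0, 2)]  # turns partway through
for subject in subject_path:
    camera = damped_follow(camera, subject, damping=0.5, dt=1 / 24)
    print(f"subject={subject}  camera={tuple(round(c, 2) for c in camera)}")
```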

Myhill says that filmmakers can experiment - and can make all their mistakes - on a set built and lit completely in Unity. “And multiple people can get together - one with a Vive on, one with an iPad, and all in the same space together, it literally is a virtual set,” he says.

“It opens a whole world where they can see things, understand the placement of the sets and how lighting affects the scene,” agrees Zargarpour. “DPs can implement the lighting the way they want it to be interactively, while the director or production designer collaborates, as on a live-action set. It’s a much more creative process.”

Heroic endeavours
When we met at IBC, Myhill’s team at Unity had just finished a project with Disney Television Animation – Baymax Dreams. “It was three two-minute episodes for a show that they have based on Big Hero 6. We just finished it, just days ago,” he says. “The actual rendered frames from Unity are what’s on the show. There’s no post, no comp, no colour grading after the event – it’s the literal frames that are rendered out of Unity that’s on TV.”

Video: Behind the Scenes on “Baymax Dreams” (Made with Unity for Disney Television Animation). Source: YouTube / Unity

Myhill puts the increase in speed down to the advances in graphics cards, as well as advances in Unity.

“We rendered that whole show with the new renderer in Unity, the HDRP [High Definition Render Pipeline],” says Myhill. “Baymax Dreams was one of the first big projects using it. The director [Simon J. Smith] was saying crazy things, like, ‘this is the first time that I’ve gotten close to my characters in CG’.

“Unity made a short film, called Adam, that garnered a lot of interest. We had Neill Blomkamp direct two sequels to it for us,” continues Myhill. “We saw an increase in filmmakers at our last Unite event in Europe; they see the real-time revolution coming. We have filmmakers coming into it in two categories: the first is ‘I’m going to do my whole show – it’s all CG and I don’t want to render it out with a render farm and rent all these machines’.

“The other [scenario] is ‘I want to do virtual production’, like Habib does. They used it on The Jungle Book, where Jon Favreau could be like ‘I want more tigers to the left’, or ‘move the sun over there’. We had CG lighters working on the Adam sequels from Neill Blomkamp’s studio who said: ‘I feel like I’m cheating’. When you move a light, there’s no need to do a test render, submit it and check it tomorrow. It’s not like that anymore - you just move the light!”

Blade Runner 2049

“More and more complex assets can now be used in engines, approaching or in some cases identical to the final assets used by the VFX teams,” says Zargarpour. “We have had several such cases. On Blade Runner 2049 we used the actual Spinner used by Paul Lambert’s team at DNeg.”

At the IBC2018 session, Zargarpour and Myhill demonstrated how such assets could be controlled in real-time to block out a series of camera moves, with impressive results.

“On Blade Runner 2049, the director Denis Villeneuve was able to make use of our system to re-imagine some shots and create new ones,” Zargarpour reveals. “In one case he flew K’s Spinner to perform a flight path through the Trash Mesa ships and by doing so realised the move wasn’t humanly possible due to the tight turns. So, he came up with a different path.

“He then performed the camera move while connected to the Spinner seat [inside the CG model]. The shot has a great sense of liveliness to it. It was amazing to work with him and John Nelson on that project and help make the film better. Any time a director or DP can directly control a shot you will get authentic results.”

Games Without Frontiers
“If you’ve never done anything in 3D before and you download Unity it’s a big programme, there are a lot of knobs and switches,” says Myhill. “There’s a vernacular, it helps to know what a model is, or a texture, or a shader. But if you’ve done all that, or a bit of that stuff, there’s a step [to take] and then we think you’re off and running. We’re working right now to make that first step as low as possible.”

Zargarpour says things in this space are evolving very fast. “We’ve been switching devices and tech every six months,” he says. “In 2010, while at Microsoft Xbox, we built a real-time GPU ray tracer using 32 Nvidia graphics cards. This is now possible with only one Nvidia card. We used to need a mocap stage or optical tracking cameras and now we just use depth sensors and AR-enabled sensors to track cameras. The software side has had just as much impact on the process, with AR/VR opening up that area through research and investment.

“I believe this direction will change many of the aspects of film and virtual production, blurring the line between real and virtual sets in real time,” he concludes. “We’ve known that was possible and it’s now very accessible. I am proving this myself on a feature called Squadron. These are very exciting times!”