The animated remake of the Disney classic employed a live-action film crew that worked inside virtual reality, using traditional camera equipment to set up and execute shots in the animated world just as they would in the real world.
For all its pioneering virtual production, the biggest breakthrough in The Lion King is its dedication to putting old-fashioned filmmaking front and centre of the creative process.
“You can’t improve on 100 years of filming,” says Rob Legato, the production’s VFX Supervisor. “You start with the actors, you block out the scene, you select the lens, you compose and light the shot. If you shortcut that, it no longer feels like a movie.”
The Lion King, which will feature on the Big Screen at IBC2019, is, of course, Disney’s remake of its 1994 animated smash and also a follow-up to The Jungle Book. It reunites director Jon Favreau with the VFX team led by Legato and MPC VFX supervisor Adam Valdez.
Indeed, for Technicolor-owned MPC, the story began as early as October 2016, while the studio was still wrapping up work on The Jungle Book campaign and months before Legato and Valdez won the Best VFX Oscar. They began discussing how the pipeline and methodology could continue to evolve from The Jungle Book to take their next project to yet another level.
Unlike The Jungle Book, where actor Neel Sethi (Mowgli) was composited into photoreal CG backgrounds with equally life-like animated animals, this time the entire film would be generated in a computer but shot with all the qualities of a David Attenborough nature documentary.
“Once the decision was taken to treat the movie as if it were live action, it has to be filmed with a live action intent,” Legato explains to IBC365. “That means there are creative choices that you make only in analogue. You don’t make them frame by frame, you make them by looking at it and changing your mind, on the spur of the moment, in response to what is happening in front of you. A live action movie is the sum total of the artistic choices of a director, a cinematographer, an editor and many more. You have to find a way of recreating those on a virtual stage.”
Having studied at film school and served as a VFX supervisor, VFX director of photography and second unit director on films like Scorsese’s The Aviator, The Departed and Shutter Island; and Robert Zemeckis’ What Lies Beneath and Cast Away, Legato observes that his best work has not been about creating fantastic worlds but about believable ones.
“From the Titanic sinking to Apollo 13 launching, any success I have had is about trying to fool the eye into believing something that looks like any other part of the movie.”
Cinematographer Caleb Deschanel earned his spurs in director Francis Coppola’s camera department on The Godfather and Apocalypse Now and has six Oscar nominations to his name, including for The Right Stuff, The Natural and The Passion of the Christ. He worked with Legato on Titanic, for which Legato also won an Oscar.
“Caleb is a fabulous artist but he has no experience of digital translation so my job was to navigate the mechanics of this for him,” explains Legato, who will speak alongside Deschanel at this year’s IBC conference.
Essentially that meant providing Deschanel with an interface between the virtual world and the tools of conventional filmmaking, so that the DP could call and operate a shot just as he would on any other movie.
“We don’t want to throw him to the wolves just because he doesn’t come from a digital background,” Legato says. “His intuition about what makes a shot work is essential so we did everything in our power to help his ideas translate.”
That’s not to suggest that the virtual production techniques advanced for The Lion King were designed solely for Deschanel.
“What we’re trying to do is tap into our gut-level response to things, our instantaneous art choice – what happens if I pan over here, and then if I move a light over this way, and put that rock there. Everything is designed to be done in real time, instinctively, instead of overthought and intellectualised.”
The key production advance to achieve this is VR. Trialled on The Jungle Book, its extensive use here enabled the filmmakers to collaborate on shooting the movie at nearly every stage as if it were live action.
Where Avatar broke ground by giving the filmmakers a window on the VFX world — they could see the CG environment in real time during production as if they were looking at it through the camera’s viewfinder — The Lion King inverts that idea by putting the filmmakers and their gear inside a game engine that renders the world of the film.
“VR allows you to walk around the CG world like you would on a real set and put the camera where you want.” Rob Legato, VFX supervisor
If that concept sounds a little hard to grasp, Legato explains that it’s not as sophisticated as it sounds.
“VR allows you to walk around the CG world like you would on a real set and put the camera where you want. The physicality of it helps you to psychologically root yourself. You know where the light goes, you know where the camera goes, you know where the actors are – and all of a sudden you start doing very natural camera work because you’re in an environment that you’re familiar with.”
On a virtual stage dubbed the Volume in Playa Vista, LA, Favreau and his crew donned HTC Vive headsets to view 360-degree pre-built panoramas and pre-visualised animation.
Camera moves were choreographed using modified camera gear – cranes, dollies, Steadicam (even a virtual helicopter, operated by Favreau himself) – to allow the filmmakers to ‘touch’ their equipment with the motion tracked by sensors on the stage ceiling and simulated directly within the virtual world.
Effectively, they were making a rough version of the movie in real-time with graphics rendered using a customised version of the Unity game engine.
“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” says Legato. “If I want to dolly track from this rock to that tree, the dolly has real grip and inertia and a pan and tilt wheel which is sending data back to the virtual environment. It’s not a facsimile. In that way you retain the imperfections, the accidents, the little idiosyncrasies that make human creative choices but which would never occur to you if you made it perfectly in digital.”
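None of the rig specifics are public, but the loop Legato describes – encoders on real wheels and dollies streaming data into the engine – can be sketched in miniature. All names, resolutions and track coordinates below are illustrative assumptions, not production values:

```python
import math

# Hypothetical sketch: map physical rig encoder readings to a virtual camera pose.
TICKS_PER_REV = 4096            # assumed encoder resolution of a pan/tilt wheel
TRACK_START = (0.0, 0.0, 0.0)   # virtual dolly track endpoints, in metres
TRACK_END = (10.0, 0.0, 4.0)    # e.g. "from this rock to that tree"

def wheel_to_degrees(ticks, gear_ratio=1.0):
    """Convert raw encoder ticks from a pan or tilt wheel to degrees of rotation."""
    return (ticks / TICKS_PER_REV) * 360.0 * gear_ratio

def dolly_position(t):
    """Interpolate the camera position along the virtual track, t in [0, 1]."""
    t = max(0.0, min(1.0, t))
    return tuple(a + (b - a) * t for a, b in zip(TRACK_START, TRACK_END))

# One frame of simulated rig data: half a wheel turn, dolly 30% along the track.
pan = wheel_to_degrees(2048)    # -> 180.0 degrees
pos = dolly_position(0.3)       # -> (3.0, 0.0, 1.2)
```

Because the pose is derived each frame from the physical hardware, the operator's hesitations and corrections – the "imperfections" Legato values – pass straight through into the recorded camera move.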
In other words, the storytelling instincts of artists with decades of experience making features are absolutely at the heart of this virtual production process.
“An amateur director can still cut shots together, but if they were given The Godfather to make then they would make a very different movie. A Martin Scorsese directed film is very specific to his point of view. It is not arbitrary, it is very serious art. And that is what we are trying to do here.”
Pre-building the virtual world
For any of this to work there was a major first stage: pre-building the virtual environment.
The VFX team and core crew including production designer James Chinlund spent two weeks on safari in Kenya photographing vistas and data capturing foliage, the different species of plants and trees, and various lighting environments. They photographed 360-degree HDR images of the sky and the sun and built a huge library of these high-resolution images.
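Those 360-degree HDR skies typically feed image-based lighting back in the CG world. One common step – an illustration of the general technique, not necessarily MPC's exact pipeline – is locating the sun in an equirectangular image and converting that pixel to a world-space light direction:

```python
import math

def equirect_to_direction(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit direction vector.
    u spans longitude [0, 2*pi); v spans latitude from +pi/2 (up) to -pi/2 (down)."""
    lon = (u / width) * 2.0 * math.pi
    lat = math.pi / 2.0 - (v / height) * math.pi
    return (math.cos(lat) * math.sin(lon),  # x
            math.sin(lat),                  # y (up)
            math.cos(lat) * math.cos(lon))  # z

def sun_direction(pixels):
    """Find the brightest pixel (our stand-in for the sun) and return its direction."""
    height, width = len(pixels), len(pixels[0])
    v, u = max(((r, c) for r in range(height) for c in range(width)),
               key=lambda rc: pixels[rc[0]][rc[1]])
    return equirect_to_direction(u + 0.5, v + 0.5, width, height)

# Synthetic 4x8 "sky" standing in for a real HDR capture: one hot pixel is the sun.
sky = [[0.1] * 8 for _ in range(4)]
sky[0][2] = 50.0  # sun high in the sky
direction = sun_direction(sky)
```

A real pipeline would of course light from the full HDR image rather than a single sample, but the pixel-to-direction mapping is the same.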
Back at MPC, the landscapes were modelled in Maya and, using a bespoke asset management system, integrated into Unity.
Working within the VR world, Favreau and Deschanel were able to explore the locations, effectively scouting for the places to shoot each scene.
Next, Favreau and animation supervisor Andy Jones would work out the mechanics of a scene – the rough blocking and approximate animation, again in VR.
“It just kept evolving and iterating until we get to something that we like,” says Legato. “We could all walk around in the virtual world together, and see things for the first time, look at things from different angles. You could be miles apart in the VR world but three feet apart on the stage, so we could talk to each other and say ‘Why not take a look at what I am seeing?’ and we could all snap to that point of view.
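The “snap to their point of view” feature Legato describes amounts to copying one participant’s pose in the shared virtual world onto another’s. A minimal sketch, with hypothetical names and values:

```python
from dataclasses import dataclass

# Hypothetical sketch of a shared-VR "snap to point of view" feature.
# Each headset has a pose in the common virtual world; snapping copies it.

@dataclass
class Pose:
    position: tuple    # (x, y, z) in virtual-world metres
    yaw_pitch: tuple   # viewing direction as (yaw, pitch) in degrees

@dataclass
class Participant:
    name: str
    pose: Pose

    def snap_to(self, other: "Participant"):
        """Jump this user's viewpoint to match another user's."""
        self.pose = Pose(other.pose.position, other.pose.yaw_pitch)

director = Participant("director", Pose((120.0, 2.0, -45.0), (90.0, -10.0)))
dp = Participant("dp", Pose((0.0, 1.8, 0.0), (0.0, 0.0)))
dp.snap_to(director)  # the DP now sees what the director sees
```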
“When that was done, Caleb would work with Sam Maniscalco (lead lighting artist) and they would light the scene for each shot.”
As with live action, the action was covered from different angles in order to provide a selection of takes for editorial. When editor Mark Livolsi had made his selections, the resulting shots were sent back to MPC along with the camera tracking data to finesse into final production quality.
Months of research went into character development. The final designs were sent to artists at MPC, who built them using new proprietary tools for improved simulation of muscles, skin and fur.
“They perfected the nuances of performance and lit it to look absolutely photoreal, but the creative choices of what we’re shooting had already been selected in this looser, live-action virtual space.”
From 12,000 takes of photography, MPC delivered nearly 1,500 shots (170,668 frames, or 119 minutes of final images) to Disney. A further 145 shots were started but omitted along the way.
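At cinema’s standard 24 frames per second, the quoted frame count and running time agree, as a quick check shows:

```python
# Sanity check: does 170,668 frames match the stated 119-minute running time?
frames = 170_668
fps = 24           # standard theatrical frame rate
minutes = frames / fps / 60
# 170,668 / 24 / 60 ≈ 118.5 minutes, which rounds to the quoted 119
```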
Because they wouldn’t be involved in the film’s principal photography, The Lion King’s human actors (including Donald Glover and Seth Rogen) were often asked to perform with each other in the Volume rather than simply reading script pages from a stationary position at a mic. Legato says the stage environment helped them deliver real, physical performances as references for the animators.
“We photographed with multiple Blackmagic Design cameras so the animators could see the intent of the actor,” says Legato. “But when they pause and they look and you see them thinking, you know that that’s what drives the performance. It’s much more informed than just voices only.”
The output from the cameras was routed over more Blackmagic gear so the team could watch playback or drop the footage directly into Avid for editorial.
“If we needed to throw an image onto a big screen so that the actors can get a sense of working with the pre-viz we could do that,” Legato says. “The Blackmagic kit was like a Swiss Army knife, a useful and necessary tool in this process, which fitted together whichever way we needed it.”
DaVinci Resolve was used as a finishing tool and also to apply colour correction to the animation even before it went through the Digital Intermediate process at MPC.
The main virtual camera was modelled on an ARRI Alexa 65 to enhance the film’s epic quality, paired with the Panavision 70 cinema lenses that were used on the reference trip in Africa.
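Matching a virtual camera to a real body is largely a matter of sensor geometry: the horizontal field of view follows from the sensor width w and lens focal length f as fov = 2·atan(w / 2f). A small sketch, assuming the Alexa 65’s approximately 54.12 mm-wide sensor (a published figure, but an assumption here, and a simplification that ignores lens distortion):

```python
import math

SENSOR_WIDTH_MM = 54.12  # ARRI Alexa 65 open-gate sensor width (approximate)

def horizontal_fov(focal_length_mm, sensor_width_mm=SENSOR_WIDTH_MM):
    """Horizontal field of view in degrees for a given lens focal length."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

for f in (24, 50, 100):
    print(f"{f} mm lens -> {horizontal_fov(f):.1f} degrees")
```

Driving the game-engine camera from numbers like these is what lets a virtual lens choice behave like the corresponding physical one.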
But it is the tactility and authenticity of using actual camera devices and the instant feedback into the virtual environment which, Legato believes, gave them the ability to iterate the animation to a greater degree than ever before.
“I don’t know of anybody else doing it,” he says. “Even James Cameron isn’t shooting Avatar with VR in this way.”
“The ability to see in advance what only can be imagined until fully constructed will create better and better films, plays, concerts, and television shows”
The technology is already filtering into TV production, albeit at the most high-end level possible. Disney’s Star Wars spin-off The Mandalorian, created by Favreau, is using a similar set-up, though with the Unreal game engine.
“The ability to see in advance what only can be imagined until fully constructed will create better and better films, plays, concerts, and television shows,” adds Legato. “What you can now preview with great detail can only make for more exciting and original artistic expressions. So, in short, the encouragement to explore will take the advantages of VR to the next level.”