Massive LED walls and game engines were used to speed up the creative process and reduce the time needed in post-production for Disney+ Star Wars spin-off The Mandalorian.


The Mandalorian: Groundbreaking virtual production

Star Wars feature films are shot on massive sets occupying large soundstages at Pinewood in the UK, supplemented with exotic location work in Tunisia or Ireland and copious VFX from the likes of Industrial Light & Magic (ILM).

But for the TV spin-off, Disney+ used a groundbreaking virtual production methodology that radically shrunk the footprint, eliminated location shoots and essentially performed real-time, in-camera compositing on set.

The Mandalorian was shot on an LA stage surrounded by massive LED walls displaying dynamic digital sets. The integrated suite of technologies included a motion tracking system from Profile Studios and Epic Games’ Unreal Engine.

While all these elements have been used in various combinations before, deploying them at the fast pace of episodic television had never been attempted.

“Most of the creative work is done in pre-production instead of post,” David Morin

For example, the art departments on The Meg, Le Mans 66 (Ford v Ferrari), Ad Astra, and Rocketman all used Unreal Engine for previsualisation. The Lion King’s creative team used the Unity engine for several processes including location scouting. Joker and Murder on the Orient Express included scenes lit by video walls.

“This project demonstrates the most multi-faceted use of our technology yet,” says David Morin who heads up Epic’s LA Lab.

In an approach pioneered by The Lion King director and The Mandalorian showrunner Jon Favreau, key creatives including production designer Andrew Jones, visual effects supervisor Richard Bluff and cinematographer Greig Fraser ASC ACS were able to collaborate in VR prior to principal photography in order to choose locations and to block and light scenes.

Natural workflow
“The defining characteristic of this workflow is that most of the creative work is done in pre-production instead of post,” explains Morin.

“If you want a photoreal virtual world on set you have to build it before going on set. That shift requires adapting the VFX workflow so that you can use games engine and virtual reality tools to decide ahead of time where to put the camera and, if there are multiple directors shooting in the same world over a series, they need to agree on continuity.”

Dave Filoni served as The Mandalorian’s executive producer and a director, working with episode directors including Bryce Dallas Howard and Taika Waititi.


On set: Shot on an LA stage surrounded by massive LED walls displaying dynamic digital sets

“The good news in this transition is that it brings workflow back to something akin to traditional filmmaking,” Morin says. “You have to build sets before you can shoot them in live action and now that’s the same in our VFX workflows. Virtual environments have to be built before the deadline for principal photography. This workflow is very natural for filmmakers.”

The location backdrops were drafted by visual-effects artists as 3D models in Maya, onto which photographic scans were mapped. Photogrammetry teams headed to Utah and Iceland among other locales to shoot these plates, which wound up comprising about 40% of the show’s final backdrops. The rest were created as full CG by ILM and on the studio backlot.

Actors in The Mandalorian performed inside the Volume: a 20-foot-high, 270-degree semicircular LED video wall, topped with an LED ceiling, enclosing a performance space 75 feet in diameter.

This is where practical set pieces were combined with digital extensions on the screens. The technology’s real innovation is that when the camera is moved inside the space, the filmmakers have the ability to react to and manipulate the digital content in real time.

Visuals in parallax
“When the camera pans along with a character the perspective of the virtual environment (parallax) moves along with it, recreating what it would be like if a camera were moving in that physical space,” Morin explains.
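Morin’s “parallax” is, in graphics terms, an off-axis (asymmetric-frustum) projection: the virtual set is rendered from the tracked camera’s position towards the plane of the wall, so the image shifts correctly as the camera moves. The sketch below shows that standard construction in Python; the panel corners, camera position and near-plane value are illustrative numbers, not anything from the production.

```python
# A minimal sketch of the idea behind the "parallax" Morin describes:
# render the virtual set from the tracked camera's position using an
# off-axis (asymmetric) frustum aimed at the physical wall. Panel corner
# positions and the camera pose below are illustrative, not production data.
import numpy as np

def off_axis_frustum(eye, pa, pb, pc, near):
    """Frustum extents for a flat screen section seen from `eye`.

    pa, pb, pc: lower-left, lower-right, upper-left corners of the panel
    in world space (metres). Returns (left, right, bottom, top) at `near`,
    following the standard generalised perspective projection construction.
    """
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen distance
    scale = near / d
    left   = np.dot(vr, va) * scale
    right  = np.dot(vr, vb) * scale
    bottom = np.dot(vu, va) * scale
    top    = np.dot(vu, vc) * scale
    return left, right, bottom, top

# Example: a 6 m-wide, 6 m-tall wall section 4 m in front of the origin,
# with the tracked camera offset 1 m to the right of centre.
pa = np.array([-3.0, 0.0, -4.0])   # lower-left corner
pb = np.array([ 3.0, 0.0, -4.0])   # lower-right corner
pc = np.array([-3.0, 6.0, -4.0])   # upper-left corner
eye = np.array([1.0, 1.7, 0.0])    # tracked camera position
print(off_axis_frustum(eye, pa, pb, pc, near=0.1))
```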

By the time shooting began, Unreal Engine was running on four synchronised PCs to drive the pixels on the LED walls in real time. At the same time, three Unreal operators could simultaneously manipulate the virtual scene, lighting, and effects on the walls. The crew inside the LED volume were also able to control the scene remotely from an iPad, working side by side with the director and DP.
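As a rough idea of how a wall of that scale might be divided across synchronised machines, the sketch below splits the pixels into one slice per render node and advances every slice on the same frame tick; the node count echoes the four PCs described above, but the wall resolution and the equal-slice split are assumptions made purely for illustration.

```python
# A rough sketch (not production code) of why synchronised render nodes
# matter: each PC draws one slice of the wall, and all slices must flip
# to the next frame together or seams appear between them.
from dataclasses import dataclass

WALL_WIDTH_PX = 12288     # assumed total horizontal pixels across the wall
WALL_HEIGHT_PX = 2160     # assumed wall height in pixels
NUM_NODES = 4             # four synchronised render PCs, as described above

@dataclass
class WallSlice:
    node: int
    x_start: int
    x_end: int            # exclusive
    height: int

def partition_wall(width, height, nodes):
    """Split the wall into equal vertical slices, one per render node."""
    step = width // nodes
    return [WallSlice(n, n * step,
                      width if n == nodes - 1 else (n + 1) * step,
                      height)
            for n in range(nodes)]

def render_frame(frame_index, slices):
    """Each node renders its slice for the same frame index, then all
    present together (the step a genlock/swap barrier provides)."""
    rendered = [(s.node, frame_index, s.x_end - s.x_start) for s in slices]
    # ...a real system would block here until every node is ready to swap
    return rendered

slices = partition_wall(WALL_WIDTH_PX, WALL_HEIGHT_PX, NUM_NODES)
print(render_frame(0, slices))
```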

“We wanted to create an environment that was conducive not just to giving a composition line-up to the effects, but to actually capturing them in real time, photo-real and in-camera, so that the actors were in that environment in the right lighting — all at the moment of photography,” Fraser explained to American Cinematographer.

Any reflection and refraction of light from the panels bounces off surfaces and behaves as if it were being shot for real on location.

“By contrast on green screen you had to jump through so many hoops to achieve the desired lighting effect,” Morin says.

Lighting for real
Heavy use of green and bluescreen photography had famously blighted Star Wars Episodes I-III.

“Green screen or black box virtual production is a very intellectual process that requires the actors to imagine how things will look and everyone else to figure it out later. If the director changes the backgrounds in post, then the lighting isn’t going to match and the final shot will feel false. Here, suddenly, it’s a very natural thing. The video walls bring us back to making decisions and improvisations on the set.”

The shiny suit of the title character, for example, would cause costly green and bluescreen problems in post-production.


VFX: “If you want a photoreal virtual world on set you have to build it before going on set.”

Instead, the cinematographers (including Barry ‘Baz’ Idoine, who took over from Fraser once he had set the show’s template) were able to work with the reflections of the LED lighting. For example, the DPs could request a tall, narrow band of light on the LED wall that would reflect on Mando’s full suit, much as a commercial photographer might light a wine bottle or a car, using specular reflections to define shape.
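As an illustration of that kind of “light card”, the snippet below builds a soft-edged vertical band of bright pixels on an otherwise dark frame, the sort of image that might be put on the wall purely for its reflection; the resolution, band position and falloff are arbitrary values, not settings used on the show.

```python
# Illustrative only: a 'light card' of the kind the DPs describe -- a tall,
# narrow band of bright pixels on an otherwise dark wall, used purely for
# its specular reflection in the Mandalorian's armour.
import numpy as np

def vertical_light_band(width, height, band_centre, band_width, intensity=1.0):
    """Return an RGB float image that is black except for one soft-edged
    vertical band of light."""
    img = np.zeros((height, width, 3), dtype=np.float32)
    x = np.arange(width)
    # Gaussian falloff so the reflection has a gentle edge, not a hard line
    falloff = np.exp(-0.5 * ((x - band_centre) / (band_width / 2.0)) ** 2)
    img[:] = (intensity * falloff)[None, :, None]
    return img

card = vertical_light_band(width=1920, height=1080, band_centre=960, band_width=120)
print(card.shape, card.max())
```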

“LEDs are emissive surfaces so you can essentially create any illumination pattern that you have in real life,” Morin says.

“It’s not just reflections, it will generate the entire light for the scene. You can also exclude the virtual world from being an effect and only light the real set. Everyone is still learning how to take advantage of these possibilities.”

The set was rigged so that the 90-degree open area behind the cameras could be closed off by two additional flat LED panels, wrapping the set in a complete, lit virtual environment.

“The environments start to become bolder and deeper as the team began to understand the size of the virtual space they were working in,” says Morin. “A 75-ft set is impressive, but the environment can look a thousand times bigger. You have an infinite sense of infinity. You can have a football-pitch-sized spaceship hangar or a desert vista with the sunset hundreds of miles away and the illusion is impressive.”

For the actors, this approach was beneficial: they could relate more readily to their story surroundings, knowing, for instance, where the horizon was, even if the screen was only in their peripheral vision.

“A majority of the shots were done completely in-camera,” Jon Favreau

That said, the illusion will only appear perfect when viewed from the perspective of the motion-tracked camera.

“If you’re not at the camera’s focal point then it looks weird and a distortion of reality. That’s a signature of these kinds of sets.”

Practical elements, such as the fuselage and cockpit of the Mandalorian’s spacecraft Razor Crest, had to be built with attention to their light-reflective properties in order to react accurately to the LED illumination, even when not in shot.

The production camera’s position was tracked by the motion-capture system via infrared cameras mounted around the top of the LED walls. The camera’s coordinates were fed from the tracking system into ILM’s proprietary Stagecraft virtual production software and on into Unreal Engine 4, which rendered the correct 3D parallax for the camera’s position in real time and output the images to the screens.
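Stripped of the specific tools, the per-frame loop described above can be sketched as follows; every function here is a stand-in for illustration, not a real tracking-system, Stagecraft or Unreal API.

```python
# A highly simplified sketch of the per-frame loop the article describes:
# tracked camera pose in, parallax-correct render out to the wall. All
# function bodies are placeholders -- the real chain runs through the
# Profile tracking system, ILM's Stagecraft and Unreal Engine.
import time
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple      # world-space metres, from the infrared tracking volume
    rotation: tuple      # orientation as Euler angles, degrees

def read_tracked_pose(frame):
    """Stand-in for the optical tracking feed (IR cameras above the wall)."""
    return CameraPose(position=(0.1 * frame, 1.7, 0.0), rotation=(0.0, 90.0, 0.0))

def render_parallax_view(pose):
    """Stand-in for the engine render: the virtual set drawn from the
    tracked camera's point of view (see the off-axis frustum sketch above)."""
    return f"frame rendered for camera at {pose.position}"

def present_to_wall(image):
    """Stand-in for sending pixels out to the LED processors."""
    print(image)

for frame in range(3):                 # three frames of the loop
    pose = read_tracked_pose(frame)    # 1. tracking system reports camera pose
    image = render_parallax_view(pose) # 2. engine renders the matching parallax
    present_to_wall(image)             # 3. pixels go out to the wall
    time.sleep(1 / 24)                 # 4. paced here at 24 fps for the sketch
```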

Perfecting workarounds
The whole system also had a delay of seven frames, says Morin (other reports suggest 10 to 12 frames of latency), between the camera and the virtual world, a result of the processing needed to generate content across the software and hardware stack. Though less than half a second, this lag occasionally resulted in the camera moving ahead of the rendered field of view.
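To put those figures in context, a quick conversion (assuming 24 fps capture, which the article does not state) reads:

```python
# Back-of-the-envelope conversion of the reported latency figures,
# assuming 24 fps capture (an assumption; the project's exact frame
# rate is not stated in the article).
FPS = 24
for frames in (7, 10, 12):
    print(f"{frames} frames ≈ {frames / FPS * 1000:.0f} ms")
# 7 frames  ≈ 292 ms
# 10 frames ≈ 417 ms
# 12 frames ≈ 500 ms
```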

“We will reduce the lag further and there are a number of targets we can optimise and eliminate,” says Morin. “It’s an ongoing process.”

Video walls themselves are far from cheap, and producers wanting to rent them will have to trade off performance against budget. Even with The Mandalorian’s reported $100 million budget, systems integrator Lux Machina advised less expensive, lower-resolution LED panels for the ceiling portion of the set, since these were mostly used for light reflection and rarely seen in camera.

Advanced LED panels, even the Roe Black Pearl BP2s used on The Mandalorian, may not have the fidelity for extreme close-ups and risk displaying moiré patterns.

“We forget that when filmmakers shoot live action it is rarely perfect and they’ve always had to adapt to what they find on the day,” Morin says. “The Mandalorian team wanted to experiment with the system and their expectations were low in terms of final shots they’d get from the wall. They were prepared to replace most pixels in post but knew it would at least give them a template to edit and they wouldn’t need post viz.”


Favreau: “A majority of the shots were done completely in-camera”

He adds, “But because they adapted to the conditions on set they ended up with a staggeringly high shot count of files good enough for the final picture.”

In fact, the virtual production workflow was used to film more than half of The Mandalorian season 1, enabling the filmmakers to capture a significant number of complex VFX shots.

“A majority of the shots were done completely in-camera,” Favreau confirms. “And in cases where we didn’t get to final pixel, the post-production process was shortened significantly because we had already made creative choices based on what we had seen in front of us.”

Potential issues with moiré were ameliorated by shooting with the large-format Arri Alexa LF camera and Panavision Ultra Vista full-frame anamorphic lenses.

“It allows the inherent problems in a 2D screen displaying 3D images to fall off in focus a lot faster,” Fraser explained, “So the eye can’t tell that those buildings that appear to be 1,000 feet away are actually being projected on a 2D screen only 20 feet from the actor.”
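Fraser’s point can be sanity-checked with a rough depth-of-field calculation; the focal length, stop and distances below are assumed for illustration rather than taken from the production.

```python
# A rough depth-of-field check of Fraser's point, with assumed numbers:
# how big is the blur circle for a wall roughly 6 m (20 ft) behind an
# actor the lens is focused on?
def background_blur_mm(focal_mm, f_number, focus_m, background_m):
    """Diameter of the blur circle (on the sensor, in mm) for a
    background at `background_m` when the lens is focused at `focus_m`."""
    f = focal_mm / 1000.0
    m = f / (focus_m - f)                       # magnification at focus distance
    blur_m = (f * m / f_number) * abs(background_m - focus_m) / background_m
    return blur_m * 1000.0

actor = 3.0                 # focus distance to the actor, metres (assumed)
wall = actor + 6.0          # LED wall ~20 ft (~6 m) further back
blur = background_blur_mm(focal_mm=75, f_number=2.8, focus_m=actor, background_m=wall)
print(f"wall blur ≈ {blur:.2f} mm on sensor")   # ≈ 0.46 mm
# Against a large-format 'acceptable sharpness' circle of roughly 0.03 mm,
# the wall is heavily defocused, so its pixel structure and flatness melt away.
```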

Where the latency between the camera-position data and Unreal’s rendering became a real issue, the team resorted to conventional green screen.

“We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in engine,” Favreau explained at Siggraph.

“For certain types of shots, depending on the focal length and shooting with anamorphic lensing, we could see in camera the lighting, the interactive light, the layout, the background, the horizon. We didn’t have to mash things together later. Even if we had to up-res or replace them, we had the basis point and all the interactive light.”

Post-production was mostly about refining creative choices that they were not able to finalise as photo-real on the set.

Morin says that as production progressed the team became more comfortable using the technology and began to design with it in ways that are subtly apparent in later episodes.

Season 2 production began in October 2019.