LED screens, more commonly found as backdrop to live music acts or as digital signage at sports venues, are now the hottest property in visual effects production.  


LED screens: Have advanced beyond just being a lighting source 

Source: Jonathon Olley

The use of video walls in film and TV goes back at least a decade - they were used as a light source, casting illumination onto Sandra Bullock and George Clooney in Gravity. More advanced versions playing pre-rendered sequences were deployed by ILM on Rogue One: A Star Wars Story and its follow-up Solo, and on Kenneth Branagh’s 2017 version of Murder on the Orient Express.  

Today, the most sophisticated set-ups combine LED walls and ceilings with camera tracking systems and games engines to render content for playback not only in realtime but in dynamic synchronicity with the camera’s viewpoint. The result allows directors, actors and cinematographers to stage scenes with far greater realism than a green or blue screen and with more chance to make decisions on set. 
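In outline, the render loop of such a set-up looks something like this (a minimal sketch; the object names are illustrative, not any vendor's actual API):

```python
# A minimal sketch of the per-frame loop in an LED volume pipeline.
# The object names here are illustrative, not any vendor's actual API.
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple        # tracked studio-camera position in stage space
    rotation: tuple        # tracked orientation, e.g. as a quaternion
    lens_fov_deg: float    # current field of view reported by the lens encoder

def frame_tick(tracker, engine, led_wall):
    pose: CameraPose = tracker.read_pose()   # 1. the tracking system reports where the camera is
    engine.set_virtual_camera(pose)          # 2. the games engine camera is moved to match it
    frame = engine.render()                  # 3. the environment is re-rendered from that viewpoint
    led_wall.display(frame)                  # 4. the new pixels are pushed to the wall within the frame
```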

“On Gravity we were just using LED as a light source for principal photography but all the pixels were fully replaced in post,” says Richard Graham, CaptureLab supervisor at vfx facility Framestore. “Now we can shoot the screen as though it is a real environment or set extension and almost deliver that as the final image.”  

The use of LEDs as a digital backlot forms a vital part of virtual production, the transformative suite of technologies allowing directors, cinematographers and every other department to see, and often manipulate in real time, the physical set and actors composited with digital images and creatures. 

“The big change has come with more powerful GPUs combined with games engines providing the software for real-time rendering and ray tracing,” says vfx supervisor Sam Nicholson, ASC, who founded and heads postproduction house Stargate Studios. “When you put that together with LED walls or giant monitors we think that at least 50 per cent of what we do on set can be finished pixels.” 

To make Rogue One in 2014/15, ILM created CG animated backgrounds to populate LED screens that surrounded the set. But the displays at that time didn’t have the fidelity to be used for much more than lighting. 

Now the tech has advanced such that pixel pitches (the distance in millimetres from the centre of one pixel to the centre of the adjacent pixel) are narrow enough for the images to be photographed. What’s more, the panels are capable of greater brightness, higher contrast ratios and showing 10-bit video.  
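As a back-of-envelope sketch (the wall size and pitch below are assumed figures, not any particular product's specification), the pitch sets both the wall's native resolution and roughly how far back the camera must be before the grid stops resolving:

```python
import math

# Back-of-envelope numbers for a hypothetical LED wall (illustrative only).
pitch_mm = 2.8          # pixel pitch: centre-to-centre pixel spacing in millimetres
wall_width_m = 20.0     # assumed wall width
wall_height_m = 6.0     # assumed wall height

# The wall's native resolution follows directly from the pitch.
cols = int(wall_width_m * 1000 / pitch_mm)
rows = int(wall_height_m * 1000 / pitch_mm)
print(f"wall resolution: {cols} x {rows} pixels")

# Distance at which one pixel subtends roughly one arcminute (a common figure
# for visual acuity), i.e. where the pixel grid stops being resolvable.
one_arcmin_rad = math.radians(1 / 60)
blend_distance_m = (pitch_mm / 1000) / math.tan(one_arcmin_rad)
print(f"grid blends away beyond roughly {blend_distance_m:.1f} m")
```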

Games engines – from Epic, Unity and Notch – have also matured, gone mainstream and become easier to use, while GPU processing from Nvidia and AMD has become exponentially faster, enabling real-time compositing. 

“In the last year production has become the fastest growing space,” reports Tom Rockhill, chief sales officer at disguise, which makes and markets LED displays and servers for live events and fixed installations. “There’s an inevitability about demand from film and TV.” 


ITV: Used disguise software to create an LED video wall backdrop for the Rugby World Cup

Rugby’s LED backs 
For ITV Sport’s presentation of the Rugby World Cup from Japan last summer, disguise worked with technical partner Anna Valley to create a three-screen LED video wall backdrop of a Japanese cityscape for the hosts sitting in Maidstone Studios. The screens responded to on-set camera movements so that the backdrop behaved like a window onto a real Japanese city, delivering a convincing panoramic view.  

Rockhill explains that, to achieve the effect, positional data from stYpe RedSpy trackers fixed to the live studio cameras was fed into a disguise gx 2 server running Notch software, which rendered the cityscape from the perspective of the cameras and fed it back to the LED ‘window’ in real time. 

“It gave the illusion of perspective, so when the camera moves to the left the image moves to the right and it looks like you’re looking out of a window,” he says. “The disguise software translates the physical data from the camera into the virtual real-time environment running on the games engine and pushes the correct pixels to the correct video surface (LED module) out of the disguise hardware.” 
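The underlying maths is essentially an off-axis, or ‘window’, projection. A sketch of the geometry (illustrative only, not disguise’s or Notch’s implementation):

```python
def offaxis_frustum(eye, screen_w, screen_h, near):
    """Asymmetric ('window') frustum for a camera at `eye`, looking at a flat
    LED wall centred on the origin. Returns left/right/bottom/top extents at
    the near plane, the values a projection matrix would be built from.
    Illustrative geometry only, not any media server's implementation."""
    ex, ey, ez = eye                  # tracked camera position in wall space (metres)
    scale = near / ez                 # similar triangles: project wall edges onto the near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

# As the tracked camera drifts left, the frustum skews the other way, so the
# rendered cityscape appears to slide right: the 'window' illusion Rockhill describes.
for x in (-1.0, 0.0, 1.0):
    print(x, offaxis_frustum((x, 0.0, 4.0), screen_w=6.0, screen_h=3.0, near=0.1))
```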

Disguise has made two sales of similar set-ups to UK broadcasters and says the most popular application is sports. 

Film and TV cameras themselves require little modification: tracking sensors attached to the camera determine where it is physically and where it would exist in a virtual space. Other vendors here include Mo-Sys and Ncam. 

While live broadcast use of the technology will typically rely on pre-rendered visuals, for high-end dramatic production like The Mandalorian, high-resolution video can be output at up to 60 frames a second, with different lighting set-ups and digital backplates able to be swapped, tweaked and reconfigured at will. 

“This is aided by the vogue for shooting with large format cameras,” explains Graham. “The pixel pitch is still not narrow enough that the gaps between the pixels aren’t noticeable on certain shots. The solution is to use the shallow depth of field of large format so you blur the background out of focus.” 
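A simple thin-lens calculation illustrates the point. With assumed figures for lens and volume (not taken from any real production), every out-of-focus point in frame blends light from a patch of wall many pixels wide:

```python
def background_blur_mm(aperture_mm, subject_m, wall_m):
    """Diameter, measured at the wall, of the patch of background that blends
    into a single out-of-focus point when the lens is focused on the subject.
    Simple thin-lens geometry with illustrative numbers, not production data."""
    return aperture_mm * (wall_m - subject_m) / subject_m

focal_mm, f_stop = 75.0, 2.0               # assumed large-format lens choice
aperture_mm = focal_mm / f_stop            # entrance pupil diameter
blur = background_blur_mm(aperture_mm, subject_m=3.0, wall_m=7.0)
pitch_mm = 2.8                             # assumed pixel pitch of the wall
print(f"{blur:.0f} mm of wall blends into each point, covering ~{blur / pitch_mm:.0f} pixels")
```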

Bringing vfx on set helps the crew and cast feel more connected to the end result. Actors can see all the CG elements complete with reflections and lighting integrated in the camera, eliminating the need for green-screen setups. 

“It enables fully immersive production environments, so the actor or on-screen talent doesn’t have to look at a reference monitor or perform in front of green screen,” Rockhill says. 

No more green screen 
The significant value to a DP is that they’re not shooting against green screen and trying to emulate the light that will be comped in later – a loss of control that has been a cause of much angst among cinematographers. With this process, they can realise their creative intent on set, just as it was before the introduction of heavy visual effects. 

“Green screen does still have its uses,” says Graham. “One technical problem with screens is latency. There is a time delay between camera and the image being sent from the render engine. If you move the camera too quickly you will see a lag.” 
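A rough latency budget, using assumed per-stage figures rather than measured ones, shows how the delay stacks up and why a fast move exposes it:

```python
# Rough latency budget for a tracked LED volume (assumed, illustrative figures).
fps = 24
stages_ms = {
    "camera tracking": 8,
    "render engine": 42,            # roughly one frame of work at 24fps
    "LED processing / scan-out": 25,
}
total_ms = sum(stages_ms.values())
print(f"end-to-end delay: ~{total_ms} ms = {total_ms * fps / 1000:.1f} frames of lag")

# On a fast pan the background trails the camera by (pan speed x delay).
pan_deg_per_s = 60
print(f"a {pan_deg_per_s} deg/s pan leaves the wall ~{pan_deg_per_s * total_ms / 1000:.1f} degrees behind")
```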


The Mandalorian: Use of VFX on set allows cast and crew to feel more connected to the end result

Though it is only a few frames, even The Mandalorian had to find workarounds. One was to deploy camera language from the original Star Wars, which was largely static or used soft pans. Another trick was to render extra pixels around the viewing frustum [the field of view of a perspective virtual camera] to give themselves a margin for error. 
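The overscan idea can be sketched with illustrative numbers: render a margin beyond the lens’s field of view so that a few frames of lag during a pan never exposes un-rendered wall at the edge of frame:

```python
import math

def overscan_fraction(hfov_deg, pan_deg_per_s, lag_ms):
    """Fraction of extra image width to render outside the camera's frustum so
    that `lag_ms` of delay during a pan never exposes un-rendered wall at the
    edge of frame. Illustrative geometry only."""
    drift_deg = pan_deg_per_s * lag_ms / 1000              # how far the view moves before the render catches up
    half_w = math.tan(math.radians(hfov_deg / 2))          # half the frame width at unit distance
    drift_w = math.tan(math.radians(hfov_deg / 2 + drift_deg)) - half_w
    return drift_w / (2 * half_w)

# Assumed figures: a 40-degree lens, a gentle 20 deg/s pan, ~75 ms of system delay.
print(f"overscan needed each side: ~{overscan_fraction(40, 20, 75):.1%} of frame width")
```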

“If you shoot handheld camera work it would be hard to make the images line up in a timely fashion,” notes Graham.  

While good at providing environmental light, LED displays are less effective at reproducing a hard, bright light source such as strong sunlight. “If the scene requires that, you have to bring in some form of external lighting to light the subject correctly,” Graham says. 

Nonetheless, the virtual production workflow is removing the boundaries between story, previs, on-set work, postvis and postproduction.  

“Essentially it front-loads decision making,” says Graham. “For a long time, live action and vfx have problem-solved quite late in the filmmaking process. When you create media for screens well in advance, the key decisions have to be made quite early on, with the advantage for director and DP of being able to see close to the final result in-camera rather than waiting months for post to see if their vision has been realized.” 

Fix it in prep 
Salvador Zalvidea, VFX supervisor with Cinesite, says: “Most of the exciting technologies we are seeing emerge will be done in real-time and on set, shifting the visual effects process to preproduction and production. This will allow creatives to make decisions on set. We will probably still require some visual effects to be done or refined after the shoot, but iterations will be much quicker, if not instantaneous.” 

This collapse in timescales, particularly on the back end of projects, is a boon for producers scrambling to turn around episodic drama. Nor does the show have to have a fantasy or science-fiction storyline. In principle any location could be virtualized, from the interior of Buckingham Palace to the exteriors of Chernobyl. 

The technology could also be used as back projection behind characters travelling in cars, but unlike the century-old cinematic technique, this time with accurate lighting and reflections playing across the windows and shiny metal. Second units don’t have to leave the studio. 

The screen content itself can be synthetic or photographed on real locations, as photorealistic or exotic as you need. “As long as you plan and design it so it can be rendered successfully from any viewpoint it should be fine,” Graham says. 

While the system is being used on the next Bond film, No Time To Die, it is also being deployed by Netflix and on HBO’s production of comedy series Run, co-created by Fleabag duo Vicky Jones and Phoebe Waller-Bridge. 

The latter uses a digital backlot system designed by Stargate Studios on set in Toronto.  “The challenge is 350 visual effects per episode, 4000 shots in ten weeks,” says Nicholson. “We synchronize it, track it, put it in the Unreal Engine, and it looks real and shouldn’t need any post enhancements. The entire power of a post-production facility like Stargate is moving on set. We now say fix it in prep rather than fix it in post.” 

The financial and creative benefits of virtual production are only just being explored. One of the next key steps is greater integration of cloud for the instant exchange, manipulation and repurposing of data. 

“As vfx companies start to create libraries of photo scanned environments, materials and objects we will get to a point where it’s going to be much easier to create the environments for screens,” says Graham. “This will start to cut down on the amount of prep needed before a shoot. And that means you can be more fluid in the process and allow for more improvisation and more creative iteration closer to the start date.” 

In broadcast, producers are already working with augmented reality to bring graphics to the foreground of static virtual set environments and using extended green screen backgrounds to display graphics rendered out of a games engine. 

“The next step is to add real-time environments running on a surface that the talent can actually see – either by projection or LED – and to combine all the elements together,” says Rockhill. 

LED panels are also emerging in flexible and bendable forms, permitting the design of curved and concave shapes outside the conventional rectangular frame. Disguise’s new studio, currently being built at its London headquarters, will feature curved surfaces to make it easier to blend the edges of a virtual environment. 

“Rather than just a virtual set that looks pretty, we are working to evolve the technology to allow for interactivity with real-time live data,” says White Light’s technical solutions director Andy Hook. “We are also likely to see increased haptic feedback, skeletal tracking, frictionless motion capture – things that allow us to track the people within the virtual space and create more innovative use of the tools and technologies to create more immersive and engaging content.” 


Joker: Used LED screens to ensure a realistic finish to a scene in a subway car set

Grounding Joker in reality 
For a pivotal scene in Joker, in which Arthur Fleck murders three Wall Street bankers on the Gotham City subway, DP Lawrence Sher ASC wanted to shoot as practically as possible. 

One option was to shoot for real on the New York subway but even if they arranged to shut down the tracks – not easy or cheap – Sher felt the complex logistics of the sequence would be limiting. 

An alternative was to shoot against green screen and add the backgrounds in later, but this risked losing the ability to light the scene as he wanted, and it wouldn’t appear as real to the actors.  

The solution was to build a subway car set, put the actors inside and surround the windows with LED screens displaying the movement of the train. Sher could control the lighting display, switching between flickering fluorescent light and the bright white of a passing subway station, to achieve the heightened realism that he and director Todd Phillips wanted. 

“Suddenly, you’re not guessing where the background is,” he explains to rental house PRG, whose LED screens and servers were used on the show. “You aren’t coordinating that background later, but you are able to photograph it in real-time and make lighting decisions as you photograph, that you can’t do when you’re shooting blue or green screen. 

“The fact that the airbags were moving a little bit and the world outside was going by, and when the lights flickered off, you can actually see a subway car or station passing by, as opposed to just blue screen, made it seem so real. I’ve talked to people who thought we went out on a subway and just drove a train up and down the tracks.”