Interactive drama, games engines, holograms, aerial spectaculars and digital cosmetics showed that the blend of content and tech innovation has never been stronger

Black Mirror: Bandersnatch, released on December 28 2018, was heralded as a landmark achievement not just for Netflix but for the future of interactive programming. The script for the streamer’s first interactive title for adults ran to 157 pages, divided into eight sections and 250 segments, generating nearly five hours of footage and more than a trillion unique permutations of the story. The episode won two Emmys, including Outstanding TV Movie.


Bandersnatch heralded a landmark for interactive programming

Source: Netflix

“We developed special technology to enable audience interaction but that had to work in concert with our engineering and product teams and with editorial and post,” Sean Cooney, Netflix’s director of Worldwide Post Production, told IBC365.

“This involved multiple departments working to create a unique end-to-end experience and figuring out how the branching of storylines can work. It meant the user interface guys talking with [writer Charlie Brooker] about the script and with every department in between that touches the content.”
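Netflix has not published the internal format behind that branching, but conceptually the structure is a directed graph of footage segments joined at choice points. A minimal sketch in Python (segment names and fields are illustrative, not Netflix’s actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    """One pre-rendered chunk of footage plus the choice it ends on."""
    video: str                                   # asset identifier, hypothetical
    choices: dict = field(default_factory=dict)  # choice label -> next segment id

# A toy three-step graph; Bandersnatch used around 250 segments.
story = {
    "breakfast": Segment("breakfast.mp4",
                         {"Sugar Puffs": "office", "Frosties": "office"}),
    "office":    Segment("office.mp4",
                         {"Accept": "ending_a", "Refuse": "ending_b"}),
    "ending_a":  Segment("ending_a.mp4"),
    "ending_b":  Segment("ending_b.mp4"),
}

def play(segment_id: str, pick) -> None:
    """Walk the graph, asking `pick` to resolve each choice point."""
    while True:
        seg = story[segment_id]
        print(f"playing {seg.video}")
        if not seg.choices:
            return                               # reached an ending
        segment_id = seg.choices[pick(list(seg.choices))]

# Path counts explode combinatorially: roughly 40 binary choice
# points already give 2**40 ≈ 1.1 trillion distinct routes.
play("breakfast", pick=lambda options: options[0])
```

The combinatorics explain the trillion-permutation figure: branching compounds, so even a modest number of choice points multiplies into an astronomical number of possible viewings.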

The smartest move Brooker and co made was to frame the story around a path-choosing adventure game. The importance of this meta-narrative, which included a conspiracy theory about being part of a Netflix episode, has only grown with subsequent, less successful attempts such as Bear Grylls’ factual series You vs Wild. Netflix’s experiments with the genre include an interactive special of Unbreakable Kimmy Schmidt due next year.

The BBC also made strides towards a personalised broadcast experience, launching an interactive version of its online technology show Click in July. Later in the year it showed off the latest developments in object-based media, which has the potential to become the de facto way we experience personalised programming.

One scenario sketched out by BBC R&D is being able to direct the plot of His Dark Materials as it unfolds. This would be possible by reassembling the digital assets already created for the series on the client’s device, using the computational capability of the device itself, cloud servers or servers at the edge of the network.
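Object-based media treats a programme not as one fixed stream but as a bundle of assets plus rules for assembling them at the point of consumption. A speculative sketch of the client-side placement decision described above (classes and thresholds are invented for illustration, not BBC R&D’s actual stack):

```python
from dataclasses import dataclass

@dataclass
class MediaObject:
    """One independently delivered asset within a programme."""
    name: str
    kind: str          # "video", "audio", "cg_scene", ...
    gpu_cost: float    # rough render cost, arbitrary units

@dataclass
class Device:
    gpu_budget: float  # what this client can render per frame
    edge_available: bool

def place_render(obj: MediaObject, device: Device) -> str:
    """Decide where each object is rendered: on the device itself,
    at a network-edge server, or in the cloud."""
    if obj.gpu_cost <= device.gpu_budget:
        return "device"
    return "edge" if device.edge_available else "cloud"

scene = [
    MediaObject("dialogue", "audio", gpu_cost=0.1),
    MediaObject("backdrop", "video", gpu_cost=1.0),
    MediaObject("daemon", "cg_scene", gpu_cost=8.0),  # viewer-created creature
]
phone = Device(gpu_budget=2.0, edge_available=True)

for obj in scene:
    print(obj.name, "->", place_render(obj, phone))
# dialogue -> device, backdrop -> device, daemon -> edge
```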

“In a fully virtualised scene which has been produced with digital puppeteering and CGI it will be possible for the viewer to place themselves in the middle of a battle scene with a daemon of their own creation and see all around them,” explains BBC R&D project engineer Rajiv Ramdhany. “This is the holy grail in terms of object-based interactive narrative.”

VR: the new filmmaking tool
Consumer VR may have had a reality check in 2019 (VR producer Jaunt XR was one casualty) but its use as a collaborative production tool continued to advance.

Three films exemplify the approach. In John Wick Chapter 3 – Parabellum, cinematographer Dan Laustsen used VR to light a climactic set-piece. By viewing a virtual 3D version of the room before the set was actually constructed, he could block the scene, select lenses and even stage fight sequences.

Toy Story 4 was the first Pixar animation to deploy VR to aid in design and location planning. “Sets aren’t static or two dimensional but immersive. We can walk around it,” director of photography Patrick Lin told IBC365. “We can trial lenses. We can add or remove props. We were able to visualise scale between human and toy perspectives. Being able to bring a director into the space to get a sense of a scene is a huge advantage.”


Scenes for The Lion King were mapped using an HTC Vive VR headset

The year’s biggest boundary-pusher and a strong Oscar contender is The Lion King. Director Jon Favreau and cinematographer Caleb Deschanel were able to closely approximate the experience of shooting a live-action movie within a VR environment. This included loading highly detailed CG recreations of African landscapes, rendered in Unreal Engine, to create a virtual set.

The production even created physical simulations of Steadicams, cranes and dollies in order for Deschanel to ‘feel’ feedback as he moved cameras in the virtual world.

“VR allows you to walk around the CG world like you would on a real set and put the camera where you want,” VFX supervisor Rob Legato explained to IBC365. “The physicality of it helps you to psychologically root yourself. You know where the light goes, you know where the camera goes, you know where the actors are – and all of a sudden you start doing very natural camera work because you’re in an environment that you’re familiar with.”

MPC, the VFX facility behind the film, has adapted the technologies into a virtual production platform called Genesis.

“Industry advances in this area are collapsing the timeline between pre-visualisation and post,” says Graham Bell from MPC R&D. “The idea of seeing an image rendered from a real-time technology making it to the final frame would have been improbable a few years ago.”

Describing it as a “multiplayer VR filmmaking game”, Favreau went on to use the techniques to produce The Mandalorian for Disney+, telling The Hollywood Reporter: “It allows us to make the movie together virtually before we ever shoot the episodes.”

Holograms materialise
Satirised by Charlie Brooker in the Black Mirror series 5 episode ‘Rachel, Jack and Ashley Too’, the use of holograms to simulate the performance of artists living or dead came to the fore in 2019, and is likely to prove less controversial in time.

In June, the hologram of a guitarist performing remotely in a studio was beamed live to a stage in Bucharest alongside the live ‘physical’ presence of rock band Vita de Vie. The event was a showcase for Vodafone Romania and was live streamed on Facebook, but what made this noteworthy – and a world first – was the use of 5G connectivity to cut the cost, technical constraints and setup time involved.

Dejero provided mobile transmitters and receivers to deliver video links from the studio over a Huawei router and Vodafone’s network, while Musion 3D projected the hologram on stage. The quality of the video stream (none of the stuttering degradation of the Elvis hologram depicted in Blade Runner 2049) and 5G’s low latency, which let the remote guitarist interact with band mates and audience, throw open applications in live entertainment and theatre.
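To see why latency is the gating factor, consider that musicians playing together typically cope with around 20-30ms of delay, roughly the lag of standing ten metres apart on stage. A back-of-envelope budget (all figures indicative, not measurements from the Bucharest show):

```python
# Rough one-way latency budget for a remote musician, in milliseconds.
# The budget comes from physics: 10 m apart on a real stage means
# 10 / 343 m/s ≈ 29 ms of acoustic delay, which players tolerate.
budget_ms = 30

pipeline_ms = {
    "camera + capture":    5,
    "encode":              8,
    "5G radio link":       5,   # vs roughly 30-50 ms typical on 4G
    "decode + projection": 8,
}
total = sum(pipeline_ms.values())
print(f"total {total} ms, headroom {budget_ms - total} ms")
```

On 4G, the radio link alone would eat most of the budget; 5G leaves room for the rest of the chain.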

Shortlisted for an IBC Innovation Award for content creation, it won an IABM BAM Award at IBC2019.

In another first, last month, Brazilian singer Laura Rizzotto released her new single accompanied by a ‘holographic’ video of her performing it. Fans can view her in augmented reality anywhere they point their phone.

Rizzotto’s hologram was pre-recorded in a volumetric capture studio in LA. Nonetheless, as high-capacity broadband networks (both 5G and fibre) are built out and next-gen compression technologies like MPEG’s VVC (Versatile Video Coding) come on stream, the opportunity for content producers to stage live mixed-reality shows with holograms of actors or artists begins to materialise.

A cautionary note is the failure of Red Digital Cinema’s Hydrogen smartphone, pulled from development in October. Intended as part of a wider holographic capture system, the device was criticised for its holographic display and its cost, but it may just go down in history as premature.

Games engines: Bringing the virtual set to life
Real-time virtual set production has begun to deliver eye-catching presentations for live on-air broadcast.

“What really makes the difference for viewers is being unable to tell whether the images they are watching are real videos or digital renders,” says Jesus Sierra, marketing executive for Brainstorm. “Creating such realism requires the background scenes to have an extremely high render quality, which can only be achieved using hyper realistic rendering technology and carefully designed sets to create photorealistic images.”
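Beneath the hyper-realistic rendering, the basic mechanics of a virtual set are still keying and compositing: the presenter is keyed off the green screen and layered over the engine’s output. A bare-bones offline composite in Python with OpenCV (file names hypothetical; production systems do this per frame in dedicated hardware with far more sophisticated keyers):

```python
import cv2
import numpy as np

# Load a green-screen camera frame and a rendered background (hypothetical files).
frame = cv2.imread("presenter_greenscreen.png")
backdrop = cv2.imread("engine_render.png")
backdrop = cv2.resize(backdrop, (frame.shape[1], frame.shape[0]))

# Key out green in HSV space; thresholds need per-studio tuning.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
green = cv2.inRange(hsv, (40, 60, 60), (80, 255, 255))

# Composite: rendered background where the key matched, presenter elsewhere.
mask = green.astype(bool)
out = frame.copy()
out[mask] = backdrop[mask]
cv2.imwrite("composite.png", out)
```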

Fox Sports’ virtual studio in North Carolina, which launched in February, is among the brightest. Pressed straight into action for coverage of Daytona Speedweeks and NASCAR, the studio’s photorealistic capabilities meant Fox could appear to lift a race car through the floor and bring an actual location into the studio – creative executions which made it an IBC Innovation Awards finalist.


MOTD VR Studio 1

Among the benefits of the technology is the ability to pop up a full virtual studio in a day. Match of the Day has used Dock10’s games engine-driven virtual studio since the start of the 2019-20 soccer season, sharing it with Blue Peter during the week. Dock10 has recruited developers from the games industry to build sets in Epic Games’ Unreal Engine 4.

“Why go to the time and expense of building a physical version of Downing Street for an election special when you can buy a photoreal asset of No.10 online for US$200?” asks Andy Waters, head of studios at Dock10, who says the facility has tested the idea.

One huge improvement this year has been the addition of real-time ray tracing using powerful graphics renderers like NVIDIA RTX cards. Zero Density uses these in the Fox Sports solution so that, for example, video screens on virtual sets realistically reflect onto other parts of the set.

Brainstorm pairs UE4 with its InfinitySet to integrate elements into the final scene that are alien to the game engine, like 3D data-driven motion graphics, charts, lower-thirds and tickers.

“Now it is possible to create scenes that are indistinguishable from reality, which provides excellent new possibilities for enhancing storytelling, and this seems to be the area with more room for improvement,” says Sierra.

Drones bring wildlife closer
Natural history filmmakers are now capturing wildlife in 8K. Bristol’s Icon Films shot the feature-length documentary Okavango for NHK in Botswana on Red cameras at 8K; the BBC’s Natural History Unit (NHU) also acquired the bulk of Seven Worlds, One Planet at 7K and 8K using Red Helium sensors for a 4K UHD HDR deliverable.

It is the advance of drone technology, though, which has arguably had more impact.

“We hit a perfect storm making Seven Worlds,” says Colin Jackson, senior innovation producer, NHU. “On Planet Earth II we used octocopters or hexacopters and high-end cameras, but these were heavy, with restrictive flight times, complex to set up and needing very skilled pilots. It meant you could only get a certain type of shot with them.”

For Seven Worlds, teams in the field were able to use DJI Inspire 2 drones with Zenmuse X5S 4K and Super35mm X7 cameras, as well as bespoke UAVs with low-light cameras. The lighter weight and greater simplicity of operation meant faster turnaround times and flights lasting up to 25 minutes.

“The speed these drones could be deployed in the air was often key to capturing the behaviour we were after and bringing a fresh perspective to the series,” he says. “Smaller drones are easier to negotiate the use of in foreign territories. We believe we were the first foreign company to be given official permission to film with drones around Lake Malawi.”

UAV filming is set to benefit from even greater image quality, longer focal-length lenses and longer-life battery technology using hydrogen fuel cells.

Among the never-before-seen sequences captured by drone for the series were hyenas in the Namib desert; a mass shark aggregation filmed off the Australian coast; and a lynx hunting a snowshoe hare in North America.

Rise of the digital actor
Martin Scorsese’s The Irishman will go head-to-head with Gemini Man and Avengers: Endgame for a Best VFX Oscar after each advanced the art of artificially de-aging actors.

The elixir of youth is far from new: Brad Pitt, Kurt Russell, Ian McKellen, Colin Firth, Jeff Bridges and Michelle Pfeiffer have all had it applied, but the results have often been derided for failing to bridge the uncanny valley that divides believable from fake.


Will Smith’s face was captured and de-aged for Gemini Man

That’s changed now that such techniques can carry an entire movie.

Roughly 200 shots in Avengers: Endgame involved de-aging Michael Douglas or aging Chris Evans, for which facility Lola used a mix of make-up and VFX compositing. Lola was also behind the digital cosmetics on Samuel L. Jackson as Nick Fury in Captain Marvel, using scans of the actor’s face and his appearance in Pulp Fiction to digitally re-sculpt him whilst retaining the emotional intent of his performance.

Weta created a full-body digital replica of a 23-year-old Will Smith for Gemini Man, in which his ‘junior’ clone had to act alongside the 51-year-old real deal. What’s more, it had to be achieved at such fidelity that, when played back through the looking glass of 4K at 120 frames a second, the audience would sustain their suspension of disbelief.
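The arithmetic shows why that format is so unforgiving: 120fps is five times the temporal sampling of standard 24fps cinema, with a raw data rate to match, and every extra frame is another chance for a digital face to betray itself. A back-of-envelope calculation:

```python
# Raw (uncompressed) throughput of 4K at 120 fps, back of the envelope.
width, height, fps = 3840, 2160, 120
bytes_per_pixel = 3                       # 8-bit RGB; cinema pipelines use more
raw_rate = width * height * fps * bytes_per_pixel
print(f"{raw_rate / 1e9:.1f} GB/s raw")   # ≈ 3.0 GB/s, vs ≈ 0.6 GB/s at 24 fps
print(f"{fps // 24}x the frames of 24 fps cinema")
```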

To render septuagenarian screen legends Robert De Niro, Joe Pesci and Al Pacino several decades younger in The Irishman, DP Rodrigo Prieto devised a rig comprising two ‘witness’ cameras either side of the main camera. This allowed the actors to perform naturally without wearing any obvious markers, with the facial data undergoing extensive treatment at ILM. The actors’ voices were also doctored, by shifting pitch and removing intakes of breath, to reflect their younger selves.
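ILM’s pipeline is proprietary, but the audio half of that trick can be approximated with off-the-shelf tools. A crude sketch using the librosa library (the semitone value, file names and the silence-based breath trim are guesses for illustration, not ILM’s method):

```python
import librosa
import numpy as np
import soundfile as sf

# Load a dialogue recording at its native sample rate (hypothetical file).
y, sr = librosa.load("deniro_line.wav", sr=None)

# Shift pitch up ~1.5 semitones; younger voices tend to sit slightly higher.
young = librosa.effects.pitch_shift(y, sr=sr, n_steps=1.5)

# Crude stand-in for breath removal: keep only segments comfortably above
# the noise floor, dropping quiet intakes of breath between phrases.
intervals = librosa.effects.split(young, top_db=30)
young = np.concatenate([young[start:end] for start, end in intervals])

sf.write("deniro_line_young.wav", young, sr)
```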

LA facility Gradient Effects is touting its AI-driven tool Shapeshifter as a way for TV producers to de-age performers at a fraction of the normal time and cost. An episode of HBO comedy The Righteous Gemstones was a case in point, with actor John Goodman made to appear almost as he looked in the heyday of Roseanne.