Avatar pushed technical boundaries and broke commercial records in 2009 by bringing digital 3D into the mainstream. Can history repeat itself with James Cameron’s four sequels currently in production?

A decade ago, Avatar defied Hollywood skeptics, smashed box office records, catalysed digital 3D cinema and soaked up a quarter of a billion dollars in pioneering virtual production techniques. Momentum is building for more of the same high-concept fusion of technology and storytelling from director James Cameron with the upcoming sequels.

The first of four planned sequels, budgeted together at $1 billion, is not due until December 2021, but the film’s publicity machine has teased more details of the space opera’s suitably extravagant work in progress.

Cameron’s production company Lightstorm Entertainment released concept art in January and, when the lockdown caused a hiatus in the live action production, producer Jon Landau posted an Instagram photo of the set in Wellington, New Zealand.

Avatar: Sequel prepares to resume production in New Zealand

Now, IBC365 has secured an exclusive interview with Geoff Burdick, senior vice president of Production Services & Technology for Lightstorm Entertainment.

“The technology has advanced leaps and bounds at every conceivable level since the first Avatar,” says Burdick, who worked on the original. “We are shooting higher resolution than the first and at higher frame rates and still in 3D. Three years ago, when we were looking for production technology, we knew these were our parameters but not exactly what the solution was.”

Much of the original team returns, including Weta Digital as the film’s VFX vendor, Joe Letteri as lead VFX supervisor and editors John Refoua and Stephen Rivkin. Russell Carpenter, who shot Cameron’s Titanic, is cinematographer on both Avatar 2 and 3. 

Performance capture of the lead actors - including Kate Winslet, Zoe Saldana, Sam Worthington and child actor Jack Champion - was completed as early as November 2018. This took place on a volume capture stage ringed by cameras recording data in 360 degrees. The last phase of live action photography is being completed now.

Cameron can use the performance capture data to select any camera angle of the performance he wants and place that into a scene. In parallel, Weta Digital retargets the performance data onto CG characters.  

Jon Landau: Has been sharing images from the set in New Zealand

Source: Jon Landau via Instagram

Stereo high frame rate 4K imaging  
The 2009 film’s specification was extraordinary enough: it was shot digitally in stereo 3D at full HD (to tape) at a time when most of the industry was still shooting and projecting 35mm. Cameron’s insistence on exhibiting the film in 3D required cinema owners to ditch their old film projectors and encouraged them to buy 3D-capable digital systems, which were just being introduced at the outset of the digital cinema transition.

Exhibitors began to do so in numbers once audiences defied sniffy press reviews and made Avatar a must-see phenomenon that raked in $2.79 billion worldwide.

Cameron subsequently embarked on a campaign to persuade the TV industry to make 3D a standard production tool. He partnered with Vince Pace, the cameraman and technician who helped devise the Fusion 3D camera rig for Cameron’s Titanic submarine documentary Ghosts of the Abyss and for Avatar.

The director also evangelized the use of high frame rates (HFR) to erase the motion blur wrinkles of century-old celluloid filmmaking. At trade shows he demonstrated footage shot at rates as high as 120 frames a second and several times stated his intent to shoot the sequels at higher rates.

His belief in HFR has cooled since then, in part, one suspects, because of the negative reaction to and poor box office performance of experiments from directors Peter Jackson and Ang Lee.

“I have a personal philosophy around high frame rate, which is that it is a specific solution to specific problems having to do with 3D,” Cameron recently told Collider. “When you get the strobing and the jitter of certain shots that pan or certain lateral movement across frame, it’s distracting in 3D. To me, [HFR is] just a solution for those shots. I don’t think it’s a format. I think it’s a tool to be used to solve problems in 3D projection.” 
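Cameron’s point about panning shots is easy to quantify. As a rough, illustrative back-of-envelope (the pan speed, field of view and resolution below are assumed figures for this example, not production numbers), the distance an object jumps between frames shrinks roughly in proportion to the frame rate:

```python
# Rough back-of-envelope: how far an object jumps between successive frames
# during a lateral pan. All figures below are illustrative assumptions,
# not production numbers.
PAN_DEG_PER_SEC = 24.0   # a moderate lateral pan
H_FOV_DEG = 45.0         # assumed horizontal field of view of the lens
H_RES_PX = 4096          # width of a 4K frame

for fps in (24, 48, 120):
    deg_per_frame = PAN_DEG_PER_SEC / fps
    px_per_frame = deg_per_frame / H_FOV_DEG * H_RES_PX
    print(f"{fps:>3} fps: ~{px_per_frame:.0f} px jump per frame")
```

At 24fps the per-frame jump is large enough to read as strobing, especially in stereo; doubling the rate halves it, which is the problem-solving role Cameron describes.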

Super massive data and Blackmagic 
In 2009, Cameron was able to play back material in 3D HD at 24fps on set. Avatar 2 and 3 are being acquired in 3D 4K and fed through an on-set pipeline at various resolutions and frame rates, including 3D 48fps in 2K and 4K, 3D 24fps in 2K and 4K, and 3D 24fps in HD.
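To give a sense of the bandwidth involved, here is a rough uncompressed estimate for a single stereo feed, assuming 10-bit 4:2:2 sampling; the production’s actual on-set encoding and compression are not stated in the article, so treat the figures as ballpark only.

```python
# Rough uncompressed bandwidth of one stereo feed, assuming 10-bit 4:2:2
# sampling (2 samples per pixel on average). The actual on-set encoding is
# not specified, so these are illustrative ballpark figures only.
def gbps(width, height, fps, eyes=2, bits_per_sample=10, samples_per_px=2):
    return width * height * fps * eyes * bits_per_sample * samples_per_px / 1e9

print(f"Stereo HD 24p: {gbps(1920, 1080, 24):4.1f} Gbit/s")  # roughly the 2009 spec
print(f"Stereo 4K 24p: {gbps(4096, 2160, 24):4.1f} Gbit/s")
print(f"Stereo 4K 48p: {gbps(4096, 2160, 48):4.1f} Gbit/s")
```

Even before multiple simultaneous rigs are factored in, a stereo 4K 48fps feed carries roughly eight times the data of the 2009 pipeline.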

In turn, this necessitated viewing feeds of the live action from multiple 3D camera systems simultaneously on the stages in Wellington, NZ.

“We are shooting stereoscopically from one 3D rig, often two rigs and sometimes three stereo pairs simultaneously and everything is processed instantly,” Burdick explains. “We have Avids on set, Jim has monitors on set to look at 2D and 3D, SDR and 4K HDR, and we have a couple other viewing environments.” 

Geoff Burdick: “The technology has advanced leaps and bounds at every conceivable level since the first Avatar”

Source: Officialavatar

These are a screening room adjacent to the stages and a mobile projection pod built into a small trailer, housing a Christie Digital 3D projector capable of projecting DCI-compliant dailies. The pod is moved close to the camera so that Cameron can look at it at any point.

“Massive amounts of data is being pushed around live every minute,” Burdick says. “We needed HFR and high res and everything had to be in 3D. This may not be the science experiment it was when shooting the first Avatar, but the sync for 3D at those higher frame rates and resolutions is still an issue. Alerting camera to issues is a big part of our job.”

The playback and monitoring workflow built by Burdick comprises several Blackmagic Design products. “It was very challenging for our engineering team to come up with a signal path to enable this workflow,” says Burdick. “Not many companies had incorporated this kind of support in their products. Luckily, Blackmagic Design stepped up to the plate.”

A Teranex AV format converter supports multiple stereo feeds in 4K at frame rates beyond 24fps “all day every day”, while a Smart Videohub 40x40 12G operating at 4K 48fps routes all the signals.

“The technology has advanced leaps and bounds at every conceivable level since the first Avatar” - Geoff Burdick

Burdick says, “The Teranex converts 4K 48fps to 2K depending on Jim’s creative decision. We have other equipment in the chain and the Teranex has been super critical to enable everything to handshake properly. It is also used to take the signal from the 3D rigs to ensure it is aligned properly.” 

He describes an ATEM 4 M/E Broadcast Studio 4K as invaluable for tracking sync. “We’re not just syncing rates down to one frame but querying and tracking details at the sub-frame level. We can use the ATEM to overlay multiple 4K 48fps signals to prove the on-set video is working correctly.”

Issues specific to stereo 3D include ensuring parity between the left- and right-eye lenses: whether an iris is mismatched, the zoom is offset or there are rotational axis issues.
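Burdick does not describe Lightstorm’s tooling in detail, but a minimal sketch of the kind of left/right parity check he alludes to might compare exposure and vertical alignment between the two eyes. Everything below - the function name, thresholds and method - is hypothetical and illustrative, not the production’s actual QC system.

```python
import numpy as np

def check_stereo_parity(left, right, exposure_tol=0.05, shift_tol_px=2.0):
    """Flag crude mismatches between left- and right-eye frames.

    `left` and `right` are greyscale frames as float arrays in [0, 1].
    This is an illustrative sketch only; real on-set QC works with
    calibrated rig metadata, not just raw pixel statistics.
    """
    issues = []

    # Iris / exposure mismatch: compare mean brightness of the two eyes.
    if abs(left.mean() - right.mean()) > exposure_tol:
        issues.append("possible iris/exposure mismatch")

    # Vertical offset (misaligned zoom or rotation shows up as vertical
    # disparity): estimate the shift via circular cross-correlation of
    # the de-meaned row-sum profiles.
    lp = np.fft.rfft(left.sum(axis=1) - left.sum(axis=1).mean())
    rp = np.fft.rfft(right.sum(axis=1) - right.sum(axis=1).mean())
    corr = np.fft.irfft(lp * np.conj(rp), n=left.shape[0])
    shift = int(np.argmax(corr))
    if shift > left.shape[0] // 2:
        shift -= left.shape[0]
    if abs(shift) > shift_tol_px:
        issues.append(f"vertical offset of ~{shift} px between eyes")

    return issues
```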

Jon Landau: Producer of Avatar and its sequels has been eager to get back to production

Source: Jon Landau via Instagram

“We’ve got all this amazing latitude in terms of contrast ratio and dynamic range and, combined with the camera’s optical system, we are just seeing a lot more than we did way back in 2009,” he says.

“We are not looking at some low-res preview of what we’d see in post down the road on a big screen. We are not looking at this three months prior to release to cinemas. We need to look at it now.

“In effect, we are seeing it in a theatrical environment instantly. We look at every set up, every rehearsal, every take and every feed live as it is shot on-the-fly in 2D and 3D. We are looking at back focus, actual camera focus and lighting. We can see the good with the bad at the point of acquisition and we address issues live.” 

Burdick adds: “You can - and we do - monitor with studio-grade 2D and 3D HDR monitors but it is only when you screen at scale that you can truly see [the picture detail]. There are critical camera-adjacent monitors for our DP and focus pullers, who are working on convergence and dialling in interocular, and that can be perfect, but my small team sees the same feed live and I can radio to Jim that we have ‘X’ issue with the left eye image. Nobody would have seen that without this set up.”

Blackmagic Design kit was also used as the backbone infrastructure for the production of Disney’s CG remake of The Lion King, and viewing dailies at DCI-compliant levels is not unknown.

“Our set up is arguably groundbreaking in terms of being able to do what we are doing at this high spec and in stereo,” he says. “It is all to service the director’s vision.” 

Waterlogged
The film promises a return to Pandora and an exploration of new parts of the world. This includes extensive sequences filmed underwater. Not just CG water, but actually underwater, with the actors apparently trained to hold their breath for up to four minutes, while wearing performance capture suits and no scuba breathing gear. Naturally, this had never been done before. It took a year and a half to develop a new motion capture system that would work. 

“The problem with water is not the underwater part, but the interface between the air and the water, which forms a moving mirror,” Cameron explained. “That moving mirror reflects all the dots and markers, and it creates a bunch of false markers. It’s a little bit like a fighter plane dumping a bunch of chaff to confuse the radar system of a missile. So, we’ve had to figure out how to get around that problem.  

“Basically, whenever you add water to any problem, it just gets ten times harder. So, we’ve thrown a lot of horsepower, innovation, imagination and new technology at the problem, and it’s taken us about a year and a half now to work out how we’re going to do it.” 
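Cameron does not reveal exactly how the false markers are filtered, but the geometry of the “moving mirror” suggests one naive cull: a marker reflected in the underside of the surface reconstructs as a phantom point mirrored above the waterline, so anything solved above the surface plane can be discarded. The sketch below is purely illustrative; the function, coordinates and tolerances are invented for this example and the real system is far more sophisticated.

```python
# Illustrative only: one naive way to cull "mirror" markers.
# A marker reflected in the underside of the water surface reconstructs as a
# phantom point mirrored about the surface plane, i.e. above the waterline.
SURFACE_Z = 0.0  # assume the water surface is the plane z = 0, tank below it

def cull_reflections(points_3d, surface_z=SURFACE_Z, margin=0.05):
    """Keep only reconstructed markers that lie below the water surface.

    points_3d: list of (x, y, z) tuples in metres, z positive upward.
    margin:    tolerance for ripples in the 'moving mirror'.
    """
    return [p for p in points_3d if p[2] < surface_z - margin]

markers = [(0.2, 1.1, -1.8), (0.2, 1.1, 1.8),   # real marker and its phantom
           (0.5, 0.9, -2.3)]
print(cull_reflections(markers))   # the phantom at z = +1.8 is dropped
```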

Avatar 2: Zoe Saldana, Sam Worthington, Kate Winslet and Cliff Curtis pictured at the surface of the 900,000 gallon performance capture tank

Source: Jon Landau via Instagram

Part of the solution involved covering the surface of the tank in small white balls that prevent overhead studio lights from contaminating the performance capture system below… while still allowing anyone below to surface safely through them should the need arise. 

Cameron said: “We’ve got six teenagers and one seven-year-old, and they’re all playing a scene underwater. They’re all perfectly capable of acting underwater, very calmly while holding their breath. And we’re getting really good data, beautiful character motion and great facial performance capture. We’ve basically cracked the code.” 

Principal photography 
Burdick describes Sony as “the first piece of the puzzle”. Where Avatar was shot on eight Sony HDC-F950 cameras, the sequels are using Sony’s latest CineAlta camera, the Venice.

The Sony-Lightstorm collaboration began in 1999, resulting in the development of a unique extension system for the HDC-F950 that allowed the camera body to detach from the image sensor block. A similar approach has been adopted for the Venice, with each sensor and camera body connected by a cable at distances of up to 20 feet. For the Avatar sequels, the only parts of the Venice carried on the 3D rigs are the image sensor optical blocks, reducing the on-board camera weight to about three pounds per sensor block.

By lowering the weight and improving ergonomics, Cameron and Carpenter can wield the cameras with greater flexibility and freedom. 

The production is also using a variety of additional Sony cameras including multiple Alpha mirrorless interchangeable lens cameras, PXW-Z450 and PXW-X320 camcorders, and the waterproof RX0 camera.  

“[We’re] using Sony cameras for all performance reference on stage and for timelapse photography,” explains Landau.

Outside of the live action, Burdick’s work also encompasses all technical aspects of editorial, post and VFX “to make sure everything is working the way it should.”

He will supervise every piece of editorial, the DI, and mastering for theatrical exhibition, all the home entertainment grades and creation of all files for every distribution format.  

“We’ve got our toes into all the parts from pre to post that involve how the movie is going to look and be presented.”