With a roving brief to explore the nexus of technology and art, Roy C Anthony is revelling in his new role as global head of research at DNEG.
“I’ve always seen technology and art as interdependent,” says Roy C Anthony. “One inspires the other, which unlocks creative potential which inspires further innovation. It’s a virtuous circle.”
Anthony’s primary focus is on emerging technologies, driving innovation in real-time technology and the use of artificial intelligence to enhance the artist experience across visual effects and animation. He holds patents in stereoscopy, VR/AR display systems and calibration.
His previous job at Ventuz Technology saw him working on real-time graphics in broadcast and live events, such as in-camera AR and camera tracking on virtual sets. Prior to that he headed up the research and innovation team at projection systems vendor Christie Digital “working on tech solutions that could be reframed or presented in a different way”, he says, “like a subversive marketing group”.
His team collaborated with directors and high frame rate (HFR) pioneers Douglas Trumbull, Ang Lee and James Cameron, producing seminal research on the perception of HFR from an audience perspective.
Despite these efforts, high frame rates have yet to capture cinema-goers’ imaginations.
“High frame rates are just another tool in the toolbox along with stereo 3D and wide colour gamut for content creators,” he insists. “If cinematographers only had a 50mm lens then whole styles of shot simply wouldn’t exist. Technology is changing all the time and HFR’s time will come.”
Anthony grew up in the late 1970s with video games as much an influence as cinema. The narrative experience of the Gen Z/millennial generation is being shaped as much by gaming and 3D experiences in VR as by stories told linearly on a giant 2D screen.
“Storytelling in computer games doesn’t have a cadence locked to a specific frame rate,” he says. “There is no 24fps high motion blur narrative experience for The Last of Us Part II [a single player PS4 game set in post-apocalyptic America released last year]. It is presented as fast as it needs to be in order to immerse you in the story.”
He says storytellers are getting used to using other media, such as gaming and VR, to communicate their ideas. “Consumers have the opportunity in some cases to become the cinematographer and move around and create their own perspective and their own experience.
“It’s also a challenge for content creators. How do you merge a directed experience inside of an environment where people are able to move? Game designers are doing it all the time with subtle visual cues to try to pull your focus towards an area and then reward you when you start going in the correct direction. There are ways of integrating the user inside of an environment in a meaningful way. There are so many ways to hack our wetware [brain] and integrate us into story environments that don’t exist in traditional presentation.”
Cloud community collaboration
In terms of production, one area of focus for Anthony is the cloud, and how the ability to stream content securely from location to location offers producers a more integrated pipeline.
“It’s one where more individuals can contribute at lower cost to the final solution,” he says. “You don’t necessarily have to be on location to see dailies, you can have them streamed to where you are. The idea of having to ship a hard drive with someone sitting on a plane seat can be consigned to history.”
DNEG is a great example of a company with a number of facilities around the world and project contributors who could be based anywhere.
“We now have an opportunity to have our lighting expert in Mumbai collaborate live with colleagues who are on a virtual set in Vancouver. A virtual art department can create a previz with a customer who can participate in that remotely through their mobile device,” he enthuses.
“These kinds of technologies are enabling a lot more fusion and shrinking the size of the global studio to more of a community-oriented environment where we’re more like a little city together even while we are distributed globally.”
AI as production assistant
Another emerging technology showing genuine promise for both the creative process and the bottom line is artificial intelligence. Anthony is bullish on its prospects at the heart of production.
“There are a lot of things that AI offers that enhance the artist experience. When we think about AI as an assistive, directed technology, it provides you with an opportunity to express your creativity with less friction.
“Let’s say you want to create a matte painting of an epic backdrop and have a bunch of samples that you need to integrate into Photoshop. You can do that today with tools that allow you to take disparate inputs and output a harmonised composition. It might not be your end point but it will get you to a higher level starting point extremely rapidly.”
He points to research from the University of Toronto, near where he lives, where audio as an input is being used to drive character animation for multiple languages.
“The goal is not to replace the animator but to get it to the point where the animator can bring it to the next level,” he argues. “AI has a lot of potential to help express our creative potential by simplifying a lot of frustrating tasks and accelerating work and enabling artists to get that aha! moment as quickly as possible.”
The biggest buzzword in film and TV production just now is virtual production. DNEG itself has a new joint venture with Dimension, which provides custom-built LED stages, and is finalising plans for permanent virtual production stages in the UK and North America.
The studio also recently unveiled the Polymotion Stage with Nikon and MRMC. This mobile multi-solution studio is capable of capturing high-quality volumetric video, image and avatar creation at up to 4K with integrated motion capture.
“Virtual production is not just the on-set experience with an LED volume,” he says. “It is a whole continuum of technologies and while top of the range LED volumes are expensive, quite a few other virtual production technologies are democratised.”
Virtual production also incorporates previz, much of which is now performed in VR, or by using an iPad as a virtual camera for plotting and planning. While this concertinas the traditional order of filmmaking, Anthony doesn’t think it spells the end of post.
“I still think there’s going to be post VFX just as VFX didn’t eliminate SFX,” he says. “But planning for an LED volume shoot definitely has to be more front loaded. Even on traditional shoots the idea of doing detailed planning ahead of time was important to get the shot you want to achieve. Being able to put so much effort into pre-viz and planning helps you make a better film technically.”
Instead of costly guesswork on set to be fixed in post, virtual production necessitates a switch to a ‘fix it in prep’ approach, in which post resources are reallocated to the pre-production phase.
“Before we had massive chroma key greenscreen effects for replacing entire environments, production was done for decades by dressing a set and putting actors in it. Virtual production is really no different. It is just providing an opportunity for VFX to contribute to that traditional workflow.”