In the third instalment of IBC365’s in-depth look at the 2021 Accelerator programme, those involved in the RT-3D Interactive Content Creation project reveal why and how they are creating transmedia content using XR tools with genuine real-time workflows.

One of the more interesting consequences of the dramatic increases in computing power over recent years, specifically the increase in power of GPUs (Graphics Processing Units), has been the arrival of photorealistic, real-time CG.

IBC Accelerators RT-3D: the POC planned for IBC2021 in December will aim to bring Real-Time 3D (RT-3D) to life

The industry has followed a stratospheric curve as a result, one that has its roots in clunky rendering times for even primitive polygons and ends, currently, with real-time ray tracing and the ability to generate utterly convincing scenes, animations, and virtual sets on the fly.

But while the 3D output can now be generated in real-time, the tools that we use to create these animations are lagging and are still defined by mouse-based, 2D production pipelines. That is partly why the co-champions from one of 2020’s most popular Accelerator projects, CG Animation Production: New Immersive & Real-Time Workflows, return this year with a new challenge looking at creating transmedia content using cutting-edge XR tools and technology with genuine, real-time workflows.

Champions: Sky (Project Lead), Pixar, Cartoon Network/Warner Media, Unity Technologies, Unreal/Epic Games, RTE, Trinity College Dublin, Fox Sports, Facebook Reality Labs

Participants: Anchorpoint, Noitom, Pink Kong, Trick3D, Masterpiece Studios

  • More information about the Accelerator Media Innovation Programme, supported by Nvidia, is available here

The POC planned for IBC2021 in December will aim to bring RT-3D (Real-Time 3D) to life using a combination of immersive XR software and traditional production tools, with a focus on animation or photorealistic live-action output. And it has scored a bit of a coup: content feeds using the two major competing real-time game engines on the market, Unity and Unreal, both working together again in this Accelerator, will look to show how an XR-based pipeline can deliver time and budget efficiencies that scale over ‘traditional’ methods.

“I’m pretty sure we’re the only public project on the planet that has those two folks in the same room together,” comments project lead, Matthew McCartney, head of immersive technology at Sky.

“But the heart of our effort is not about corporate, it’s about the academic side of it and we have created a collaborative space where people can share advances — that is perhaps as important as any outputs that we might see.”

Coping with demand

The pandemic’s main effect on the animation sector has been an unprecedented surge in demand with the result that many of the organisations involved are busier than at any point in their history. As such, the Accelerator’s effort is a timely one.

“A couple of years ago the only way you could create a 3D asset was sitting down on a laptop with your mouse and you draw it, then you have to manipulate it — it’s quite fiddly,” says McCartney.

“Using software like Masterpiece Studio or Tvori, who are both onboard as Participant partners, you put on your VR rig and you’re suddenly sculpting – this ability to create a 3D asset in 3D for 3D output is very new.”

One of the Accelerator’s main tasks is precisely to assess the viability of that pipeline. The effort is split across four creative teams, with the plan being that each will create two pieces of content for IBC: a production diary and the content itself.

These will showcase the variety that can be achieved with this new breed of XR production tool as well as reveal potential synergies between the different workflows (which McCartney points out tend to be a common pipeline until they hit a games engine and start being purposed for a specific use).

“We have these very big pipelines that are really good at making feature films and visual effects, but they’re like cruise ships: they’re really good at going in one direction but they’re very hard to manoeuvre,” Matthew McCartney, Sky

RT-3D workflows bring a range of potential benefits to users. Speed and efficiency are two of the most obvious, while they also allow artists to literally walk around their creations and identify problems more easily in the process. They are also far more naturalistic to use. That minimises the barriers to entry for new users, an important consideration as the animation sector looks to address the surge in demand for content; the learning curve for traditional DCC (Digital Content Creation) tools remains high.

“There’s a million of those little, tiny barriers that you have to learn to get even a simple thing created in a DCC tool,” says Pixar Animation Studios’ Dylan Sisson.

“Using VR, you are able to move the camera with your head. That actually lowers the barrier for a lot of people to get into working in digital art; an illustrator can now be part of a digital pipeline when they couldn’t before.”

This is evolving all the time. New for this year is a test workflow that integrates motion capture via Noitom, another new Participant for 2021. This technology allows artists to quickly tie motion to their models for animation purposes: they move their arm, their CG model does the same, and so on.

Empirical evidence

One of the key tasks for the Accelerator is assessing the advantages this confers and putting actual numbers to the benefits an XR workflow can bring.

Grace Dinan, a Viz Artist at Irish broadcaster RTÉ, explains how she is teaming up with Participant Pink Kong Studios and fellow Champion Trinity College Dublin on a formal user study that will compare the costs, time, and resources used in XR workflows with those of traditional DCC-driven ones.

“We’re going to do some studies with novices new to animation,” she explains.

“We will give them a simple task, such as creating a snake moving across the floor, adding textures and lights, and rendering it, and measure how easy that is to learn and do. I think that’s where we’re going to get some really interesting results with their XR tools. We’re also going to do a user study with experienced professional animators, assigning them a task such as facial or hand animation and seeing how well they perform it with both methods.”

Alongside this collation of empirical evidence, the Accelerator is also going to look at the concept of fidelity. The new breed of XR tools is more capable than ever, extending in functionality with every point release, but as yet these tools are not accomplished enough to deliver the very highest quality of CG on their own.

McCartney reckons they are between 5% and 10% shy of being the complete article.

“Hair, for example: it’s very difficult to get hair into strands using XR tools,” he says, “so we’re still going to need integration with more traditional tools to get things over the line. The Accelerator will provide valuable feedback to developers, indicating where they need to focus their efforts and where those integrations can be improved.”

It is also worth noting that the benefits will not only be felt at the high end. While there is undoubtedly an investment to be made in new technologies, as Sisson points out, projects the size of a new Pixar movie follow development cycles of roughly four years. This means it is sometimes easier for smaller, more nimble studios to act as early adopters.

“We have these very big pipelines that are really good at making feature films and visual effects, but they’re like cruise ships: they’re really good at going in one direction but they’re very hard to manoeuvre,” he says.

“A smaller studio has a better chance of being able to integrate these new tools and do something different.”