Accelerating the animation pipeline without increasing the amount of kit required is the principal aim of the Smart Remote Production For Real-Time Animation Accelerator project, which brings together leading broadcasters and vendors to leverage the latest developments in markerless motion capture and speech-driven facial animation to drive CG performances in the Unreal Engine real-time renderer.

One of the distinctive aspects of the IBC Accelerators is the way they bring together all sides of the industry and enable them to work together in a non-commercially driven environment. Broadcasters get to specify what they want rather than relying on the sometimes technologically deterministic approach of the vendors, while vendors in turn get invaluable feedback that helps guide their future development cycles.

[Image: Galaxy Dance, smart remote production Accelerator]

The result is a win-win for all sides, and one perfectly illustrated by the work being undertaken on one of the standout projects for IBC2021: Smart Remote Production For Real-Time Animation.

“We don’t usually work directly with vendors, so this is very interesting to us,” says RAI’s Roberto Iacoviello succinctly. Indeed, Italy’s RAI is just one of a strong broadcast contingent of Accelerator Champions that also includes RTÉ, VRT, YLE and the EBU, working alongside the Entertainment Technology Center at USC, Digital Domain, and Unreal Engine developer Epic Games.

With participants including RADiCAL and Respeecher, and guidance from IBC Accelerator supporter Nvidia, this Accelerator represents a powerful pool of talent and expertise, one looking to create the most effective, low-cost pipeline it can to take material from script to 3D character in a real-time, distributed workflow environment.

Low cost and accessible

One of the key elements of this Accelerator is the phrase ‘low-cost’. As well as testing the feasibility of using vocal performance and body posture to drive 3D avatars from remotely connected locations, its stated aim is to do all this using minimal equipment. More than that, according to one of the project leads, RTÉ’s Ultan Courtney, the wish is to do it with technology that exists less in production facilities and more in the pockets of the people who work there and their audiences.

“As opposed to having studios doing volumetric capture and other high-end techniques, we want to look at a workflow that engages millions of people,” he says. “Most households have a smartphone, for example, so we’re working on that basis and what is the most accessible, highest tech we can work with to tell stories and communicate.”

Champions: RTÉ, EBU, RAI, VRT, YLE, ETC/USC, Digital Domain, Unreal/Epic Games

Participants: Respeecher, RADiCAL

  • More information about the Accelerator Media Innovation Programme, supported by Nvidia, is available here

This changes the technological emphasis of performance capture markedly. Instead of a high-end solution involving marker-based motion capture suits worn in a precisely delineated and monitored capture volume, the emphasis shifts to AI processing of a standard video signal. The AI tracks the performance and converts it into an avatar, mapping limb and facial movements onto a virtual character.
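The principle is simple enough to sketch in a few lines of Python. The snippet below is a rough illustration only: it uses MediaPipe Pose, an openly available markerless estimator, as a stand-in for the project's actual tools (RADiCAL, wrnch), and the drive_avatar() retargeting hook is hypothetical.

```python
# A minimal, illustrative sketch of markerless capture: per-frame 3D body
# landmarks extracted from ordinary video with MediaPipe Pose. MediaPipe is
# a stand-in here -- the Accelerator itself works with tools such as
# RADiCAL and wrnch. The drive_avatar() retargeting hook is hypothetical.
import cv2
import mediapipe as mp

def drive_avatar(landmarks):
    """Hypothetical hook: map the 33 estimated joints onto a CG rig."""
    # e.g. convert (x, y, z) joint positions into bone rotations here
    pass

cap = cv2.VideoCapture(0)  # any standard video signal, e.g. a phone camera
with mp.solutions.pose.Pose(model_complexity=1) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_world_landmarks:
            # world landmarks give metric x/y/z plus a visibility score
            drive_avatar(results.pose_world_landmarks.landmark)
cap.release()
```

Everything heavier, from smoothing jitter to solving the full-body animation onto a production rig, sits downstream of a loop like this.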

“This is the difference between now and a few years ago; there is a lot of AI technology that can help us at a low budget,” explains Iacoviello. “There are three aspects where AI has made a lot of progress: marker-less capture, emotional response, and speech animation.”

Voice cloning

Speech is one of the areas where this Accelerator is using interesting new tools. Respeecher is software that effectively clones voices using deep learning techniques; it has already been used on The Mandalorian to create the voice of a young Luke Skywalker, and by the NFL to recreate the voice of the late football coach Vince Lombardi at this year’s Super Bowl.

“In this project the idea is to make one source speaker be able to speak in several different voices for the POC piece,” says company co-founder Alex Serdiuk. “To date we’ve offered a white glove service where we need to be involved, but this uses our new self-serve Voice Marketplace. There are many challenges for us with this as we lose a lot of control over the recording conditions and performance levels of the actors involved, but it is important to be able to democratise these tools so that they can be used in the sort of environments that ideally would require just an iPhone.”

Of course, one of the challenges any Accelerator team faces is that while the tools may be available to make the POC a success, getting them to work together in a designated workflow is not always easy. Sometimes the team is lucky and the work has already been done. For instance, once motion has been captured using RADiCAL and uploaded to the cloud, RADiCAL provides an animated FBX file that can be used in both Omniverse and Unreal to retarget the animation to CG characters. Similarly, the wrnch CaptureStream iOS-based AI capture software can already export into Nvidia’s Omniverse real-time simulation and collaboration platform via the wrnch AI Pose Estimator.
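To give a flavour of what the Unreal end of that handover can look like, the sketch below uses Unreal Editor's built-in Python scripting to batch-import an animated FBX of the kind RADiCAL returns. The file path, content folder and skeleton asset are all hypothetical, and nothing here is prescribed by the Accelerator itself.

```python
# A hedged sketch: scripted import of an animated FBX (e.g. a RADiCAL take)
# using Unreal Editor's built-in Python API. Run inside the editor's Python
# environment; all paths and asset names below are hypothetical.
import unreal

def import_capture_fbx(fbx_path, content_dir="/Game/Captures"):
    options = unreal.FbxImportUI()
    options.import_animations = True  # we want the mocap take itself
    options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
    # The skeleton the animation is retargeted onto -- hypothetical asset
    options.skeleton = unreal.load_asset("/Game/Characters/Hero_Skeleton")

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = content_dir
    task.options = options
    task.automated = True  # suppress the interactive import dialog
    task.save = True

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return task.imported_object_paths

# e.g. after downloading a processed take from RADiCAL's cloud service:
# import_capture_fbx("C:/captures/radical_take_01.fbx")
```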

However, other areas of the workflow still need work. Despite Omniverse’s progress towards becoming a universal hub for 3D tools, there are always challenges. For example, can the team’s pose estimation workflow be easily connected to the speech-to-facial-animation workflow it is also developing? And can the resulting characters then be brought across from Omniverse and integrated into Unreal environments?

[Image: Galaxy Dance smart remote production Accelerator]

“We are pushing things forward, but not everything is designed to be opened up to other uses. There are a few steps forward and back, and that’s where the Accelerator helps; having a target forces us to crash through all these problems,” says Courtney. “Some parts of the POC are going to work really well, others less so. But if everything worked perfectly, no one would learn anything. That supplies us with invaluable data to take back to the vendors and the broadcasters and work out where the missing pieces are.”

Multiple use cases

As the RAI team, which also includes Alberto Ciprian and Davide Zappia, points out, the TV market is not the only one that will be interested in realistic real-time character animation. Virtual influencers, for example, become a distinct possibility, while use in high-end pre-production, or even to drive real-time performances in mixed-reality LED capture volumes, is equally compelling. “When you go into the digital world the use cases are limited only by your imagination,” says Zappia.

Paola Sunna, Senior Project Manager at EBU Technology & Innovation, is one of the other leads on the project and says it is on track to deliver its POCs as video clips at IBC. The first will demonstrate the Omniverse-based workflow; the second will integrate everything into Unreal for further realism and flexibility in the broadcast ecosystem and beyond.

“I was really happy to see so many broadcasters involved in this project,” she concludes. “And what I really hope is that we keep going with this project even after IBC is over. We are learning so many things.”
