Vladimir Putin has been brought to life as a chat show host for a new BBC comedy. James Pearce goes behind the scenes with Framestore to see how the VFX firm used motion capture to create a virtual Putin in real-time.

Tonight with Vladimir Putin: June Sarpong guest appears

Source: BBC

Sunday saw the BBC air the pilot of a new comedic chat show covering politics and current affairs, but the standout part of the show was the host: Vladimir Putin.

Not Vladimir Putin, Russian President, but a live VFX version of the former KGB agent, created using cutting-edge motion capture technology.

The two pilot episodes of Tonight with Vladimir Putin were filmed in front of a live audience, with guests including former Labour spin doctor Alastair Campbell and Guilty Feminist podcast host Deborah Frances-White.

The guests – and audience – could see a live version of the host, who was being filmed using motion capture technology behind a screen – something that, according to Framestore, one of the companies behind the production, is a first.

“Making a broadcast television show of this kind, in this way, had never been done before,” explains Simon Whalley, executive producer for creative development at Framestore. “We filmed it as you would film a broadcast TV chat show. All the CG was being composited in real-time - all the footage we shot through the cameras was what we used, so there was no post-production.”

Concept
The concept arose, like many ideas, from a session in the pub, where Poke co-founder Jasper Gibson and animator Joel Veitch were bemoaning the lack of UK political satire shows such as Spitting Image, which aired on ITV in the 1980s and 1990s. They approached Whalley with the idea of creating an animated character who would interview guests in real-time.

That was in 2014. With Framestore, they used emerging real-time technology to create a test 2D Putin, who interviewed guests in real-time using motion capture. This was then broadcast on YouTube in order to demonstrate the nascent technology.

“It was very shoestring,” adds Whalley, “but it demonstrated that the concept of a real person being interviewed by a cartoon in real-time actually worked.”

After running further tests in order to scale up the technology and create a 3D model, the team approached Phil McIntyre Television to produce a show, with BBC Two commissioning a pilot in January.

Puppeteering: Performer as Vladimir Putin

Source: Framestore

The team were given until the end of March this year to develop the pilots, which included building a brand-new model of the animated Putin, and all the relevant assets, ahead of filming.

Richard Graham, CaptureLab supervisor at Framestore, explains: “The assets were built in five weeks and me and my team prepped it in two and a half weeks.”

That brought mostly logistical challenges, he adds, such as getting access to the performers early enough to do practice sessions.

“Essentially, they are doing puppeteering. They need to learn the character and what their face is doing to make it work, but we also need to take footage captured of their faces and programme it into our tracking models, so there is a back and forth between us and the performers in tuning the character’s output. Trying to do that in two or three weeks was challenging.”

Though the pilots didn’t air until Sunday, Whalley and his team still opted to film as if it was being broadcast live, including a live studio audience.

“We wanted it to be very immediate, so that’s why we used real-time techniques,” he explains.

“With social media and rolling news, everything is happening immediately, and we wanted to be able to fit into that. If we had these tools, we could shoot it quickly. It’s like how the Americans shoot chat shows with immediacy. We wanted it to be as topical and as relevant as we can possibly make it. Usually animation has a very long lead time.”

Unreal
The mocap was shot using a Vicon motion capture system, with Graham’s team utilising an original T-series system to capture Natt Tapley’s performance as Putin. Tapley wore a mocap suit fitted with 75 sensors, linked to several computers that instantly processed his movements.

To cut latency, Graham and his team hardwired the kit to its processing computers. Putin was then animated using Epic’s Unreal Engine and fed back to a screen which showed the animated performance to guests and the audience. This was done with less than three frames of delay.

Graham said: “We’re able to do it at 60 frames per second but broadcasters don’t want that – they wanted 25 fps. So, we had to slow everything down, but because we have to do everything in whole frames, there is a 2-3 frame delay, which is only really visible on the lip sync for the character. But there is a lot we can do to improve this and that is something we’re working on.”
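The numbers Graham quotes are easy to sanity-check: a delay measured in whole frames costs more wall-clock time at a lower frame rate. A back-of-envelope sketch (illustrative only, not Framestore's tooling):

```python
# Convert a whole-frame delay into milliseconds of end-to-end latency.
def frame_delay_ms(frames: int, fps: float) -> float:
    """Latency in milliseconds for a delay of `frames` whole frames at `fps`."""
    return frames * 1000.0 / fps

# The capture pipeline can run at 60 fps, but broadcast output is 25 fps,
# so the same 2-3 frame delay costs far more time at the lower rate.
for fps in (60.0, 25.0):
    for frames in (2, 3):
        print(f"{frames} frames at {fps:g} fps -> {frame_delay_ms(frames, fps):.0f} ms")
```

At 25 fps, a 2-3 frame delay works out to 80-120 ms, which is why it shows up mainly on lip sync.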

Mocap: Shot using Vicon motion capture system

Source: Framestore

The mocap shots also saw Graham use a package called Dynamixyz for facial capture.

A secondary set of Vicon Vero cameras was suspended above the audience. These were used to track the main cameras as they moved around the rostrum, ensuring that the compositing still lined up when the shots were produced.

Overall, four different shots were taken: one of the main set, including Putin and the guest; one of Putin’s desk; a wider shot of the stage; and finally, one just of Putin, which was entirely animated, as the guest would never appear in that shot.
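The camera-tracking step described above boils down to mirroring each tracked physical camera into the render engine every frame, so the CG host stays registered with the live plate. A minimal sketch, using hypothetical data structures rather than Framestore's actual pipeline:

```python
# Sketch: drive a virtual camera from a tracked physical camera pose.
# All types and field names here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z) in studio space, metres
    rotation: tuple   # (pitch, yaw, roll) in degrees

@dataclass
class VirtualCamera:
    pose: Pose
    focal_length_mm: float

def sync_virtual_camera(tracked: Pose, zoom_mm: float) -> VirtualCamera:
    """Mirror the tracked physical camera into the renderer for this frame."""
    return VirtualCamera(pose=tracked, focal_length_mm=zoom_mm)

# Each frame: read the overhead tracking data, mirror it into the engine,
# render the CG, then composite it over the live camera feed.
frame_pose = Pose(position=(1.2, 1.5, -3.0), rotation=(0.0, 12.5, 0.0))
cam = sync_virtual_camera(frame_pose, zoom_mm=35.0)
```

Because the virtual and physical cameras share one pose per frame, the composite holds up even as the studio cameras move around the rostrum.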

Whalley said: “This has never been done this way before. The buzz around doing real-time digital characters even after the press release has been notable.”

The buzz
Not all the buzz has been positive. Since the pilots aired, reception on social media has been mixed, with the show coming in for some criticism for a surprise appearance by an animated Meghan Markle.

Animated: Vladimir Putin 

Source: BBC

Prior to release, the show also received a response from the Kremlin itself. Kremlin spokesman Dmitry Peskov said Putin (the real one) had “no plans” to watch it.

“Many books have been written about Putin, and there have been puppets and caricatures too. The president has not read books about himself, and has not watched the cartoons either,” he said. “Putin doesn’t want to imitate the caricatures; let the caricatures imitate him.”

Whalley admits surprise at the Kremlin’s decision to comment on the show but explains the reasons behind picking the Russian President as host of the show.

“Putin brings longevity – while other politicians leave the scene, Putin always sticks around. A cartoon Putin also gives us the ability to express a different view, one from the other side. Animated Putin can voice opinions and viewpoints that no real host could have or voice. As a global political superstar and as the ultimate outsider, he can act as a window into ourselves. Although Putin himself is ripe for satire, the focus is actually on our guests and ourselves, and how Putin can hold a mirror up to 21st century Britain. That gives it a unique voice.”

Wider applications
The production opens several possible options for Framestore and for future use cases. One advantage is that producing live animated content means you can have animated characters acting with the immediacy normally reserved for live actors.

VR technology: ”There are many more potential use cases”

Source: BBC

Not so with Tonight…, which can be turned around very quickly due to the lack of post-production. Pre-production, however, takes some time, so setting it up as a portable solution might not be viable, says Graham. Rather, it suits either a permanent studio set-up or quick productions that can be planned in advance.

Adds Whalley: “One of the interesting things about it is the ability to produce quickly. You have a bigger up-front development, more preparation, but once that is set up, the technology is there. It also results in the need to develop a new puppeteering style. We want to see how you can push that and make it more responsive. This could be a whole new puppeteering art form that is being developed.”

This could see the technology applied in other environments, such as kids’ TV shows, or in theme parks, where augmented and virtual reality technologies are already having an impact.

Whalley said: “You can imagine that beyond TV and entertainment, there are so many other potential use cases, including theme parks or reactive characters. We’re paving the way to that by developing our workflow, developing our tools, and bringing bits of the company together, which is very rewarding.”