Framestore explains how it used machine learning in an anti-knife crime campaign to bring back Kiyan Prince, the schoolboy football prodigy killed 15 years ago.

Dozens of London teenagers have been killed in knife attacks since promising young footballer Kiyan Prince was fatally stabbed in 2006 while trying to protect a child from bullying.

Kiyan was 15 then, but he has been brought back to virtual life at the age of 30 in a powerful, history-making anti-knife crime campaign driven by his campaigning father Mark and deeply committed creative teams from Engine Creative and Framestore.

Long Live the Prince represents a positive use of deepfake technology, one that could pave the way for many more applications that do not arouse suspicion. It sees Kiyan sign for his former club, Queens Park Rangers, and appear as a playable character in EA Sports’ FIFA 21.


Framestore: Leaned on machine learning 

Describing how such a sensitive project came to fruition after two years in the making, Framestore project lead Karl Woolley began with the ambition that preceded any script.

“Katie Farmer, Engine head of project management, talked to me about the campaign and Kiyan, and the Kiyan Prince Foundation that Mark fronts. They had always wanted to put Kiyan and his skills into a big game to raise awareness among youth audiences, but at that time FIFA was not ready to do this,” he says.

However, the project sprang to life in January when FIFA said yes to inserting Kiyan as a famed striker, and Engine creative directors Richard Nott and David Dearlove asked Framestore to create Kiyan at 30, plus billboard images.

“David and Richard wanted to show me a little film. They said it is all very well putting Kiyan in the game, but we needed to be able to speak to the audience and educate them with the message. What better way than having a film with Kiyan in it?

“I said ‘whoa, whoa. We didn’t plan to have a walking, talking Kiyan’. But we had to find a solution to do exactly that; deepfake was not a solution at first because you need thousands of source images to train the machine,” he adds. “We did not have thousands of images of Kiyan, but we always had plans A, B and C in our back pocket.”

Run that backwards

One of the crucial enablers was Professor Hassan Ugail of Bradford University, whose age progression software is built around machine learning.

“Part of his research is machine learning. Within a certain level of accuracy, he is able to predict what a missing person might look like now, and he is famous for [creating pictures of] Ben Needham [who disappeared at the age of 21 months],” says Woolley. “Hassan has a unique and sophisticated search engine, and we asked him what Kiyan would look like.”

Ugail took images of Mark, Kiyan and Kiyan’s siblings, and produced an image of Kiyan as a 30-year-old. He was then able to run that image backwards, comparing the regressed face with real photographs to establish a percentage match.

“Anything 70% and above is effectively a 100% match because it would be impossible for it to be anyone else. That image was our scientific device,” explains Woolley.
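That idea of validating an age-progressed face with a percentage score can be pictured with open-source tooling. The sketch below is an illustrative assumption, not Ugail’s proprietary system: it uses the face_recognition library to embed faces as 128-dimensional vectors and turns the embedding distance into a rough likeness percentage. The file names are hypothetical.

```python
# Illustrative sketch only: scoring a predicted face against reference photos
# as a percentage match. This is NOT Professor Ugail's software; it uses the
# open-source face_recognition library (dlib embeddings) to show the general
# mechanic of validating an age-progressed image numerically.
import face_recognition
import numpy as np

def percentage_match(reference_paths, candidate_path):
    """Embed each face as a 128-d vector and convert the mean embedding
    distance into a rough 0-100% likeness score."""
    ref_encodings = []
    for path in reference_paths:
        image = face_recognition.load_image_file(path)
        encodings = face_recognition.face_encodings(image)
        if encodings:  # skip photos where no face was detected
            ref_encodings.append(encodings[0])

    candidate = face_recognition.load_image_file(candidate_path)
    candidate_encodings = face_recognition.face_encodings(candidate)
    if not ref_encodings or not candidate_encodings:
        raise ValueError("no face found in one of the supplied images")

    # face_distance returns values in roughly [0, 1]; smaller means closer.
    distances = face_recognition.face_distance(ref_encodings,
                                               candidate_encodings[0])
    return float((1.0 - np.mean(distances)) * 100.0)

# Hypothetical file names, for illustration only.
score = percentage_match(["kiyan_15.jpg", "mark.jpg"], "kiyan_30_predicted.jpg")
print(f"Likeness: {score:.1f}%")
```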

All the image preparation was done in Photoshop. “The base sculpt was brought into Maya and given a very basic groom of the hair, and that bust was positioned to match the position of the stand-in’s head in the shots and to replicate the lighting setup. That was rendered out, composited into the photographic plate in Photoshop, and Chris Scalf, the likeness artist, did the painting on top,” he adds.

Zero budget

The end-stage toolset was Maya, Houdini and Blender, plus the new CopyCat node in Nuke, which was used for logo removal and some facial work prior to rendering.
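CopyCat works by training a small neural network on a handful of artist-finished “before/after” frame pairs, then inferring the same fix across the rest of the shot. The sketch below mimics that paired-training idea in PyTorch; it is an assumption-level illustration of the technique, not Foundry’s implementation.

```python
# Paired "before/after" training in the style of Nuke's CopyCat node, sketched
# in PyTorch. CopyCat learns an image-to-image fix (e.g. logo removal) from a
# few artist-finished frames and then applies it to the rest of the shot.
# This is an assumption-level illustration, not Foundry's implementation.
import torch
import torch.nn as nn

class TinyImageToImage(nn.Module):
    """A small fully convolutional network mapping an input frame to a fixed frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),  # keep output in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

def train(model, before, after, steps=500, lr=1e-3):
    """before/after: tensors of shape (N, 3, H, W) with values in [0, 1]."""
    optimiser = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()  # L1 preserves fine detail better than MSE for images
    for _ in range(steps):
        optimiser.zero_grad()
        loss = loss_fn(model(before), after)
        loss.backward()
        optimiser.step()
    return model

# Toy stand-ins for a handful of artist-fixed frame pairs.
before = torch.rand(4, 3, 64, 64)
after = before.clone()
after[:, :, 20:30, 20:30] = 0.5  # pretend the fix paints over a logo region
model = train(TinyImageToImage(), before, after)
cleaned = model(torch.rand(1, 3, 64, 64))  # infer the learned fix on a new frame
```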

“We have created body suits for George Clooney in Gravity and many digital humans, including all of the Marvel characters, but it is really hard to recreate a digital human. The last person we brought back to life was Audrey Hepburn, for a Galaxy commercial some years ago. That took six months for a handful of shots, and it was both extremely challenging technically and expensive,” says Woolley.


Karl Woolley: Framestore lead

“The challenge was to do the Kiyan job within a minimal amount of time and for zero budget. We did not wish to spend any money, because we wanted every single pound that would have been spent to go into the campaign. We had to lean on machine learning,” he adds. “Advertising is advertising and film is entertainment. Neither means anything like the way this does, but that is not to diminish how important all the other work is.”

The digi-double route was considered. “You can put a digital human anywhere you could not put a real one, and you can create likenesses which are indistinguishable from reality, but it takes months to do that,” says Woolley. “We could not go down that route for financial and timing reasons: we only had a couple of shots to do at the end of the film so we felt we could pull it off the way we did with deepfake technology.”

“Working with super brands we can help to educate and showcase that deepfake can be used for good. There are further applications to come,” Karl Woolley, Framestore

People are wary of deepfakes and of the technology being used for gimmicks. Woolley, though, was already working on another project where it was a potential route to “other good things”.

“The problem is that the first time you use it, it can be quite expensive and it is not easy to adapt. Working with super brands we can help to educate and showcase that deepfake can be used for good. There are further applications to come,” he says.

Creating synthetic data

So, would ‘synthetic media’ be a better name for deepfake?

“When we did The Alternative Queen’s Speech, we trained the deepfake on many hundreds of images, but for Kiyan we did not have that. There is, though, a case where if you want a machine to perform a function for you, you have to train and educate it, just like you do with ML,” says Woolley.

“So, when you do not have enough of the right kind of data you create synthetic data to train the machine. Then you validate that machine’s output to see if it gives you the likeness or representation you expect. That could be cars, humans, anything,” he adds. “There is this notion in the industry that the CG companies are creating synthetic data to train machines.”
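Woolley’s point about manufacturing training data can be pictured as a simple augmentation pass: a few dozen genuine photographs become thousands of varied samples, which are fed to the model before its output is validated against the expected likeness. The sketch below assumes torchvision for the augmentations; the directory names and recipe are hypothetical, and the real Kiyan pipeline has not been published.

```python
# Minimal sketch of bulking out a small image set with synthetic variants,
# assuming torchvision. Each pass over the originals applies randomised
# photometric and geometric jitter, yielding a new "synthetic" sample.
from pathlib import Path
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    transforms.RandomRotation(degrees=8),
    transforms.RandomResizedCrop(size=256, scale=(0.85, 1.0)),
])

def synthesise(source_dir: str, out_dir: str, copies_per_image: int = 50) -> None:
    """Write copies_per_image augmented variants of every JPEG in source_dir."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for path in Path(source_dir).glob("*.jpg"):
        original = Image.open(path).convert("RGB")
        for i in range(copies_per_image):
            augment(original).save(out / f"{path.stem}_synth_{i:03d}.jpg")

# Hypothetical directories: a few dozen real photos become thousands of samples.
synthesise("kiyan_reference", "kiyan_synthetic", copies_per_image=50)
```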


Woolley understands the nervousness around the moral implications of using deepfake technology, but falls back on the industry mantra: the right tool for the job.

“In this case, given the time constraints, deepfake technology was the best tool for the job – the only one we could have used, actually. It would have been disingenuous to have just used a stand-in or a body double. If we screw up on a shot in a Hollywood film, we have 600 people on that show to fix it. If this hadn’t worked, it would have let Mark and his family down after everything they have been through, and that was not an option,” says Woolley.

“In this case, given the time constraints, deepfake technology was the best tool for the job – the only one we could have used, actually,” Karl Woolley, Framestore

“My deadline was 11:00 on the Friday before the campaign went live, and I did not review the shots until 10:00. I uploaded to the FTP server at 10:59. We didn’t know it was going to work until the very minute it went live. We did this project because we wanted to help the family,” he adds.


Kiyan Prince: QPR player

Other credits go to Engine project director Seb Roskell, producer Debbie Impett, editor Sam Hopkins, and motion graphics designer Jamie Thodeson. Framestore also fielded global head of creatures Marcus Schmidt, senior designer David Lockhead, and chief creative officer Mike McGee. Johannes Saam, Framestore’s credited creative technologist, also brought in Matt Hermans of the Electric Lens Company in Sydney.
