The cinematic trailer for Rebellion’s Evil Genius 2 has all the humour and production values of a Disney Pixar movie, but the way it was created couldn’t be further from the traditional studio workflow. Michael Burns goes behind the scenes to find out how world domination was planned and delivered remotely during lockdown.


Evil Genius 2 Trailer: Produced entirely remotely by Rebellion VFX

Evil Genius 2 is a satirical spy-fi lair builder game from Rebellion, where players take control of one of four super-villains and set their plans for world domination in motion.  

The recently released cinematic trailer for the game was the first project by the UK company’s Rebellion VFX studio and was directed by award-winning filmmaker Hasraf ‘HaZ’ Dulull using a combination of real-time techniques, motion capture and traditional CG animation pipelines – and it was all delivered during a Covid lockdown. 

The trailer opens on an island paradise, with a mega-yacht moored in a private harbour. The camera follows a suave dude strolling through his casino, but all is not as it seems, as we find out when a secret lair is revealed.  

Remote working  
“Evil Genius was the first project we worked through entirely remotely,” says He Sun, the veteran VFX artist and art director who heads up Rebellion VFX. “It’s been one of the most collaborative and creative processes that I’ve ever had.” 

It’s not the first remote rodeo for Dulull, who has used cloud-based pipelines on several projects with his crew at HazFilm, as well as having extensive experience of traditional studio workflows. “Rebellion was able to set up everything remotely super quick, including the secure server and the secure laptop they sent over,” he recalls.

“Evil Genius was the first project we worked through entirely remotely. It’s been one of the most collaborative and creative processes that I’ve ever had,” He Sun, Rebellion VFX 

As it was the first project for the studio, Rebellion VFX could build the CG pipeline for the trailer from scratch. He Sun chose to base this around Universal Scene Description (USD), an extensible, open-source 3D scene description and file format developed by Pixar. SideFX Houdini was used for modelling and CG simulations/effects, alongside Autodesk Maya, with rendering for the most part taking place in Redshift and compositing in Nuke. DaVinci Resolve from Blackmagic Design handled editing and grading at Rebellion, while Epic Games’ Unreal Engine also played a big part in the project.
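
To make that concrete, a shot assembly in USD’s Python API might look like the minimal sketch below; the file paths and prim names are invented for illustration, not taken from Rebellion’s pipeline.

```python
# A minimal, hypothetical USD shot assembly; paths and names are placeholders.
from pxr import Usd, UsdGeom

# The shot stage composes layers contributed by each department.
stage = Usd.Stage.CreateNew("shots/island_reveal.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.y)

# Published assets are referenced in, rather than copied around.
lair = stage.DefinePrim("/World/Lair", "Xform")
lair.GetReferences().AddReference("assets/lair/lair.usd")

yacht = stage.DefinePrim("/World/MegaYacht", "Xform")
yacht.GetReferences().AddReference("assets/yacht/yacht.usd")

# Animation arrives as a sublayer that overrides the base assembly.
stage.GetRootLayer().subLayerPaths.append("shots/island_reveal_anim.usda")
stage.GetRootLayer().Save()
```

The attraction for a from-scratch pipeline is that departments contribute non-destructive layers over the same scene, rather than passing monolithic files back and forth.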

Collaborative CG 
Work started on the trailer while the game itself was already in late development. The previsualisation (previs) was created by Dulull using Unreal Engine. “I would take the default CG mannequin in Unreal. I’d animate that, block scenes out, put in an edit, send it over to Sun, who would put in some references as well, and we would pitch these ideas to top-level management.”


Communication was constant, not only using Shotgun for reviews, but also tools like WhatsApp

Once previs was locked, communication was constant, not only using Shotgun for reviews, but also tools like WhatsApp. 
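
To give a flavour of that review loop, publishing a version to Shotgun via its Python API looks roughly like the sketch below; the site URL, script credentials, IDs and file paths are placeholders.

```python
# A hypothetical Shotgun review publish; credentials and fields are placeholders.
import shotgun_api3

sg = shotgun_api3.Shotgun("https://studio.shotgunstudio.com",
                          script_name="eg2_pipeline", api_key="XXXX")

# Create a Version entity for the shot, then attach the review movie.
version = sg.create("Version", {
    "project": {"type": "Project", "id": 1},
    "code": "sh010_comp_v003",
    "description": "Water sim tweak for director review",
})
sg.upload("Version", version["id"], "review/sh010_comp_v003.mov",
          field_name="sg_uploaded_movie")
```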

“HaZ is such a responsive guy,” says Sun. “The team became so used to the way that he was directing and supervising the project. He’s a real-time director, so that was a huge benefit. We gave him scenes to play around with his virtual camera [in Unreal]. He just sent us the file back and then we could see exactly what he wanted.” 

“I was using Unreal Engine to do the previs anyway, so it just made sense for the guys to send me over their amazing environments and I’d block out all the camera [moves],” Dulull explains. “It gave me time to experiment, and when I sent files back to Sun and the team, they had a camera that was signed off from the director.” 

“It definitely cut down a lot of time, and on understanding what HaZ wanted,” Sun adds. “Remote working is a difficult thing, [when] you can’t have HaZ sitting next to that artist. This workflow really nailed it.” 
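
On the receiving end, bringing the director’s signed-off Unreal camera into Maya could be as simple as the following sketch; the FBX path and namespace are invented for illustration.

```python
# Hypothetical import of the director's camera FBX into Maya.
import maya.cmds as cmds

cmds.loadPlugin("fbxmaya", quiet=True)

# Import into its own namespace so the camera can be swapped cleanly
# when a new signed-off version arrives.
cmds.file("incoming/haz_cam_sh010_v002.fbx", i=True, type="FBX",
          namespace="hazCam", ignoreVersion=True)

# Confirm the shot camera made it across.
print(cmds.ls("hazCam:*", type="camera"))
```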

Asset building  
When building a cinematic trailer for a game, it’s not enough to just take game assets and plonk them into a CG animation. “Game assets are built in modular ways and then you use code to put them together to make it work,” says Dulull. “Striving for high quality in the cinematic, we knew game assets wouldn’t work as well. But they did serve an amazing purpose in that the [character/environment] design was locked.  

“We worked very closely with the game designers as well as the game developers and the art department. The big challenge for us was how close could we get [the trailer] to look like the game characters but in higher fidelity.” 

To do this the team had to add a greater level of detail to the CG mesh and ‘up-res’ certain assets. “Not all the assets are up-ressed,” adds Sun. “It depends on how close they are to the camera. For example, we had to up-res the four genius characters to [higher] fidelity because we have more facial expressions and the animation is a lot more exaggerated.”  

“The game guys had really done a good job,” he continues. “The base mesh that they gave us was quite clean; we just had to add more edge loops and make the topologies suit our purposes. For props, when we were going for close-ups, it was all completely remodelled to a higher fidelity. The Houdini pipeline also allowed us to procedurally generate the island’s vegetation on the fly. We only had 16-17 artists working on this at the peak time, so we had to work smartly, generating some assets procedurally and only up-ressing when we needed to.

“We developed a very complex shader to create a new ‘Pixar’ look in the final rendering of some of the shots. There was a lot more work on the shader side than on [up-ressing] the models.”
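
The procedural generation Sun describes, scattering vegetation across the island on the fly, can be sketched in Houdini’s Python API as below; the node types are from a stock Houdini build and the asset paths are placeholders.

```python
# A toy version of procedural vegetation scattering in Houdini (hou);
# asset paths are placeholders and node versions may vary by build.
import hou

geo = hou.node("/obj").createNode("geo", "island_vegetation")

terrain = geo.createNode("file", "terrain")
terrain.parm("file").set("assets/island/terrain.bgeo.sc")

# Scatter instance points across the terrain surface.
scatter = geo.createNode("scatter")
scatter.setInput(0, terrain)
scatter.parm("npts").set(5000)

palm = geo.createNode("file", "palm_tree")
palm.parm("file").set("assets/island/palm.bgeo.sc")

# Copy the vegetation geometry onto the scattered points.
copy = geo.createNode("copytopoints::2.0")
copy.setInput(0, palm)      # geometry to copy
copy.setInput(1, scatter)   # target points
copy.setDisplayFlag(True)
geo.layoutChildren()
```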

The biggest challenge, however, was water: “a problem on every VFX show”, observes Dulull. This was all created in Houdini and rendered using Redshift.

“There are the two opening shots [with water simulations] and each shot is 1200 frames,” says Sun. “To simulate for that number of frames is a long process – you can’t really see the result until you render it.” 

A render farm was built for the project, but it only became available halfway through production, so work on the water VFX shots could not begin until then.
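
Once the farm did come online, rendering, unlike simulation, parallelises cleanly across frames. A toy sketch of chunking Redshift renders via hython might look like this; the scene file, ROP path and chunk size are invented, and a real farm would submit each chunk as a task rather than run them in sequence.

```python
# Hypothetical chunked rendering of the cached water sim; paths are placeholders.
import subprocess

HIP = "shots/eg2_opening_water.hip"   # assumed Houdini scene
ROP = "/out/water_redshift"           # assumed Redshift ROP
START, END, CHUNK = 1001, 2200, 50    # a 1,200-frame shot

for first in range(START, END + 1, CHUNK):
    last = min(first + CHUNK - 1, END)
    # The simulation itself must be cached sequentially beforehand;
    # only the rendering of the cache can be split up like this.
    script = (
        "import hou; "
        f"hou.hipFile.load('{HIP}'); "
        f"hou.node('{ROP}').render(frame_range=({first}, {last}, 1))"
    )
    subprocess.run(["hython", "-c", script], check=True)
```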

Animation in motion
The character work is a combination of keyframe animation and motion capture.

The one-day motion capture session at Audiomotion Studios saw actor Josh Wichard directed by Dulull under Covid-safe conditions. Unreal Engine provided a real-time preview of the scene, while the director framed his shots with the Vicon VCam, a real-time optical motion-capture solution which resembles an iPad on a handheld rig.


Actor Josh Wichard provided the motion capture performance, with an Unreal scene used to visualise it in real time on set

“I could block out my camera moves as the actor was doing the motion capture performance,” says the director. “I could experiment, so my camera moves influenced the performance and the performance could also influence the way I do my camera moves. It mentally helped me as a filmmaker to really tell the story as I wanted to tell it and also communicated to the team exactly what the camera move was going to be. It was almost like being on set.”

Also on the stage was lead animator Christian Johnson, while character animator Kathryn Chandler, production manager Annie Shaw and the motion capture technicians were behind the Covid blue line. He Sun and the Rebellion marketing and game development teams were on set virtually, piped in on a massive screen. The actor and animators were also able to pitch in with ideas that could be recorded as motion capture data in situ, rather than occurring to someone only after the event.

“I could block out my camera moves as the actor was doing the motion capture performance. I could experiment, so my camera moves influenced the performance and the performance could also influence the way I do my camera moves,” Hasraf ‘HaZ’ Dulull, director

“We had an Unreal scene running, so that when the actor was performing we could visualise it in real time on the set and address their feedback directly,” says Sun. “There was also a witness camera on stage to capture the [spatial] reference for the animation department and to see from different angles how the performance was happening.”

“We were seeing everything pretty much in real time and making decisions on the day,” says Dulull. “Having a motion capture session with [the team] there, able to give tips and notes and take notes themselves on what’s happening on the day, saved so much time. It removed any second guessing.”

The motion capture data was cleaned up and imported into Maya, where facial animation and movements were further stylised for dramatic effect. “We then exported that into our Unreal pipeline,” says Sun. “It was a very fast turnaround, taking around one day per clip.”
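
In outline, that Maya hand-off might resemble the sketch below: bake the cleaned-up mocap onto the skeleton, then export an FBX for the Unreal pipeline. The rig name, frame range and output path are hypothetical.

```python
# Hypothetical bake-and-export of a mocap clip from Maya to FBX.
import maya.cmds as cmds

cmds.loadPlugin("fbxmaya", quiet=True)

ROOT = "EvilGenius_rig:root"   # assumed skeleton root joint
START, END = 1001, 1240        # assumed clip range

# Bake so the exported file carries plain keyframes on every joint.
joints = [ROOT] + (cmds.listRelatives(ROOT, allDescendents=True,
                                      type="joint") or [])
cmds.bakeResults(joints, time=(START, END), simulation=True)

cmds.select(joints, replace=True)
cmds.file("exports/eg2_clip_010.fbx", force=True,
          exportSelected=True, type="FBX export")
```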

“When we recorded voices later on, we could adjust the lip sync and so on, but a lot of [the animation] came straight from the motion capture session because that performer was so good,” says Dulull.


Casino: Due to timing constraints, the entire casino scene was packaged in Unreal Engine

Big gamble
The team did so much work getting the island scenes and evil genius character sequences to look so good that time was limited for the casino walkthrough, the large middle section of the trailer.

“We knew this had to be just as good as the other sections,” says Dulull. “It’s contained in one interior environment, so Sun and I thought, let’s try doing it all in Unreal.”

The Rebellion artists packaged the whole scene up in Unreal Engine and sent it over to Dulull. “I downloaded it and started blocking the cameras,” the director says. “As I was doing the work in the actual scene, I could just send the camera moves as an FBX back to them. It all locked into place and then we started rendering some test shots.”

The casino scene was then rendered using Unreal Engine. “This was version 4.24. It wasn’t as high fidelity [as Redshift] and the anti-aliasing wasn’t as good,” Dulull explains. “Rebellion had a pipeline that was able to spit out several passes, so we did a lot in Unreal, but we still needed to do another level of compositing in Nuke to make sure everything was consistent with the rest of the show.”
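
That extra level of compositing might look, in miniature, like the Nuke script below; the pass names, frame ranges and paths are placeholders rather than Rebellion’s actual conventions.

```python
# A bare-bones Nuke reassembly of Unreal render passes (placeholder paths).
import nuke

beauty = nuke.nodes.Read(file="renders/casino/beauty.####.exr",
                         first=1001, last=2200)
flares = nuke.nodes.Read(file="renders/casino/flares.####.exr",
                         first=1001, last=2200)

# Grade the Unreal beauty toward the Redshift shots, then add the
# effects pass on top and write out the comp.
grade = nuke.nodes.Grade(inputs=[beauty])
merge = nuke.nodes.Merge2(inputs=[grade, flares], operation="plus")
nuke.nodes.Write(inputs=[merge], file="comp/casino_comp.####.exr")
```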

Timing matters
“Our editorial pipeline at Rebellion is centred around Resolve, so our grading and editorial is all done using Resolve Studio,” explains Sun.

Dulull edited the trailer on a laptop in Resolve 16. “Rebellion was rendering out QuickTimes, so I’d go into Shotgun, download the latest one and put it in the edit,” he says. “This parallel development really helped.

“I was also colour timing and colour grading in Resolve. We had already [set] a look, so it wasn’t like we were taking rough first-pass coloured renders and showing them to executives at Rebellion. We were giving it a style and look, adding lens flares. I was doing all that in the edit. To be able to make those constant timing changes in Resolve really helped.

“There was no proxy rendering. I would send my Resolve file over to Sun, along with the QuickTime reference edit, and all they had to do was replace the MP4 with the high-resolution EXR files. When we had a last-minute tweak or where we had to do cut-down versions of the trailer, I would recut and export out a Resolve file. Rebellion would import that in and everything would just link. That’s been a godsend for us, especially when you’ve got so much back and forth with a marketing department and executives.”
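
Conceptually, that conform step maps onto Resolve’s scripting API something like this; ReplaceClip exists in recent Resolve Studio releases, and the offline/online naming convention below is invented.

```python
# Hypothetical relink of offline MP4s to high-res EXR sequences in Resolve.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
pool = project.GetMediaPool()

# Walk the media pool and swap each offline reference for its renders.
for clip in pool.GetRootFolder().GetClipList():
    name = clip.GetName()
    if name.endswith(".mp4"):
        shot = name.replace(".mp4", "")
        # Assumed path pattern for the matching EXR sequence.
        clip.ReplaceClip(f"renders/{shot}/beauty.%04d.exr")
```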

The retiming facility in Resolve gets a lot of praise from the director. “We wanted a lot of speed ramps, where just as the camera comes in, it whips around,” he says. “Rebellion wanted to reflect the game experience in the cinematic trailer. We looked at Guy Ritchie’s opening sequences for Lock, Stock… or Snatch – we wanted that fun element. It takes just a right-click to create curves in Resolve and then find the right timing. I could send that back to Sun and his team and they’d know [instantly] what to do with the animation.”

New world
Creating the cinematic trailer was a six-month project in the middle of the pandemic lockdown. “We had a time constraint and there were budgetary constraints as well, but Rebellion has got a good reputation for doing quality stuff, so we knew we had to be on par with the things that are coming out,” says Dulull. “If we were too comfortable, some of those creative and technical decisions would not have happened.”

As for working in real time, he thinks it can be both a blessing and a curse. “Real time works for creatives like myself and Sun and the team as we’re seeing instant feedback, but the curse is that executives or decision makers are also able to rethink the script.”