Creative agency The Famous Group caused an AR sensation when it sent a giant-sized raven swooping around a stadium during an American football game. Michael Burns looks at how it worked with The Future Group and its cross-reality content production tool Pixotope to bring the mascot to life.
Baltimore Ravens fans at the team’s M&T Bank Stadium witnessed an extraordinary augmented reality feature when a giant-sized raven was seen to swoop and land in the stadium and respond to live in-game events.
While guaranteed to ruffle the feathers of opposing fans, the giant CG corvid was also a great showcase of creative agency The Famous Group’s work in augmented and mixed reality.
“This was an exclusive Thursday night game for the Ravens and one that had a Division Championship on the line,” says Jon Slusser, partner and owner, The Famous Group. “The Ravens wanted to create a unique fan experience to go along with the gravity of the game and tie it to their mascot.”
“Fans were able to see the Raven on the screens at M&T Bank Stadium as this was exclusively an in-stadium experience,” adds Slusser. “Of course, it also ended up getting a few million views on social media, which we are very proud of.”
The mixed reality experience was produced using Pixotope from Oslo-based developer The Future Group. Pixotope creates a virtual environment, which contains lights, a camera and objects to be rendered. In this case, the virtual scene in Pixotope contained the animated raven, a laser-scanned model of the stadium, as well as a camera and lights to match those in the real-world stadium. Additional video hardware was provided by Quince Imaging.
‘The Ravens wanted to create a unique fan experience to go along with the gravity of the game and tie it to their mascot’ - Jon Slusser, partner and owner, The Famous Group
“We’ve been working in the AR/MR space for some time using our internally developed platform, Vixi, but this was one of the first of many larger narrative pieces you’ll see from The Famous Group,” says Slusser. “Vixi is great for driving our simpler AR executions, like jumbotron face filters, but to step up our game in the higher quality MR world we needed a more powerful rendering tool – that’s where Pixotope came in.”
Three days before the event, The Future Group’s team arrived at the stadium to integrate the various elements of content and prepare it for rehearsals, adaptations and eventually for delivery of the final live production. Apart from providing the core technology to execute the mixed reality visuals, The Future Group also provided expertise to act as a liaison between all the various departments which included camera, live programme vision mixing and in-stadium display systems.
“First and foremost, The Future Group took time to understand the project and help our strategy as we communicated back to the client,” says Slusser. “Their expertise in the space was a big value-add.”
A major task was to prepare the raven digital model for real-time production.
“There was a lot of consideration around performance, which Pixotope helped us dive into,” adds Slusser. “We worked directly with the team to be sure to capture the true spirit of the Ravens mascot. We also enlisted a trusted animation partner, Flux Animation Studios out of New Zealand, to help create the character.”
“One key difference between live real-time computer animation and the more traditional use within a post-production environment, is that there are many more variables to prepare for with live events,” says Øystein Larsen, The Future Group’s chief creative officer. “Improvements and adaptations to the augmented elements occur right up until the last second and, therefore, the duration of shots cannot be precisely known ahead of time.”
Normally, CG animated objects are keyframed to run their designated animation and then stop. This does not work for a live scenario. For example, a brief for the raven to fly into the stadium and land on the goal post, squawk and then take off again would require the timing of those segments to be pre-set into the animation. As this was live, the show director needed to call and dismiss the raven on cue and the duration of shots depended upon unfolding events during the game.
To allow for this scenario, The Future Group relied on Pixotope’s full access to the underlying Unreal Engine. This enabled the use of game logic to move between different states and animations of the CG raven model on demand. The raven could be instructed to loop a certain section of the animation, for example while it waited on the goal posts, until the director’s cue. At that point, the game engine could be triggered via Pixotope to blend into a different animation, such as making the raven take off and fly away.
In this way the team was able to create ‘live’ animation to move each part of the raven model from the position it was last in, to the position set out in the next animation segment. This process led to a seamless, jump-free connection between the various animations.
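The cue-driven behaviour described above can be sketched as a small animation state machine. This is a minimal, hypothetical sketch: the state names and the controller interface are illustrative assumptions, not Pixotope’s or Unreal Engine’s actual API.

```python
# Hypothetical sketch of a cue-driven animation state machine.
# State names ("fly_in", "perch_idle", "take_off") are illustrative only.

class AnimState:
    def __init__(self, name, loop):
        self.name = name
        self.loop = loop  # looping states play until a cue arrives


class RavenController:
    """Loops the current animation until the director's cue, then
    blends into the next state from whatever pose was last evaluated,
    avoiding a visible jump between animation segments."""

    def __init__(self):
        self.states = {
            "fly_in": AnimState("fly_in", loop=False),
            "perch_idle": AnimState("perch_idle", loop=True),
            "take_off": AnimState("take_off", loop=False),
        }
        self.current = self.states["perch_idle"]
        self.pending = None

    def cue(self, name):
        # Called on the show director's cue; the transition happens on
        # the next frame rather than waiting for the loop to finish.
        self.pending = self.states[name]

    def tick(self):
        # Evaluated once per rendered frame.
        if self.pending is not None:
            self.current, self.pending = self.pending, None
            return f"blend -> {self.current.name}"
        if self.current.loop:
            return f"loop {self.current.name}"
        return f"play {self.current.name}"
```

In a real engine the “blend” step would interpolate each joint from its last pose toward the incoming animation over a short window, which is what produces the seamless hand-off described above.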
A light on the night
An important task for The Future Group’s team was to accurately sample the stadium lights, so that when the digital raven was incorporated into Pixotope’s virtual environment, it would be lit exactly as it would have been had it really existed in the stadium itself.
‘We received over 11 million views of the Raven in flight on various social media platforms’ - Jon Slusser, partner and owner, The Famous Group
“We initially used 360° HDR photography to measure both the location and the relative brightness of each of the light sources,” explains Frank Daniel Vedvik, senior product specialist at The Future Group. “Accurate replication of the lighting is essential to ensure the augmented elements look real when added to the live background. In fact, we had almost 30 real-time light sources within Pixotope’s virtual environment, including bounced green light to mimic the effect of reflected light from the green turf of the sports field.”
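The light-sampling step Vedvik describes can be illustrated with a simple sketch: bright spots picked off a 360° HDR panorama become virtual light sources, plus a low-intensity green bounce for the turf. The luminance weights are standard Rec. 709 values; everything else (the sample format, the threshold, the bounce light’s parameters) is an assumption for illustration.

```python
# Illustrative sketch of deriving virtual lights from a 360-degree HDR
# capture. Sample format and threshold are hypothetical.

def luminance(r, g, b):
    # Rec. 709 luma weights for relative brightness
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def lights_from_hdr(samples, threshold=1.0):
    """samples: list of (azimuth_deg, elevation_deg, (r, g, b)) picked
    from bright spots on the HDR panorama. Returns a list of virtual
    light descriptions for the real-time scene."""
    lights = []
    for az, el, rgb in samples:
        y = luminance(*rgb)
        if y >= threshold:  # keep only sources bright enough to matter
            lights.append({"azimuth": az, "elevation": el, "intensity": y})
    # Add a dim green bounce from below to mimic light reflected
    # off the turf, as described in the article.
    lights.append({"azimuth": 0.0, "elevation": -90.0,
                   "intensity": 0.2, "tint": "green"})
    return lights
```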
“In addition to Flux for the animation, we utilised TruePoint Laser Scanning for the stadium laser scan, 740 Sound for all of the Ravens’ sound design, Quince Imaging for engineering and hardware support, and stYpe for camera tracking,” says Slusser.
The laser-scanned model of the stadium, which was also imported into Pixotope, was used as a shadow catcher. This is a CG model which accurately replicates the shape and form of a real-world scene, but one which is not rendered out in vision directly. Only the shadows that fall upon the model of the stadium are rendered, which are then composited over the live action shots. This technique resulted in the augmented raven accurately and realistically casting shadows over the live shots of the stadium.
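The shadow-catcher compositing idea reduces to a simple operation: the stadium geometry itself is never drawn, and only the shadow coverage it receives is used to darken the live frame. The sketch below is a hedged illustration on greyscale pixel grids; the mask values and shadow strength are assumed parameters, not values from the production.

```python
# Minimal sketch of shadow-catcher compositing: only the shadows that
# fall on the (invisible) stadium model darken the live camera feed.

def composite_shadow_catcher(background, shadow_mask, shadow_strength=0.6):
    """background: 2-D grid of grey pixel values 0..1 (the live feed).
    shadow_mask: same shape, 1.0 where the raven's shadow falls on the
    laser-scanned stadium geometry, 0.0 elsewhere.
    shadow_strength: how much a full shadow darkens the frame."""
    out = []
    for brow, srow in zip(background, shadow_mask):
        # Darken each pixel in proportion to the shadow coverage;
        # unshadowed pixels pass through untouched.
        out.append([b * (1.0 - shadow_strength * s)
                    for b, s in zip(brow, srow)])
    return out
```

The key property is that unshadowed pixels are returned exactly as captured, so the stadium model contributes shadows without ever appearing in vision itself.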
The shaders used to render the digital raven were adjusted to ensure that the model could be positioned anywhere within the stadium and be able to properly react to the lighting in any zone.
“The modifications ensured maximum flexibility to match the raven to the high contrast lighting changes between different stadium areas, while at the same time also ensuring that the digital raven could be easily rendered within the Pixotope system in real-time at 59.94 frames per second,” recalls Frank Daniel Vedvik.
Raven and cheering
Another requirement was to have the AR raven perch between the goal posts, a tricky proposition given the camera’s viewing angle: the nearer goal post would have to appear in front of the bird. Since this was a live production, shot from a freely moving camera with a variable zoom lens, post-production techniques such as keying or rotoscoping were not possible.
The solution instead was to build the goal posts as a 3D object within Pixotope, accurately matching the size and position of the real thing. This model was used to mask off part of the raven’s alpha channel, so that when the bird was composited over the background, the area that would have covered the foreground post was effectively erased, creating the illusion that the raven was perched behind it.
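This goal-post trick is a holdout matte: the virtual post is rendered only into the raven’s alpha channel, punching a hole where the real post must stay in front. The sketch below illustrates the idea on per-pixel grids; the function names and data layout are assumptions for illustration, not Pixotope internals.

```python
# Sketch of holdout-matte compositing: a virtual goal-post model erases
# the raven's coverage wherever the real post should occlude the bird.

def apply_holdout(raven_alpha, post_mask):
    """raven_alpha: per-pixel coverage of the CG raven (0..1).
    post_mask: 1.0 where the virtual goal-post model sits in front of
    the raven from the tracked camera's viewpoint, 0.0 elsewhere."""
    return [[a * (1.0 - m) for a, m in zip(arow, mrow)]
            for arow, mrow in zip(raven_alpha, post_mask)]


def over(fg, alpha, bg):
    """Standard 'over' composite of the raven onto the live frame,
    using the (possibly held-out) alpha channel."""
    return [[f * a + b * (1.0 - a) for f, a, b in zip(frow, arow, brow)]
            for frow, arow, brow in zip(fg, alpha, bg)]
```

Because the hole is cut in alpha rather than in colour, the live pixels of the real post show through untouched, which is why the occlusion survives a moving, zooming camera.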
Once the virtual environment within Pixotope was set up, the resulting images were layered over the background camera shot. stYpe provided the camera tracking data to Pixotope, allowing it to match the position, viewing direction and lens focal length of the real-world camera. This ensured that the virtual raven was ‘filmed’ from exactly the same angle as the live action.
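What the tracking system supplies per frame (camera position, orientation and focal length) is ultimately used to project virtual points into the live image. The sketch below uses a bare pinhole model with a camera looking down +z and no lens distortion, purely to illustrate the geometry; real tracked-camera pipelines also handle rotation and distortion.

```python
# Pinhole-projection sketch: turn a tracked camera's position and focal
# length into pixel coordinates for a virtual 3D point. Assumes the
# camera looks straight down +z (no rotation, no lens distortion).

def project(point, cam_pos, focal_px, principal=(960.0, 540.0)):
    """point, cam_pos: (x, y, z) in a camera-aligned world.
    focal_px: focal length expressed in pixels (changes as the
    broadcast lens zooms). Returns pixel coordinates (u, v)."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        raise ValueError("point is behind the camera")
    # Perspective divide: farther points land closer to the centre.
    u = principal[0] + focal_px * x / z
    v = principal[1] + focal_px * y / z
    return u, v
```

Because `focal_px` changes every frame as the operator zooms, the tracking feed has to include lens data, not just position, for the render to stay locked to the shot.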
The quality of the background shot must also be preserved when compositing augmented reality.
“If the background image is brought into the virtual environment and then rendered back out again, it will almost certainly be degraded due to the addition of anti-aliasing, motion blur, unnecessary colour conversions and the like,” says Larsen.
‘Accurate replication of the lighting is essential to ensure the augmented elements look real when added to the live background’- Frank Daniel Vedvik, senior product specialist at The Future Group
In Pixotope, the background shot is present in the virtual environment as a light source to affect the CG objects, and these are then rendered over a direct feed of the background shot. So, when Pixotope augments material onto a camera feed, the original image qualities are left perfectly intact.
Lap of victory
Once all the technical aspects had been set up, rehearsals could begin. Since Pixotope works in real time, adjustments and improvements to most aspects of the production could be made right up until the last minute. The final execution worked flawlessly, with the giant raven swooping into the stadium on cue to the delight of the crowd. Jon Slusser was thrilled with the outcome.
“The overwhelmingly positive results were felt in the stadium, on social media and with traditional media outlets,” he says. “We received over 11 million views of the Raven in flight on various social media platforms, all referring back to the Ravens’ original social media posts. We also saw a huge spike with traditional media outlets like ESPN, Bleacher Report and Sports Illustrated.”
It’s not the only such production The Famous Group has been involved with. “We did a really big mixed reality production at the NFL’s Super Bowl LIV back in February, again utilising Pixotope,” says Slusser. “This was a multi-camera, seven-minute show that was seen both in-stadium and on the FOX Sports broadcast by over 85 million viewers.
“We’ve done a couple of other smaller pieces since,” he continues. “We are very committed to mixed reality – we’ve been building up our MR talent pool with some heavy hitters and plan to build upon our successful executions as we look towards the future.”