Esports saw huge growth opportunities in 2020, with broadcasters turning to remotely produced competitive gaming to fill the sports void during Covid-19 lockdowns, as panellists discussed during SMPTE 2020. 


Esports: Continues growing in popularity as an alternative to traditional sport viewership 

Source: Roman Kosolapov Shutterstock

While other sports have faced significant challenges in 2020, the already-growing world of esports has had the opportunity to excel. 

People stuck inside during various coronavirus lockdowns played more competitive games, from Counter-Strike to League of Legends, while also watching the world’s best gamers do battle online. With sports out, esports have helped to satisfy demand for competitive action from viewers.   

Esports usage hours and reach rose by an estimated 30% in the first eight weeks of lockdown, according to British Esports CEO Chester King, who spoke to IBC365 about the opportunity Covid presented for the gaming industry. 

Several broadcasters turned to esports during lockdown to satisfy demand for competitive sport and to fill airtime left empty by the cancellation of events.  

The BBC, for example, has rekindled its interest in esports during lockdown. The corporation experimented with esports a few years ago, but coverage stalled amid questions about how competitive gaming fitted in with the BBC’s remit. In recent weeks, it has struck broadcast deals for League of Legends, Rocket League and women’s motorsport championship W Series Esports. 

For the companies involved in making an esports tournament work, there were numerous challenges. 

Blizzard Entertainment is one of the biggest game publishers in the world, and a big backer of esports, with several of its titles – such as Overwatch and StarCraft II – extremely popular among esports aficionados. 


Corey Smith 

Blizzard director of live operations Corey Smith explained at a recent SMPTE 2020 panel: “In 2020, the show must go on. With Covid-19 we moved online, and we had to roll this out in about 30 days.  

“It started off with, ‘hey, let us get all the gear that we need for the talent and for the crew. Let’s get it shipped, packed up, sent out to wherever it needs to go’. But in the meantime, we also had to take a technology-based approach where we had to figure out where we’re going to host the game servers, because all of our tournaments were basically land-based operations.” 

Blizzard needed to make the game servers available online with a low enough latency to protect the integrity of the competition. Smith’s team built workstations which were then shipped out to talent for observation. Meanwhile, they migrated all of their production systems into a cloud environment. 
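To make that concrete, here is a minimal Python sketch of the kind of check such a migration implies: a hosting region is only viable if every competitor’s round trip stays under a fairness ceiling. The region names, ping figures, 60ms threshold and `pick_region` helper are all illustrative assumptions, not Blizzard’s actual tooling.

```python
# A minimal sketch, assuming a latency-ceiling rule, of picking a cloud
# region for game servers so no competitor exceeds a fairness limit.
# Region names, ping figures and the 60ms ceiling are all illustrative.
MAX_PING_MS = 60

def pick_region(player_pings: dict) -> str | None:
    """player_pings maps region -> {player: round-trip time in ms}."""
    viable = {
        region: max(pings.values())          # worst-case player in the region
        for region, pings in player_pings.items()
        if all(ms <= MAX_PING_MS for ms in pings.values())
    }
    # Choose the region whose worst-case latency is lowest.
    return min(viable, key=viable.get) if viable else None

pings = {
    "us-west":    {"alice": 18, "bob": 72},   # bob breaches the ceiling
    "us-central": {"alice": 34, "bob": 41},
}
print(pick_region(pings))  # -> us-central
```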

Blizzard shipped over 200 Logitech cameras to players all over the world. It also created 25 broadcast kits – featuring backdrops, lights, mics and other broadcast gear – which were shipped to casters and analysts in less than a week. 

“We had to create a complicated comms system that used point-to-point connections and established scripts so talent could talk to the production crew, and production could talk to different parts of their operation, in order to facilitate production,” he added. After around 30 days, Blizzard loaded the final media assets into the cloud production system ready for rehearsals. 
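As an illustration of what such a point-to-point comms matrix might look like in code, here is a hedged Python sketch: named channels define which roles may talk to which. The roles and channel names are hypothetical, not Blizzard’s real configuration.

```python
# An illustrative point-to-point comms matrix of the kind Smith describes:
# named channels define who may talk to whom. Roles and channel names are
# hypothetical, not Blizzard's actual configuration.
COMMS_CHANNELS = {
    "talent_to_production": ({"caster", "analyst"}, {"director", "producer"}),
    "production_internal":  ({"director", "producer"}, {"td", "replay_op"}),
}

def can_talk(speaker: str, listener: str) -> bool:
    """True if any channel routes speaker -> listener."""
    return any(
        speaker in speakers and listener in listeners
        for speakers, listeners in COMMS_CHANNELS.values()
    )

assert can_talk("caster", "director")       # talent reaches production
assert not can_talk("caster", "replay_op")  # but not production-internal comms
```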

One key reason for the swiftness of the switch to cloud was because the company, and esports in general, are already digital-first by their very nature. 

“It was a pretty big lift,” he explained. “We were able to do it because a lot of our systems are very much IP first, digital first. All of our transmission and other systems have already been in cloud for some time. It was really a question of what do we do with the production? And that’s where we had to develop a lot of our home workflow operations.” 


Blizzard cloud production

When you consider the workflow, there needed to be “a retiming effect” so that observers, participants, commentators and the production crew all looked and felt in time within the broadcast, with no perceptible delay between them. 
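A minimal sketch of that retiming idea, with invented feed names and latency figures: measure each feed’s delay, then pad every faster feed so all of them match the slowest one.

```python
# Minimal retiming sketch: delay every feed to match the slowest contributor
# so observers, players and commentary line up in the broadcast. The feed
# names and latency figures are invented for illustration.
measured_latency_ms = {
    "observer_feed": 120,
    "player_cam":    250,
    "caster_audio":   80,
}

target = max(measured_latency_ms.values())            # align to the slowest
added_delay = {f: target - ms for f, ms in measured_latency_ms.items()}
print(added_delay)  # {'observer_feed': 130, 'player_cam': 0, 'caster_audio': 170}
```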

“As we worked down the workflow, we got our match show together, which is the play-by-play guys. Then we get into the analyst show – our halftime show – and then we would throw it back to our studio in Los Angeles, like major sports does today. We adopted the same look and feel of how some of the normal sports shows are constructed.” 

Blizzard worked with Grass Valley for the production, leveraging a pre-existing partnership. Originally, Blizzard used Grass Valley for a master control product, but “Covid changed that model where we had to basically adapt this master control capability into a full-fledged cloud production environment”, he added.  

Observers were mixed into broadcasts via an observer station sub-switch, leveraging Grass Valley’s AMPP (Agile Media Processing Platform). This then feeds into the main PCR (production control room), which also takes a feed from the casters (the play-by-play commentators) from two locations. This in turn feeds into the master control room. The MCR also receives a feed from the analysts (who commentate on contests) and a vMix replay. This is all then filtered through to YouTube.  
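That signal chain can be pictured as a small directed graph. The Python sketch below paraphrases the routing just described; the node names and the `downstream` helper are illustrative only, not part of AMPP.

```python
# The signal chain above, modelled as a small directed graph: observers feed
# a sub-switch into the main PCR; the PCR, analysts and vMix replay all feed
# the MCR, which outputs to YouTube. Node names paraphrase the article.
ROUTES = {
    "observers":           ["observer_sub_switch"],
    "observer_sub_switch": ["main_pcr"],
    "casters_site_a":      ["main_pcr"],
    "casters_site_b":      ["main_pcr"],
    "main_pcr":            ["mcr"],
    "analysts":            ["mcr"],
    "vmix_replay":         ["mcr"],
    "mcr":                 ["youtube"],
}

def downstream(node: str) -> set:
    """Everything a source eventually reaches (iterative depth-first search)."""
    seen, stack = set(), list(ROUTES.get(node, []))
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(ROUTES.get(n, []))
    return seen

assert "youtube" in downstream("observers")  # observer feeds reach the output
```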

“We are calling the play-by-play between the venues,” explained Smith. “In this case, it would be the guys sitting in a house down the street potentially working their magic on the main PCR show. And the main PCR show is really the broadcast ECM line, which is the analyst and the gameplay footage. 

“The actual tournament itself, the analyst show being halftime, they’re able to analyse what they’ve seen. We can repackage with video replay and whatnot. And then once it rolls up, the master control, again, is the final constructed output: all of our regional feeds, all of our online feeds, all of our line cuts come off of our master control. These eventually end up on YouTube and other partners throughout the world.” 

Racing into remote 
Smith was joined on the panel by speakers from Ross Video, Engine Media, and Grass Valley. Engine Media’s global head of esports and business development Darcy Lorincz talked through the “constantly evolving [esports] production ecosystem”. Engine has hosted millions of esports events, he explained, using its virtual master control system. With “people all over the place” using the system, there are multiple feeds to manage, with inputs from game publishers, players, analysts, viewers and more. 

“It all behaves very much like what you’re used to, except it’s all coming from servers typically and not coming from physical devices,” he explained, adding: “and the closer those servers are to your production capacity, the better the performance. And we always are looking at performance and latency and all the things that matter in our world.” 

Engine runs esports events focussed on competitive racing games, meaning low latencies are vital. All of the feeds “come through an orchestration layer” which is managed by Engine. That means much of the work, such as adding graphic overlays, is managed by machines rather than people, as the sketch below illustrates. 
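Here, hypothetically, telemetry events from the game publisher trigger overlay graphics with no operator in the loop; the event names, overlay names and `on_game_event` hook are invented, not Engine’s actual API.

```python
# A hedged sketch of machine-managed overlays: publisher telemetry events
# trigger graphics automatically. Event and overlay names are hypothetical
# stand-ins, not Engine's real orchestration layer.
OVERLAY_RULES = {
    "lap_completed": "lap_time_lower_third",
    "overtake":      "position_change_banner",
    "race_finished": "final_standings_board",
}

def on_game_event(event: dict) -> str | None:
    """Map a telemetry event to the overlay the switcher should key."""
    overlay = OVERLAY_RULES.get(event["type"])
    if overlay:
        print(f"keying {overlay} for car {event.get('car', '?')}")
    return overlay

on_game_event({"type": "overtake", "car": 7})  # keys position_change_banner
```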

“The challenges are a little different than having regular races, because you have people sitting in simulators all around the world,” he added. “You have to start thinking about where they are and the latency.” 

There are also challenges on the feed side, such as location distance, that “might not happen when you’re in a studio or you’re at a track”. But “ultimately, once you’ve conquered the network, the infrastructure that supports that can stay relatively fixed. And things don’t change until the publisher upgrades the game.” 


Blizzard: Shipped over 200 Logitech cameras to players around the world

Taking the multi view 
Ross Video’s Cameron Reed is a former esports producer and director, but now heads up esports business development for the firm. 

During the SMPTE panel he outlined how remote production can be used to facilitate an esports tournament. 

“That is a hybrid approach of cloud software services as well as a traditional hardware infrastructure, which a lot of esports companies have been turning to in order to keep production values high.” 

Remote production has been “picking up steam”, explained Reed, but 2020 has seen an overhaul even beyond REMI (remote integration model), with not just source acquisition separated from production, but also the need to split parts of the production off from one another. 

Covid meant moving production into distributed solutions that allowed not only for remote source acquisition, but also for remote operation and control of all of the devices necessary to produce live television. 

“Esports was in a unique position, because much of it is already played online,” he added. “So, they knew the competition was going to carry on, effectively undeterred by Covid-19. But that did not mean production was not deterred. They had to figure out how to get it on air. So, what they did really quickly in the first days and weeks was pivot to the software-only solutions that are available.” 

This exposed some weaknesses, such as the need for graphics and instant replays that were not easily supported by software-only models. “Then these companies started asking themselves: we own all of this infrastructure back in the studio that we’re not allowed to get into, but what if we can access it remotely?” 

Reed explained that the first thing esports production companies needed was a centralised interface that all users can connect to. This is not just limited to those working on production – participants in the competition also need access, as they are remote contributors providing a feed from their homes. 

Esports companies split those connecting to this portal into three main categories: operators, contributors and engineers/production team. 

He explained that operators “need to get multi-viewers, they need to have comms, they need to be controlling their devices”, while contributors need to be able to see and hear each other at incredibly low latency in order to keep it conversational and natural. They need to be able to see whatever multi-viewers they want to see. “And in the case of the players, they need to be able to not see anything in most cases, right?” This is because if a player can see a feed, it may give them an unfair advantage. 
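Those access rules are easy to picture as a small role-to-feeds map. The Python sketch below encodes the three categories plus the player lockout; the feed names are invented for illustration.

```python
# An illustrative encoding of Reed's three user categories, plus the rule
# that players see nothing. Role names follow the article; feed names are
# invented.
ROLE_FEEDS = {
    "operator":    {"multiviewer", "comms", "device_control"},
    "contributor": {"multiviewer", "comms"},
    "engineer":    {"multiviewer", "comms", "network_health", "pc_health"},
    "player":      set(),  # no feeds: seeing one could confer an advantage
}

def visible_feeds(role: str) -> set:
    return ROLE_FEEDS.get(role, set())

assert "multiviewer" in visible_feeds("operator")
assert not visible_feeds("player")  # players are locked out of all feeds
```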

Then you have the engineers and the people who actually need to administrate the production and keep an eye on the network behaviour, the health of all the distributed PCs, and any other tech relating to the production.  

“Now to do this, all of them are connecting to this web portal, which connects to the cloud. And then, through the cloud, the web portal connects to a production control room. And the production control room now is scalable to whatever extent that production control room was ever scalable, rather than being limited to a set number of end users.” 


Robert Erickson: Esports workflow 

Competitive compression 
The final speaker was Robert Erickson of Grass Valley, who discussed the infrastructure needed to support remote esports productions. 

He explained: “On esports, the event size dictates a lot of the workflow solutions. How do you efficiently design and deploy a solution that can accommodate a small regional match all the way up to a massive worldwide championship? That is a big challenge.” 

Producing a smaller event, which may only have a small number of players or a particular regional focus, to the same quality as a much larger-scale production is challenging, he acknowledged.  
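One plausible way to frame that scaling problem, purely as an illustration, is a tiering of resources by event size; every figure and tier name below is invented, not from Grass Valley.

```python
# Illustrative only: tiering production resources by event size, following
# Erickson's point that event size dictates the workflow. All figures are
# invented for the sketch.
EVENT_TIERS = {
    "regional":  {"feeds": 6,  "operators": 2,  "transport": "home broadband + FEC"},
    "national":  {"feeds": 24, "operators": 8,  "transport": "10 Gbps Ethernet/MPLS"},
    "worldwide": {"feeds": 96, "operators": 30, "transport": "100-400 Gbps fibre"},
}

def tier_for(feed_count: int) -> str:
    """Smallest tier (in insertion order) that covers the feed count."""
    for name, spec in EVENT_TIERS.items():
        if feed_count <= spec["feeds"]:
            return name
    return "worldwide"

print(tier_for(10))  # -> national
```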

Many broadcasters have traditionally supported big remote production or hybrid productions for major sporting events, using high bandwidth connections over dark fibre. 

“We’ve been doing that for a while – wavelengths over fibre,” he added. “More and more popular now is doing Ethernet networks with layer two or layer three or MPLS. Most people are getting 10Gbps connections now, with 100Gbps becoming standard, but I am even seeing one customer now doing 400 gigabit links. That is an incredible amount of bandwidth, but also entirely doable now.” 
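As a rough sanity check on those link sizes, the sketch below works out how many 1080p feeds fit on each circuit. The bitrates are common ballpark assumptions (about 3 Gbps for uncompressed ST 2110-style video, around 200 Mbps for a light mezzanine codec), not figures quoted by Erickson.

```python
# Back-of-envelope feed counts per link size. Bitrates are typical ballpark
# assumptions, not numbers from the article.
FEED_BITRATE_MBPS = {"uncompressed_1080p": 3000, "mezzanine_1080p": 200}

for link_gbps in (10, 100, 400):
    link_mbps = link_gbps * 1000
    for feed, rate in FEED_BITRATE_MBPS.items():
        print(f"{link_gbps} Gbps link carries ~{link_mbps // rate} x {feed}")
```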

On smaller esports productions, you “cannot just throw bandwidth at it”, he added, meaning these productions require more of a hybrid model with some hardware and some software. 

“Now you can have producers and directors and TDs (technical directors), all working at multiple different locations, doing multiple shows using a common cloud-based or data centre-based infrastructure. That is pure software.” 

But once this begins to scale, you need multiple feeds for video and audio in order to support the remote production. This requires a robust transport layer, even if the production is being done in the cloud. 

“The transport layer is absolutely king in this part of the production, because you must use a codec and transport layer that meets your requirements,” he added. “If you’re on a home broadband connection that has a lot of jitter and limited bandwidth, you’re probably going to use some more error correction.” 
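A hedged sketch of that matching logic: step up forward error correction (FEC) and buffering as measured jitter and packet loss rise. The thresholds below are illustrative, not drawn from any real product.

```python
# Illustrative transport selection: more FEC and buffering for dirtier links.
# Thresholds are invented for the sketch.
def choose_fec(jitter_ms: float, loss_pct: float) -> str:
    if loss_pct > 1.0 or jitter_ms > 50:
        return "heavy FEC + large receive buffer (adds delay)"
    if loss_pct > 0.1 or jitter_ms > 15:
        return "moderate FEC"
    return "minimal FEC (clean, managed network)"

print(choose_fec(jitter_ms=40, loss_pct=0.5))  # jittery home broadband
print(choose_fec(jitter_ms=2, loss_pct=0.01))  # dark fibre / MPLS circuit
```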

Due to limited bandwidth, higher compression is often needed. AVC (H.264) is “more or less the gold standard for most feeds today”, he explained, but “we traditionally see much higher delays, depending on how much bandwidth you have.” H.265 is “significantly more efficient” but also takes more processing, which leads to more delay. 
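That trade-off can be captured in miniature. In the sketch below, the relative bitrate and latency figures are commonly cited ballpark values, not measurements from the panel.

```python
# The codec trade-off in miniature: H.265 roughly halves H.264's bitrate for
# similar quality but costs more processing, hence more delay. Relative
# figures are ballpark assumptions.
CODECS = {
    # codec: (bitrate relative to H.264, encode latency relative to H.264)
    "H.264/AVC":  (1.0, 1.0),
    "H.265/HEVC": (0.5, 1.5),
}

def pick_codec(bandwidth_constrained: bool) -> str:
    # Starved for bandwidth: accept HEVC's extra latency for its efficiency.
    # Otherwise stay with AVC, the "gold standard", for lower delay.
    return "H.265/HEVC" if bandwidth_constrained else "H.264/AVC"

print(pick_codec(bandwidth_constrained=True))  # -> H.265/HEVC
```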

“There’s a tonne of options out there, and all of them need to be considered, because all of them have their place in the workflow and all of them need to be part of the tool set to make these productions work. But the key takeaway from this is: you will have to manage video with multiple codecs and a widely varying time delay.”