With the vast increase in OTT and streaming services supplying a ‘captive’ audience during lockdown, and new streamers investing fortunes in the market this year, the battle for eyeballs is fierce. One of the deciding factors in this struggle is quality, but how do D2C services make sure that, whatever the bandwidth, viewers receive a decent, reliable stream?
Are you one of the 193 million (give or take a few) customers of Netflix? Did you join the 50 million or so who signed up to Disney+ during the first five months of its launch?
The latest Living with Digital consumer research study from Futuresource identified that pure OTT services (SVoD, AVoD, and free to air) account for 49% of time spent watching video content across North America and the top five European countries combined. That’s a lot of streaming, so how do the OTT giants make sure the quality of service and enjoyment is equally high?
All OTT customers depend on the internet service provided by their ISP. Delivering content over a managed service provided by a cable network or satellite provider can guarantee a certain level of quality, but as the internet is ‘unmanaged’ it brings with it different issues. With consumers depending on different ISPs to deliver streaming content over the internet to smart TVs and mobile devices, there is a danger that an audience watching the same content at the same time, such as a live football match or special event, might experience wildly differing quality.
The streaming companies themselves have moved to allay this concern. After all, if your audience experiences stability issues and poor-quality content, they’ll vote with their feet, and wallets, and move elsewhere for entertainment.
“Streaming performance can be optimised by striking a balance with video compression between image quality, latency and scalability as well as utilising a content delivery network [CDN] that places caches of frequently accessed content closer to their audience,” says Chris Evans at Futuresource Consulting.
The content delivery networks Evans describes are globally distributed networks of proxy servers which cache content, such as online videos, closer to consumers, thus improving download speeds. As an example, Google Cloud, the CDN of the YouTube owner, is made up of data centres connected to edge nodes and globally distributed edge Points of Presence (PoPs). Called a Google Global Cache (GGC), each edge node allows ISPs to serve Google content from within their own networks. Static content that’s very popular with the local host’s user base, including YouTube, is temporarily cached on edge nodes. Google’s traffic management systems direct user requests to the edge node that provides the best experience. Google also has peering relationships with ISPs which give them access to YouTube front-end servers and video caches in Google’s edge PoPs. Again, traffic management chooses the best and shortest route for the content for each user request, thus reducing latency and ensuring capacity for large traffic spikes (big live-streamed events or online game launches).
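The edge-cache behaviour described above can be reduced to a toy sketch. All names below are invented for illustration, and real CDNs layer DNS steering, anycast routing and health checks on top, but the core idea is simple: an edge node answers popular requests from a local cache and only goes back to the origin on a miss.

```python
# Toy sketch of CDN edge caching (illustrative only; names are invented).
# An edge node keeps a small LRU cache of popular objects; on a miss it
# fetches from the origin and stores a copy for subsequent local requests.
from collections import OrderedDict

class EdgeNode:
    def __init__(self, origin, capacity=3):
        self.origin = origin          # dict standing in for the origin server
        self.cache = OrderedDict()    # LRU cache: least-recently-used first
        self.capacity = capacity
        self.hits = self.misses = 0

    def get(self, key):
        if key in self.cache:
            self.cache.move_to_end(key)     # mark as recently used
            self.hits += 1
            return self.cache[key]
        self.misses += 1
        value = self.origin[key]            # the "long haul" fetch to origin
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least-recently-used item
        return value

origin = {f"video{i}": f"bytes-{i}" for i in range(10)}
edge = EdgeNode(origin)
for key in ["video1", "video2", "video1", "video1", "video3"]:
    edge.get(key)
print(edge.hits, edge.misses)  # → 2 3
```

The second and third requests for `video1` never leave the edge node, which is exactly the latency and backbone-traffic saving a GGC-style cache is after.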
Google isn’t the only self-serving streamer. The CDN that Netflix uses to serve its video content, Open Connect, was originally developed in 2011 as a response by Netflix itself to the ever-increasing scale of its streaming, while Amazon Prime Video runs on CloudFront, a CDN offered by Amazon Web Services (AWS). Other streamers use multiple third-party CDNs for video content distribution; for its European operations, Disney+ reportedly has used a combination of six CDNs from the likes of Akamai, CloudFront, Limelight Networks, Edgecast, Fastly and Level 3.
On the ladder
Smarter video compression (in the form of encoding) is also being deployed. Programme content is pre-encoded by streaming companies, initially based on a ‘bitrate ladder’, an approach first developed by Apple back in 2010 for HTTP Live Streaming (HLS). These ladders act as a ready reckoner for the best bitrate at which to encode content at a given resolution, for streaming on a range of devices over a range of connection speeds without significant encoding artifacts. Streaming services use adaptive bitrate (ABR) technology to provide a range of bitrates, enabling the player to adjust based on available bandwidth.
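A minimal sketch of how an ABR player might walk such a ladder. The rungs below are made up for the example, not Apple’s published recommendations, and real players also weigh buffer level and recent throughput history.

```python
# Illustrative bitrate ladder: (vertical resolution, bitrate in kbit/s).
# Values are invented for the example, not an official HLS recommendation.
LADDER = [
    (234, 145), (360, 365), (432, 730), (540, 2000),
    (720, 3000), (1080, 4300), (1080, 5800),
]

def pick_rendition(available_kbps, headroom=0.8):
    """Pick the highest rung the measured bandwidth can sustain.

    A safety margin (headroom) is kept so a small bandwidth dip does
    not immediately stall playback and show the buffering spinner.
    """
    budget = available_kbps * headroom
    best = LADDER[0]                    # lowest rung as the fallback
    for height, kbps in LADDER:
        if kbps <= budget:
            best = (height, kbps)
    return best

print(pick_rendition(6000))  # → (1080, 4300)
print(pick_rendition(300))   # → (234, 145)
```

When bandwidth declines mid-stream, the player simply re-runs this selection for the next segment and requests a lower rung, which is the "adjust lower" behaviour described below.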
“Players can automatically increase to higher bitrates to get the best quality for the device, whether it’s a television or cell phone, or adjust lower bitrates from the network when bandwidth declines, to avoid or minimise the spinning buffering indication on the player,” says Dan Murray, director of product marketing at Telestream. “ABR technology is designed to adapt to IP networks that can fluctuate in bandwidth.”
Though ground-breaking at its inception, Apple’s bitrate ladder is a one-size-fits-all approach and has since been superseded by other methods, most notably by Netflix in 2015 when it developed ‘per-title’ encoding. This seeks the best possible correlation between perceived picture quality and bitrate, resulting in significantly better video quality at lower bitrates. The technique relies on a complexity analysis of the assets to calculate a bitrate ladder that is different for every single video.
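In spirit, per-title encoding runs trial encodes at several resolutions, measures the quality of each, and keeps only the points that deliver the best quality for their bitrate. A simplified sketch with invented numbers (not Netflix’s actual pipeline, which uses VMAF and far more candidate points):

```python
# Sketch of per-title ladder construction (hypothetical data): given trial
# encodes of one title as (bitrate_kbps, quality) points across several
# resolutions, keep only the points on the upper convex hull — those that
# give the best quality for their cost — and use them as the ladder.
def rate_quality_hull(points):
    pts = sorted(points)                  # ascending bitrate
    hull = []
    for rate, quality in pts:
        # Drop any candidate a cheaper point already beats on quality.
        if hull and quality <= hull[-1][1]:
            continue
        # Enforce convexity: remove previous points made redundant by the
        # straight line from their predecessor to the new point.
        while len(hull) >= 2:
            (r1, q1), (r2, q2) = hull[-2], hull[-1]
            if (q2 - q1) * (rate - r1) <= (quality - q1) * (r2 - r1):
                hull.pop()
            else:
                break
        hull.append((rate, quality))
    return hull

# Trial encodes: low resolutions win at low bitrates, high resolutions at
# high bitrates (quality scores are invented for the example).
trials = [(400, 32.0), (800, 35.5), (900, 34.0),   # 540p
          (1600, 38.0), (3000, 40.5),              # 720p
          (2800, 39.0), (5000, 43.0)]              # 1080p
print(rate_quality_hull(trials))
```

A simple title (animation, static scenes) ends up with a cheap ladder; a complex one (grain, fast motion) keeps its higher rungs, which is why the ladder differs per video.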
The technique has in turn been honed over the past five years. Bitmovin, which has developed per-title encoding for its products, optimised an algorithm to find the best trade-off between PSNR (an objective quality metric) and bitrate/resolution. Mux Video takes an alternative per-title approach that uses deep learning, and which it claims delivers per-title ABR ladders in seconds, compared to the hours it says would be required by the Netflix technique.
To deal with low-bandwidth internet connections, specifically on mobile devices, Netflix started to use the ‘per-chunk’ encoding method in 2016, where the video source is split up into a number of chunks, each of which is processed and encoded independently. According to Netflix, the ‘mobile encodes’ optimise the bitrate for each individual chunk based on its complexity (in terms of motion, detail, film grain, texture, and so on). This reduces quality fluctuations between the chunks and avoids over-allocating bits to chunks with less complex content. Other streamers have also adopted the method.
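The idea behind per-chunk optimisation can be sketched as a simple bit-allocation rule. The complexity scores and the proportional formula below are illustrative, not Netflix’s actual model:

```python
# Sketch of per-chunk bitrate allocation (illustrative): rather than one
# bitrate for the whole title, each chunk receives bits in proportion to
# its complexity score, so still scenes stop over-consuming the budget
# that action scenes need.
def allocate_per_chunk(complexities, avg_kbps):
    total = sum(complexities)
    return [round(avg_kbps * len(complexities) * c / total)
            for c in complexities]

# Per-chunk complexity scores (invented): talking heads vs. action.
chunks = [0.4, 0.4, 1.6, 1.2, 0.4]
rates = allocate_per_chunk(chunks, avg_kbps=2000)
print(rates)                    # → [1000, 1000, 4000, 3000, 1000]
print(sum(rates) / len(rates))  # average stays on the 2000 kbps budget
```

The overall average bitrate is unchanged; the bits are simply moved to where the content needs them, smoothing quality fluctuations between chunks.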
The latest development from Netflix is a shot-based encoding framework, called Dynamic Optimizer, resulting in more granular optimisations within a video stream. According to Netflix, this analyses an entire video over multiple quality and resolution points in order to obtain the optimal compression trajectory.
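One hypothetical way to picture such shot-level optimisation, as a greedy toy rather than Netflix’s published algorithm: encode every shot at several quality points, then repeatedly spend the remaining bitrate budget on whichever shot upgrade buys the most quality per extra bit.

```python
# Toy shot-based optimiser (not Netflix's Dynamic Optimizer): each shot
# has candidate encodes as (bits, quality) pairs sorted by bits. Start
# from the cheapest encode of every shot, then greedily take the upgrade
# with the best quality gain per extra bit until the budget is spent.
def optimise_shots(options, budget):
    choice = [0] * len(options)                       # cheapest encode each
    spent = sum(opts[0][0] for opts in options)
    while True:
        best, best_ratio = None, 0.0
        for s, opts in enumerate(options):
            i = choice[s]
            if i + 1 < len(opts):
                extra = opts[i + 1][0] - opts[i][0]   # additional bits
                gain = opts[i + 1][1] - opts[i][1]    # quality improvement
                if spent + extra <= budget and gain / extra > best_ratio:
                    best, best_ratio = s, gain / extra
        if best is None:
            return choice                             # budget exhausted
        spent += (options[best][choice[best] + 1][0]
                  - options[best][choice[best]][0])
        choice[best] += 1

shots = [[(100, 30), (200, 38)],    # complex shot: big gain from more bits
         [(100, 36), (200, 37)]]    # simple shot: barely improves
print(optimise_shots(shots, budget=300))  # → [1, 0]
```

With a 300-bit budget the extra bits go to the complex shot, where they buy eight quality points instead of one; the simple shot stays at its cheap encode without looking any worse.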
To further eke out the best bitrates, the encoding formats themselves are being stress-tested too. In development by the ITU’s Joint Video Experts Team (JVET) is the Versatile Video Codec (VVC), a successor to the HEVC (H.265) compression used by the likes of Hulu and Amazon for streaming 4K content. According to the ITU, VVC will need only half the bitrate of its predecessor to achieve the same level of video quality for high-resolution video content.
It’s not the only format that may determine future streaming quality. The majority of the streamers are part of the Alliance for Open Media (AOMedia), and as such have also been developing a new coding format. The AV1 video compression format and codec is built upon the WebM VP9 format, which is itself optimised for streaming media over the internet and is currently deployed by YouTube, Netflix and others. However, AV1 reportedly incorporates additional techniques that give encoders more coding options to enable better adaptation to different types of input. YouTube and Netflix have started to roll out AV1 to support streaming to Android, with Netflix claiming 20% improved compression efficiency over VP9 streams.
“Despite these efforts, D2C streaming services are still ultimately at the mercy of network providers to deliver a reliable stream to viewers,” says Chris Evans. “As a result, monitoring the integrity of video feeds is a priority in the operation of a D2C service. In traditional broadcast playout, a multiviewer was typically used for this purpose, but in the world of D2C OTT distribution where every individual viewer is receiving a unique feed, sophisticated video probing tools are required to make this manageable.”
Dan Murray agrees. “The ABR delivery chain for streaming services is maturing in technology and industry best practices, as evidenced by the mostly successful explosion of streaming service delivery during the pandemic,” he says. “However, along with the improvements often comes complexity. Added security is required to protect content, advanced dynamic advertisement personalisation, multiple content delivery networks, and a rapid expansion of services. Telestream believe monitoring is fundamental to successful deployment, efficient operation and proactive assurance of customer quality of experience.”
That performance is about to get tested even further. According to the Cisco Annual Internet Report, 2018–2023, the use of 4K/UHD video streaming will have a huge impact on streaming traffic and demands for available bandwidth. It states: “the bitrate for 4K video at about 15 to 18Mbps is more than double the HD video bitrate, and nine times more than SD video bitrate. We estimate that by 2023, two-thirds of the installed flat-panel TV sets will be UHD, up from 33% in 2018”.
Nonetheless, Dan Murray predicts technology maturity and increased speeds from faster broadband and 5G wireless will continue to be a positive for customers.
“The already complex streaming workflow will continue with trends including accelerated shift to cloud workflows, dynamic ad-insertion, and lower latency technologies,” he adds. “Telestream monitoring solutions are expanding to address these trends with monitoring cloud services, dynamic ad-insertion monitoring, low latency compatibility, and increased automation to help operators’ teams keep up with the rate of technology change and scale of streaming services.”