Low-latency live streaming requires measures to maintain viewing quality, says StackPath VP platform services Josh Chesarek.
Streaming – whether music, sports or video – is the preferred entertainment medium for millions of people around the world, and its momentum shows no sign of slowing. Indeed, it may well displace traditional broadcast entertainment as we know it. It stands to reason that as streaming grows more popular, so too does the volume of data that must be delivered quickly and seamlessly to millions of viewers worldwide.
Luckily, live streaming media is continuing its move towards lower-latency technologies. Some of these rely on proprietary software stacks that must be deployed to servers around the world, while others build on standard HTTP and HTTP/2. Some companies even provide a full suite of edge services that gives you the flexibility to take either path: run your own software stack at the edge on virtual machines (VMs) or containers for a customised solution at scale, or use a content delivery network (CDN) to deliver your content.
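To make the HTTP-based path concrete, one widely deployed approach is Low-Latency HLS, which splits segments into sub-second parts that the CDN can serve as soon as they are encoded. The sketch below shows the general shape of such a media playlist; the tag names come from Apple's HLS specification, while the durations and file names are purely illustrative.

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:4
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
#EXT-X-PART-INF:PART-TARGET=0.333
#EXT-X-MEDIA-SEQUENCE:266
#EXT-X-PART:DURATION=0.333,URI="segment266.part0.mp4"
#EXT-X-PART:DURATION=0.333,URI="segment266.part1.mp4"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment266.part2.mp4"
```

Because each part is only a fraction of a second long, the player can start rendering well before a full segment exists – which is exactly why the serving infrastructure has to sit close to the viewer.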
Low-latency streaming brings the viewing experience in line with traditional broadcast delivery. Part of this involves minimising delay at every link in the media chain, from ingest to playback. When the entire chain spans only one to two seconds, stability and efficiency are key. Earlier delivery methods, with their longer buffers, allowed easier recovery when bandwidth changed or other events occurred. Those recovery efforts must now be contained within a buffer of around two seconds, if they cannot be avoided outright.
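The arithmetic behind that constraint is worth spelling out. A minimal sketch, with purely hypothetical numbers: while bandwidth dips below the stream bitrate, the playback buffer drains in real time, so a 30-second buffer can ride out a dip that would stall a 2-second one.

```python
def survives_dip(buffer_s, bitrate_mbps, dip_mbps, dip_duration_s):
    """Return True if playback continues through a bandwidth dip.

    During the dip the player downloads at dip_mbps while playback
    consumes media in real time, so the buffer shrinks by
    (1 - dip_mbps / bitrate_mbps) seconds of media per wall-clock second.
    """
    drain_rate = 1 - dip_mbps / bitrate_mbps
    if drain_rate <= 0:
        return True  # download still keeps up with playback
    return buffer_s - drain_rate * dip_duration_s > 0

# A 30 s buffer survives a 4 s dip to half bitrate; a 2 s buffer
# does not survive a 5 s dip.
print(survives_dip(30, 6.0, 3.0, 4))  # True
print(survives_dip(2, 6.0, 3.0, 5))   # False
```

This is why low-latency delivery shifts the burden from the player's buffer to the network itself: with only two seconds of headroom, dips must largely be prevented rather than absorbed.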
It is clear that low-latency streaming quickly loses its appeal to end users if the experience is riddled with stuttering, buffering or constant quality shifts. If you have ever watched a live sporting event that froze up right as the basket, goal, touchdown or winning run was about to be scored, you know first-hand how frustrating this can be. To avoid this, companies such as StackPath continue to push technology and services to live on the edge: to get as close to consumers as possible while removing as many intermediate internet hops as possible from the path.
In doing so, the companies taking streaming to the edge are focusing on a few critical success factors: excellent connectivity to end users, with throughput to spare and low round-trip time (RTT); redundancy in custom-designed infrastructure to power edge services; and CDN scalability to handle massive concurrent events.
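The low-RTT point deserves a number. A minimal sketch, under the simplifying assumption of one HTTP request per part and no pipelining: when a player fetches several sub-second parts every second, each request pays roughly one round trip, so RTT overhead scales with request rate.

```python
def fetch_overhead_s(rtt_ms, parts_per_second):
    """Seconds of pure round-trip overhead incurred per second of media,
    assuming one request per part and no pipelining (a simplification)."""
    return (rtt_ms / 1000.0) * parts_per_second

# At 4 parts per second (250 ms parts), a 100 ms RTT to a distant origin
# burns 0.4 s of overhead per media second; a 10 ms RTT to a nearby
# edge node burns only 0.04 s.
print(fetch_overhead_s(100, 4))  # 0.4
print(fetch_overhead_s(10, 4))   # 0.04
```

Inside a two-second end-to-end budget, the first figure is ruinous and the second is negligible, which is the quantitative case for serving from the edge.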
These core features help deliver low-latency streams while maintaining quality of service for HD viewing. A huge part of this is avoiding the common pitfalls that produce a poor experience, such as server errors, overloaded servers or network saturation. Low-latency streaming requires connections to be more reliable and robust than ever. This is not something that traditional cloud services are positioned to offer. Instead, it takes a company like StackPath that has been built from the ground up to meet these new demands.
Josh Chesarek is VP platform services at StackPath.