UHD 4K TVs are quickly taking their place in people’s homes, and content creators are beginning to produce Ultra HD 4K programming in earnest to fill the pipeline.

It is time for broadcasters to consider how UHD 4K fits into their overall offering.

The critical considerations are bandwidth and video quality, but the industry does not yet have the same level of experience setting bitrate and video-quality targets for 4K content as it does for high-definition (HD) and standard-definition (SD) content.

The industry needs tools to validate the quality of uncompressed and compressed UHD 4K content so that broadcasters can plan bandwidth resources with confidence.

This paper will describe an innovative and practical set of statistical methods that can be used to validate UHD 4K content and examine the impact of high-efficiency video coding (HEVC) compression.


Only a decade ago, high definition (HD) was the big new thing. With it came new, wider 16:9 aspect-ratio flat-screen TVs that made the living room stylish in a way that old CRTs couldn’t match.

Consumers delighted in the new, better television experience. Studios and broadcasters delivered a new golden age of television.

HD is now table stakes in most places, and where that is not yet the case, it will be soon enough.

Yet now, before we have hardly become accustomed to HD, we are talking about Ultra HD (UHD), with at least four times as many pixels as HD.

Is all that extra UHD 4K resolution going to make a difference to consumers? If yes, what bandwidth will UHD 4K programming need?

Those are two big questions our industry is exploring with respect to planning UHD services; yet they are not independent questions.

UHD 4K is still new enough in the studios and post-production houses that 4K-capable cameras, lenses, image sensors, and downstream processing are still being optimised.

Can we be sure yet that the optics and post processing are preserving every bit of “4K” detail?

On the distribution side, could video compression change the amount of visual detail to an extent that it could conceivably turn “4K” quality into something more like “HD” or even less?

If the UHD 4K content we have available today for bandwidth and video-quality testing does not truly have a “4K” level of detail, then we could go astray and plan for less bandwidth than we might need for future UHD 4K broadcasts.

If the 4K content we have available today is truly “4K”, then we should also want to be sure that we do not over-compress and turn UHD 4K into something less impressive.

During our UHD 4K testing, we have found several test sequences that appeared normal to the eye but turned out to have unusual properties when examined mathematically.

Such content could lead to wrong conclusions when planning for UHD 4K services.

In this paper, we present practical mathematical techniques to help answer the question “How 4K is it?”

Our method examines UHD 4K video to see whether it has a statistically expected distribution of spatial detail as a function of 2-dimensional spatial frequency.

The benchmark for our statistical expectations is drawn from numerous studies of the statistics of natural scenes.
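A well-established finding in the natural-scene-statistics literature is that the radially averaged power spectrum of natural images falls off roughly as 1/f² with spatial frequency f. The benchmark check can be sketched numerically: estimate the log-log slope of the radial power spectrum and compare it against that expectation. The following NumPy sketch is our own illustration, not the paper’s implementation; the function names, the synthetic test image, and the fit range are assumptions for demonstration only.

```python
import numpy as np

def radial_power_spectrum(img):
    """Radially averaged 2-D power spectrum of a grayscale image."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)  # integer radius bins
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.maximum(np.bincount(r.ravel()), 1)  # guard empty corner bins
    return sums / counts

def spectral_slope(img, fmin=2, fmax=None):
    """Least-squares slope of log power versus log spatial frequency."""
    p = radial_power_spectrum(img)
    fmax = fmax or len(p) // 2  # stay within fully sampled annuli
    freqs = np.arange(fmin, fmax)
    slope, _ = np.polyfit(np.log(freqs), np.log(p[fmin:fmax]), 1)
    return slope

# Synthesize a natural-scene-like test image by shaping white noise so
# its amplitude spectrum falls off as 1/f (i.e. power falls off as 1/f^2).
rng = np.random.default_rng(0)
n = 256
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
f = np.hypot(fy, fx)
f[0, 0] = 1.0  # avoid dividing by zero at DC
img = np.real(np.fft.ifft2(np.fft.fft2(rng.standard_normal((n, n))) / f))

slope = spectral_slope(img)  # should land near -2 for 1/f^2 power
```

Content whose measured slope deviates strongly from the natural-scene expectation, despite looking normal to the eye, is exactly the kind of sequence this methodology is designed to flag.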

Our main objective in writing this paper is to describe a methodology that may be useful in deciding which UHD 4K content should be included in any video test library intended for bandwidth and video-quality planning.


An image is normally thought of as a 2-dimensional array of pixels, with each pixel represented by red, green, and blue values (RGB) or by a luma channel and two chrominance channels (for example, YUV or YCbCr).

An image can also be represented as a 2-dimensional array of spatial-frequency components.
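This equivalence can be illustrated with the discrete Fourier transform: the 2-D FFT maps the pixel array to an equal-sized array of complex spatial-frequency coefficients, and the inverse transform recovers the pixels exactly. A minimal NumPy sketch, where the 8×8 block is a made-up stand-in for a real image plane:

```python
import numpy as np

# Hypothetical 8x8 luma block, standing in for a real image plane
img = np.arange(64, dtype=float).reshape(8, 8)

# Forward 2-D transform: one complex coefficient per spatial frequency
coeffs = np.fft.fft2(img)

# coeffs[0, 0] is the DC component: the sum of all pixel values
dc_matches = np.isclose(coeffs[0, 0].real, img.sum())

# The inverse transform reconstructs the original pixel array
recon = np.real(np.fft.ifft2(coeffs))
round_trip_ok = np.allclose(recon, img)
```

No information is lost in either direction, which is what makes the frequency-domain view a valid basis for analyzing spatial detail.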