The technology to support end-to-end production of high dynamic range (HDR) and wide colour gamut (WCG) content is now proven and readily available.
Production companies are increasingly shooting in HDR, and there are also standards available for content delivery in HDR.
For broadcasters, Hybrid Log Gamma (HLG) meets the key goal of backward compatibility with existing SDR displays.
For online services, which can tolerate the need to deliver a large packet of metadata at the start of the content, HDR10 is successful.
But while new content can take advantage of HDR’s finer colour granularity leading to more depth and realism, what about all the content that has been shot over the last 50 years or so?
Can anything be done about it?
Inevitably, the consumer electronics industry has jumped in with solutions to grab the headlines and impress the browsing shopper.
LG was the first to offer in-built SDR to HDR conversion.
Optoma claims its 4K home cinema projector includes circuitry that lets viewers “enjoy near-HD content from any standard source – providing enhanced contrast, detail and colours to all SDR content”.
A2ALogix has embedded SDR to HDR conversion into its encoding algorithms.
At NAB this year there was much talk around new research, from French lab b<>com.
Its conversion algorithms are implemented in FPGA today, with multi-GPU solutions also available, albeit at a speed penalty: they run at perhaps a quarter of real time.
A software-as-a-service cloud release is promised soon.
“This technology is the gap filler the industry has been looking for in order to make HDR succeed quickly,” said Ludovic Noblet of b<>com.
“Our aim is to help the broadcast industry turn the promise of HDR into more concrete delivery to audiences.”
That raises a number of questions. If we ignore the commercial ones for the moment, there are two we should consider: technical and aesthetic.
Directors and cinematographers may have been working within the technical limitations of the time, but they made artistic choices to create the best pictures they could.
The stunning BBC drama The Night Manager reflected the vision of director Susanne Bier.
And the original 1990 Twin Peaks looked exactly the way David Lynch wanted it to look. So is it right to change that vision?
If it is, how can it be done? How can colours be created in a downstream process that were not in the original?
According to the work at b<>com, it is a very subtle process and one size definitely does not fit all. The research team did a lot of work on colour perception.
The most obvious process is to increase the luminance, to fill the expanded dynamic range.
But simply increasing the luminance brings in the Stevens effect – that contrast perception increases with luminance – and the Hunt effect – that “colourfulness” increases with luminance.
We know these are very real: in low light it can be hard to read because the words no longer jump off the page.
The b<>com solution therefore applies colour correction after the tone expansion (the luminance shift).
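The two-stage pipeline described above (tone expansion followed by a compensating colour correction) can be sketched in a few lines of Python. This is a minimal illustration, not b<>com's algorithm: the power-law expansion, the default parameters and the logarithmic desaturation curve are all assumptions, chosen only to show the shape of the approach.

```python
import numpy as np

def expand_tone(y_sdr, gamma=2.4, boost=10.0):
    """Expand display-referred SDR luminance (0..1) to an HDR range.

    A simple power-law expansion: undo the SDR gamma, then scale so a
    100-nit SDR peak maps to roughly a 1000-nit HDR peak. The curve
    and parameters are illustrative only.
    """
    linear = np.asarray(y_sdr, dtype=float) ** gamma
    return linear * boost

def correct_colour(rgb, y_old, y_new, strength=0.3):
    """Desaturate in proportion to the luminance boost, to counter the
    Hunt effect (perceived colourfulness increases with luminance)."""
    rgb = np.asarray(rgb, dtype=float)
    ratio = np.clip(y_new / max(y_old, 1e-6), 1.0, None)
    desat = 1.0 / (1.0 + strength * np.log(ratio))  # gentle roll-off
    grey = rgb.mean(axis=-1, keepdims=True)
    return grey + (rgb - grey) * desat
```

Running a fully saturated red through the correction with a tenfold luminance boost pulls it noticeably towards grey, so that the brighter HDR image keeps roughly the perceived colourfulness of the original rather than its raw chroma values.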
But at this point it becomes an artistic endeavour, which needs to be sympathetic to the original if the mood is to be preserved.
According to the influential online source Ars Technica, “the results were good: the HDR images ‘popped’ just as you would expect.”
But the author cautioned that this was on demonstration material, presumably chosen to show the process in the best light.
“I would have liked to have seen something more representative of what might actually be broadcast on TV, like an episode of House or perhaps an old movie.”
If broadcasters are to provide Ultra HD services including HDR then it is likely they will need SDR to HDR conversion.
Even if they shy away from changing favourite old programmes and movies, they are likely to want to up-convert archives used in sport, for instance.
“It is cost-effective and easy to deploy from glass to glass,” said Bertrand Guilbaud, CEO of b<>com. “To be successful with HDR, the industry needs such enablers.”