ABSTRACT

With a new high-dynamic-range (HDR) and wide-colour-gamut (WCG) standard defined in ITU-R BT.2100, display and projector manufacturers are racing to extend the reproducible colour gamut of their devices by increasing peak luminance and widening their colour primaries.

The question is: how close is close enough? The answer is increasingly important for both consumer and professional display manufacturers who strive to balance design trade-offs.

In this paper, we present “ground truth” visible colour differences from a psychophysical experiment using HDR laser cinema projectors with near-BT.2100 colour primaries and luminances up to 1000 cd/m². We present our findings, compare colour difference metrics, and propose specifying colour tolerances for HDR/WCG displays using the ΔICtCp metric.
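As an illustration of the kind of metric proposed, the sketch below computes a ΔICtCp difference between two colours. It is a minimal sketch, not the paper's exact method: it assumes the ICtCp construction of ITU-R BT.2100 (linear BT.2020 RGB → LMS → PQ → ICtCp) and the just-noticeable-difference scaling later standardised as ΔEITP in ITU-R BT.2124, whose weighting may differ in detail from the one evaluated in the full paper.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(x):
    """PQ-encode absolute linear light (cd/m^2, 0..10000)."""
    y = np.clip(np.asarray(x, dtype=float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

# ITU-R BT.2100: linear BT.2020 RGB -> LMS (with crosstalk)
RGB_TO_LMS = np.array([[1688, 2146,  262],
                       [ 683, 2951,  462],
                       [  99,  309, 3688]]) / 4096.0

# PQ-encoded L'M'S' -> ICtCp
LMS_TO_ICTCP = np.array([[ 2048,   2048,     0],
                         [ 6610, -13613,  7003],
                         [17933, -17390,  -543]]) / 4096.0

def rgb_to_ictcp(rgb):
    """Linear BT.2020 RGB in cd/m^2 -> ICtCp coordinates."""
    lms_pq = pq_encode(RGB_TO_LMS @ np.asarray(rgb, dtype=float))
    return LMS_TO_ICTCP @ lms_pq

def delta_ictcp(rgb1, rgb2):
    """Colour difference between two linear BT.2020 RGB stimuli.

    Ct is halved before the Euclidean distance and the result is
    scaled by 720 so that 1 unit is roughly one just-noticeable
    difference (the BT.2124 convention; assumed here).
    """
    i1, ct1, cp1 = rgb_to_ictcp(rgb1)
    i2, ct2, cp2 = rgb_to_ictcp(rgb2)
    return 720.0 * np.sqrt((i1 - i2)**2
                           + 0.25 * (ct1 - ct2)**2
                           + (cp1 - cp2)**2)

# Example: a small shift on a 100 cd/m^2 grey patch
print(delta_ictcp([100.0, 100.0, 100.0], [101.0, 100.0, 100.0]))
```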

INTRODUCTION 

From initial display design to consumer applications, measuring colour differences is a vital component of the imaging pipeline. Yet as the industry moves towards displays with higher dynamic range and wider, more saturated colours, no standardised method of measuring colour differences on such displays exists.

In display calibration, aside from metamerism effects, it is crucial that the specified tolerances align with human perception. Otherwise, one of two undesirable situations results: if tolerances are too large, calibrated displays will not visually match; if tolerances are unnecessarily tight, the calibration process becomes uneconomic.

The goal of this paper is to find a colour difference metric for HDR/WCG displays that balances these two risks and closely aligns with human vision.
