Wide-Gamut Displays

Emerging wide-gamut displays, enabled mainly by LED-backlit direct-view LCDs, will provide users with more utility and greater entertainment. But stumbling blocks remain for full adoption. How can display makers ensure a smooth transition to a wide-gamut world?

by Charles Poynton

THE COLOR GAMUT that we have been living with for the past several decades is, today, standardized as "Rec. 709"1 (in the video industry) or "sRGB"2 (in the computer industry). These systems share the same primaries. In the chromaticity diagram, the gamut of Rec. 709/sRGB is bounded by the triangle, depicted in Fig. 1, formed from the chromaticity coordinates of the primaries (see the sidebar, "Chronology of Display Primary Standards," for a review of primary chromaticity – or "color space" – standards).

Gamut coverage is a strong function of luminance level. Fully saturated blue, lying at the blue vertex of the triangle, can be produced up to about 7% of full white luminance, but achieving luminance higher than 7% requires adding red or green. Adding either (or both) of those reduces the color saturation of the blue, bringing the chromaticity in toward the interior of the triangle and reducing the gamut in the vicinity of blue. Similarly, fully saturated green can be produced up to a luminance of about 72%, but above that level, gamut in the green region is reduced. At white, gamut shrinks to a point. The dependence of gamut upon luminance is not easy to portray in a 2-D diagram, and display engineers tend to underestimate its importance.
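As a rough numerical cross-check of the 7% and 72% figures, here is a minimal sketch using the standard Rec. 709 relative-luminance weights for linear RGB (the weights are standard; the helper function name is mine):

```python
# Rec. 709 relative-luminance weights for linear-light R, G, B.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def relative_luminance(r, g, b):
    """Relative luminance of a linear-light Rec. 709 RGB triple (1.0 = white)."""
    return KR * r + KG * g + KB * b

# Fully saturated primaries at full drive:
print(relative_luminance(0, 0, 1))  # blue  -> 0.0722, about 7% of white
print(relative_luminance(0, 1, 0))  # green -> 0.7152, about 72% of white
```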

The Motivation for Wide Gamut

The Rec. 709/sRGB color gamut does not cover all of the interesting and important colors. The small filled squares in Fig. 1 show the colors of the 24 patches of the Gretag-Macbeth ColorChecker chart (six of those patches are gray and plot close to the middle of the diagram). Many of the Macbeth colors lie on the Rec. 709/sRGB gamut boundary. Many "real-world" colors lie somewhat outside the Rec. 709/sRGB gamut, and there is a strong desire to display such colors in both professional applications (for example, to match the gamut available in commercial offset printing and commercial photographic reproduction) and in consumer applications (because consumers like colorful pictures).

Figure 2 is taken from the work of the CIE TC8-05 committee (on gamut mapping).3 It shows the Rec. 709/sRGB gamut (the red wireframe), a gamut representative of real-world colors (the gray wireframe), and the intersection of the two (in full color). It is evident that there are plenty of real-world colors outside Rec. 709's capability. Even consumer ink-jet printers produce colors in certain regions of color space that are outside the sRGB gamut.

Some specialized closed systems have adopted primaries more highly saturated than Rec. 709/sRGB. For example, Adobe RGB 1998 is used in commercial photography and in graphic arts to enable designers to see the colors that will appear in finished materials printed by commercial color offset printing or reproduced by color-photographic techniques. Some cathode-ray-tube (CRT) displays and liquid-crystal displays (LCDs) having native Adobe RGB 1998 primaries have been commercialized for graphic-arts applications. As another example of wide-gamut primaries used in a specialized commercial application, the DCI P3 RGB primaries have been adopted for digital cinema. The DCI P3 system approximates the gamut of film.

Achieving Wide Gamut

Designing a three-channel display with an extended gamut involves moving the red, green, or blue primary – or more than one of them – outward toward the spectral locus shown in Fig. 1, thereby increasing the color saturation of the primary (or primaries) and covering a larger area of the chromaticity diagram.

For many emissive-display technologies, increasing the color saturation ("purity") of a primary can only be accomplished by reducing its spectral width. Reducing the spectral width often produces less light: There is ordinarily an engineering tradeoff between saturation and brightness. Light-emitting diodes (LEDs) intrinsically have relatively narrow spectral widths, so they do not suffer from this tradeoff; neither do lasers, which have infinitesimal spectral widths. Display products have already appeared in the marketplace that use LED and laser sources to achieve a wide color gamut.


Charles Poynton is an independent contractor specializing in physics, mathematics, and engineering of digital color-imaging systems. He is involved in engineering wide-color-gamut systems, including xvYCC. He can be reached at 139 Robert St., Toronto, Ontario, M5S 2K6 Canada; telephone 416/535-7187, e-mail: poynton@poynton.com.

In display technologies that incorporate wideband illumination sources, moving the primaries toward the spectral locus requires discarding light in the regions of the spectrum between the spectral peaks of the desired primaries. For displays that use wideband sources – xenon lamps in projectors, for example – one approach to wide gamut is to split the light into four channels and to modulate the channels separately. Such a system produces colors lying in a quadrilateral in [x, y] space, rather than a triangle. This approach, which can be extended to five or even six channels, allows higher brightness than a three-channel system with the same illumination and the same gamut. However, a four- or five-channel optical system is much more complex than a three-channel system. Although four- or five-channel systems may find use in certain niche applications, for most applications, multi-primary techniques do not seem necessary; three well-chosen primaries will suffice.

Characterizing Gamut Coverage

Many members of the display community characterize the gamut of a display system by stating the area of the CIE [x, y] chromaticity diagram that it covers, relative to the area covered by the 1953 FCC "NTSC" system. There are three serious problems with such a specification.

• First, it is obvious from Fig. 2 that gamut is properly a volume measurement, not an area measurement. Gamut depends upon luminance, a dimension that is missing from a chromaticity diagram.

• Second, gamut should be expressed in a measurement that is approximately perceptually uniform. CIE [x, y] chromaticities fail to meet that requirement: An increment in x of 0.01 is far more perceptible near blue than a 0.01 increment near green. L*a*b* coordinates are much more perceptually uniform than [x, y, Y]. Ideally, gamut volume could be expressed in cubic ΔE units (see the sketch following this list).

• Third, the "NTSC" gamut referred to by the display community was abandoned 40 years ago! Describing gamut as a "percentage of NTSC" suggests that the quoted figure is comparable to "NTSC" broadcast. One would expect "NTSC" to have 100% of "NTSC" gamut, but in fact, today's "NTSC" – more accurately described as "480i" or "576i" SDTV – has only 72% of the gamut of "NTSC"! Confusion reigns.
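To make the second point concrete, the sketch below converts CIE XYZ to L*a*b* using the standard CIE 1976 formulas, assuming a D65 reference white; the function names are my own. Gamut volume measured in this space is, in effect, a volume in cubic ΔE units.

```python
# CIE XYZ -> L*a*b* (CIE 1976), assuming a D65 reference white.
XN, YN, ZN = 0.9505, 1.0000, 1.0888   # D65 white point, Y normalized to 1

def _f(t):
    # Cube root above the standard threshold; linear segment below it.
    delta = 6 / 29
    return t ** (1 / 3) if t > delta ** 3 else t / (3 * delta ** 2) + 4 / 29

def xyz_to_lab(x, y, z):
    fx, fy, fz = _f(x / XN), _f(y / YN), _f(z / ZN)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b = 200 * (fy - fz)
    return L, a, b
```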

The "percentage of NTSC" expression is firmly entrenched; there seems little chance of correcting the error. However, display engineers need to be aware of its deficiencies. Seyno Sluyterman wrote about some of these matters in a previous issue of Information Display.4 I fully agree with his views.
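For what it is worth, the area-only "percentage of NTSC" figure is easy to reproduce. The sketch below applies the shoelace formula to published primary chromaticities (function name mine); plugging Rec. 709 in against the FCC 1953 NTSC primaries gives roughly 71%, in the neighborhood of the 72% figure quoted above – and, per the first two points, it says nothing about luminance or perceptual uniformity.

```python
# Triangle area in the CIE [x, y] diagram via the shoelace formula,
# and the "percentage of NTSC" figure criticized above.

def xy_area(primaries):
    """Area of the triangle spanned by three (x, y) chromaticities."""
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

NTSC_1953 = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]
REC_709   = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

ratio = xy_area(REC_709) / xy_area(NTSC_1953)
print(f"Rec. 709 covers {100 * ratio:.0f}% of the 1953 NTSC [x, y] area")
# -> roughly 71%
```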

Gamut Mapping

It is obvious from Fig. 2 that if real-world colors are to be represented in sRGB, or even if the full sRGB range is to be reproduced in real media, some color transformation will be necessary. For the region of color space that is common to the two media – the shared gamut – color science provides a straightforward way to transform device values so as to preserve chromaticity. However, that simple-minded approach will clip colors that lie outside the destination space, and visible clipping artifacts are likely to be introduced. To maintain image quality through the transformation, it is necessary to alter colors even within the shared gamut. Such techniques, called gamut mapping, have been used for 10 or 15 years in the graphic arts.
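As an illustration of that straightforward, chromaticity-preserving transform – and of the clipping it produces – here is a minimal sketch that builds RGB-to-XYZ matrices from primary and white chromaticities and converts a wide-gamut color into Rec. 709/sRGB. The function names are mine, and for simplicity the DCI P3 primaries are paired with a D65 white here, which is not how DCI P3 is actually specified.

```python
# Chromaticity-preserving transform between two linear-light RGB spaces,
# followed by the naive clip that damages out-of-gamut colors.
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """3x3 matrix taking linear RGB (0..1) to CIE XYZ."""
    xy = np.array(primaries, dtype=float)                     # rows: R, G, B
    cols = np.array([xy[:, 0] / xy[:, 1],                     # XYZ of each primary,
                     np.ones(3),                              # scaled to unit Y
                     (1 - xy[:, 0] - xy[:, 1]) / xy[:, 1]])
    xw, yw = white
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    scale = np.linalg.solve(cols, white_xyz)                  # R=G=B=1 reproduces white
    return cols * scale

D65 = (0.3127, 0.3290)
M_709 = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)], D65)
M_P3  = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)
# (DCI P3 as standardized uses a different white; D65 here is illustrative only.)

def p3_to_709(rgb):
    out = np.linalg.inv(M_709) @ (M_P3 @ np.asarray(rgb, float))
    return out, np.clip(out, 0.0, 1.0)                        # naive clip

exact, clipped = p3_to_709([0.0, 0.8, 0.2])                   # a saturated P3 green
print(exact, clipped)                                         # exact red component is negative
```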


Fig. 1: The famous CIE chromaticity diagram is plotted. The horseshoe-shaped curve is the spectral locus of monochromatic (narrowband) sources as they sweep from extreme blue (around 400 nm) to extreme red (around 700 nm). No purple is exhibited in that sweep: Purple requires a mixture of blue and red; the dashed line joining blue and red is called the line of purples. All colors are enclosed by those boundaries. The triangle encloses the Rec. 709/sRGB colors that are used in television and in computing. The 24 small squares plot the chromaticity coordinates of the colors of the Gretag-Macbeth ColorChecker card.

 

Creating gamut maps is a combination of science and craft. Gamut mapping for particularly important images may require human intervention to make decisions about which colors are important and which are not. Motion-picture film has a larger gamut, in most regions of color space, than Rec. 709 video. For many decades, cinematographers and colorists have performed manual gamut mapping on films being transferred to broadcast, VHS, and now DVD, HD-DVD, and Blu-ray discs.

Wide-Gamut Image Coding

In traditional RGB image coding, each component value ranges from 0 to 1 on an abstract scale; in typical fixed-point integer encodings, data ranges from 0 to some maximum value (such as 255 in an 8-bit system). Gamma correction introduces a measure of perceptual uniformity into most encoding standards; prime symbols are used to denote gamma-corrected signals (R'G'B') as opposed to linear-light signals (RGB).
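As a concrete example of such an encoding, the sketch below applies the Rec. 709 transfer function ("gamma correction") to a linear-light value and then quantizes to a full-range 8-bit integer; studio video interfaces actually reserve headroom and footroom, so the simple 0–255 mapping shown here is illustrative only.

```python
# Rec. 709 "gamma correction": linear-light value in, gamma-corrected signal out,
# followed by a simple full-range 8-bit quantization as described above.

def oetf_709(L):
    """Rec. 709 OETF: linear light L (0..1) -> gamma-corrected signal V (0..1)."""
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

def quantize_8bit(V):
    return round(255 * V)

print(oetf_709(0.18), quantize_8bit(oetf_709(0.18)))   # mid-gray: ~0.41 -> code ~104
```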

One way to encode wide-gamut imagery – and the approach taken for Adobe RGB – is to redefine the primaries. Combinations of R, G, and B between 0 and 1 now span a wider range of colors. This approach presents a compatibility issue: Legacy images will produce the wrong colors if decoded by the new scheme, and new images will produce the wrong colors if decoded by legacy systems. Image files or video streams can be augmented by metadata to identify the color space used, but legacy systems may fail to insert or interpret metadata. The primary-redefinition approach is feasible for closed, specialized systems, but is difficult to deploy widely in open systems.


Fig. 2: Real-world surface colors and sRGB/Rec. 709 colors are compared in this L*a*b* plot. The sRGB gamut is represented by the red wireframe and Pointer's real-world colors by gray. The full-color wireframe represents the intersection of the two gamuts. The L*a*b* coordinates of this plot are far more perceptually uniform than luminance and [x, y] chromaticities. This graphic is taken from the work of the CIE TC8-05 committee on gamut mapping.

 

Another way to encode wide-gamut imagery is to leave the primaries where they are and augment the coding range with negative data values and values above 1. Data values between 0 and 1 continue to be associated with their standard colors; wide-gamut colors have one or two component values below 0 or above 1. For example, a highly saturated cyan (turquoise) color can be encoded in a Rec. 709-derived wide-gamut system by combining positive green and blue values with a negative red value that effectively "sucks some red out," pushing the chromaticity outside the Rec. 709 triangle.
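The arithmetic is easy to demonstrate with the well-known XYZ-to-linear-Rec. 709 matrix. The cyan chromaticity used below, (0.15, 0.30), is a hypothetical example point chosen just outside the blue-green edge of the Rec. 709 triangle; converting it yields a negative red component, exactly as described.

```python
# A saturated cyan just outside the Rec. 709 triangle, expressed as linear
# Rec. 709 RGB: the red component comes out negative.
import numpy as np

# Standard XYZ -> linear Rec. 709/sRGB (D65) matrix.
XYZ_TO_709 = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])

def xyY_to_XYZ(x, y, Y):
    return np.array([x * Y / y, Y, (1 - x - y) * Y / y])

cyan_xyz = xyY_to_XYZ(0.15, 0.30, 0.40)   # chromaticity outside the blue-green edge
print(XYZ_TO_709 @ cyan_xyz)              # approx [-0.33, 0.59, 0.70]: negative red
```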

A few attempts have been made over the past decade or so to standardize signal excursions below 0 and above 1 to encode a wide color gamut. A decade ago, ITU-R adopted Rec. 1361 for studio video; it languished unused. The PIMA (now I3A) organization standardized e-sRGB for wide gamut in graphic arts; it similarly languished unused.

 

Chronology of Display Primary Standards
1953: NTSC color television was standardized by the FCC with quite a wide gamut.

1954–1963: Consumer-electronics (CE) manufacturers found that green phosphors meeting the FCC specification were slow and dim. Slowly, they migrated to green phosphors having higher speed and more brightness, but less color saturation. The FCC standard was not changed – it has not been changed to this day.

1964: PAL color television was standardized in Europe with primaries representing NTSC practice at the time, including the somewhat desaturated green. Red and blue were altered slightly from the NTSC specifications. This standard, now controlled by the European Broadcasting Union (EBU), remains in use today for SDTV in Europe.

1970–1983: North American broadcasters faced increasing pressure to find a studio standard that would display pictures as they were seen in consumers' premises. Studio monitors from a company called Conrac became available with phosphors close to those of consumer receivers of the time. Broadcasters unofficially adopted those monitors.

1984: Two decades after adoption of the PAL standard, SMPTE finally adopted a studio standard (RP 145) representative of studio and consumer practice. It colloquially became known as "SMPTE-C," the C standing for Conrac.

1988: SMPTE incorporated the SMPTE C primaries into the SMPTE 240M standard for developmental HDTV that was then called 1125/60 HDTV (now denoted 1035i30).

1990: The CCIR, now called ITU-R, adopted Recommendation BT.709 ("Rec. 709") primaries for HDTV. Rec. 709's red and blue chromaticities were those of the EBU standard that codified the original 1964 PAL system; green was agreed upon as the arithmetic average, rounded to two digits, of EBU green and SMPTE C green – truly an international compromise.

1994: The computer industry, led by Microsoft and Hewlett-Packard, adopted the sRGB standard, incorporating the Rec. 709 primaries. Moderate gamut for RGB was thereby unified around the world and across entertainment and computing. Soon afterward, the emergent digital-still-camera industry incorporated sRGB into their Exif standard.

1998: Adobe promulgated the Adobe RGB 1998 standard. Adobe attempted to include SMPTE 240M primaries in Photoshop version 5, but apparently a transcription or cut-and-paste error in coding resulted in a garbled set of chromaticities being labeled SMPTE 240M. Adobe's error resulted in a system that combined Rec. 709 red and blue with the long-disused FCC 1953 NTSC green. By the time the error was discovered, there was some likelihood that users had already used the erroneous settings in stored documents; a bug fix that simply changed the primaries was precluded. Instead, a bug fix was issued that left the primaries alone but changed the name in the user interface from "SMPTE 240M" to "Adobe RGB 1998." Adobe RGB 1998 was born. Fortunately, by 1998, a saturated green primary was practical and useful.

2000–2002: Many graphic-arts users adopted Adobe RGB. Monitors with native Adobe RGB primaries became commercially available. Adobe RGB 1998 color encoding was implemented in professional digital still cameras (D-SLRs).

2002: The digital-cinema community developed the DCI P3 RGB standard, which approximates the gamut of motion-picture film. DCI P3 RGB was (and continues to be) deployed in commercial digital-cinema projectors.

2004: Sony proposed the xvYCC scheme to enable carriage of wide-gamut video material through video Y′CBCR signal paths in broadcast, DVDs, etc.

2006: IEC adopted the xvYCC standard, denoted IEC 61966-2-4.

2006–2007: Broadcasters worldwide continued to "grade" and approve program material on CRTs; however, fewer and fewer consumers experienced their programs on CRTs. Broadcasters began to realize that new studio standards reflecting a diversity of display technologies would become necessary.

2007: Sony introduced a professional studio LCD monitor, having an LED backlight, with native DCI P3 primaries and built-in colorimetric transforms for all commercially important video standards.

Making use of a wide-gamut encoding requires that image-capture devices and receivers alike be equipped with appropriate color transforms. The idea is to standardize the interchange space: Devices at either end of the distribution chain can have different gamuts, provided that local color transforms are implemented.

Wide Gamut for Consumer Video: xvYCC

Entertainment imagery – such as computer games; movies on DVD, HD-DVD, and Blu-ray; and even broadcast images – is likely to be the driving force that enables wide-gamut color to take hold in high-volume markets. The xvYCC standard is a wide-gamut encoding intended for such applications.

Video cameras capture RGB components. However, since the analog era, signal processing in the studio, in recording, and in transmission has been based upon luma (representative of lightness) and two color-difference (chroma) components derived from R′G′B′. The color-difference components are band-limited to reduce color detail, to which the eye is less sensitive than it is to luma detail. In digital systems, the components are denoted Y′CBCR. (The lightness component is called luma; unlike luminance, it is not proportional to intensity.)

Figure 3, taken from my book Digital Video and HDTV Algorithms and Interfaces,5 depicts the result of "matrixing" R′G′B′ into Y′CBCR. The R′G′B′ prism occupies the central region of the Y′CBCR cube. Y′ occupies the range 0–1 (in abstract terms); the CB and CR signals occupy the abstract range ±0.5.

It is clear from Fig. 3 that the R′G′B′ prism occupies only one-quarter of the volume of Y′CBCR space. However, CB and CR must each have the full ±0.5 range available because component values across that whole range are required to encode some colors at some luma levels.

The approach of encoding wide gamut through excursions of R′G′B′ below 0 and above 1 is easily adapted to Y′CBCR: Wide-gamut colors correspond to regions of Y′CBCR space that are outside the R′G′B′ prism, but still within Y′CBCR limits. Wide-gamut signals encoded in this manner can be conveyed through today's SDTV and HDTV signal-processing infrastructure, including recording, broadcasting, and even MPEG and H.264 compression.
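A minimal sketch of that adaptation, assuming the Rec. 709 (HD) luma coefficients: the example triple below, with a negative R′, still lands within the abstract Y′CBCR ranges, and the determinant of the transform confirms the one-quarter volume figure mentioned above. The function and variable names are mine.

```python
# Rec. 709 luma/color-difference "matrixing" on abstract-range signals:
# Y' in 0..1, CB and CR in -0.5..+0.5.
import numpy as np

KR, KB = 0.2126, 0.0722            # Rec. 709 luma coefficients (KG = 1 - KR - KB)

def rgb_to_ycbcr(rp, gp, bp):
    yp = KR * rp + (1 - KR - KB) * gp + KB * bp
    cb = (bp - yp) / (2 * (1 - KB))
    cr = (rp - yp) / (2 * (1 - KR))
    return yp, cb, cr

# This particular wide-gamut triple (negative R') stays within the coding ranges,
# so it survives a Y'CBCR signal path.
print(rgb_to_ycbcr(-0.1, 0.8, 0.7))    # approx (0.60, 0.053, -0.445)

# The R'G'B' unit cube maps to a solid whose volume is |det M| of the transform;
# for these coefficients it is about 0.25 – the "one-quarter" noted above.
M = np.array([[KR, 1 - KR - KB, KB],
              [-KR / (2 * (1 - KB)), -(1 - KR - KB) / (2 * (1 - KB)), 0.5],
              [0.5, -(1 - KR - KB) / (2 * (1 - KR)), -KB / (2 * (1 - KR))]])
print(abs(np.linalg.det(M)))           # approx 0.25
```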

A naive application of the scheme described here would cause clipping if new, wide-gamut material were displayed on legacy receivers. However, it would be completely impractical to develop a new, parallel distribution infrastructure specialized for wide-gamut content; despite the clipping risk, this "extended video" (xv) approach is a reasonably good starting point for wide gamut in consumer entertainment.

Sony proposed the scheme to the IEC (apparently without soliciting contributions from content creators), and xvYCC was standardized.6 Sony and several other manufacturers have demonstrated consumer-electronics devices with xvYCC capability, and a few xvYCC models are already in the consumer marketplace.

The recently adopted HDMI 1.3 standard for digital interface to displays has provision for xvYCC-encoded signals. However, no gamut-mapping algorithms are standardized, and no guidelines for gamut mapping are provided. HDMI 1.3 allows – perhaps even encourages – gamut mapping at the display, but in my view, gamut mapping should be performed in a receiver or player, not in the display itself.


Fig. 3: The R′G′B′ unit cube, when subject to the luma-chroma transform, forms a prism in Y′CBCR space. The transform enables subsequent subsampling (bandwidth reduction) of the chroma components; the loss in color detail is unnoticed by the viewer.

Conclusion

Wide-gamut display hardware has arrived. We can expect rapid deployment of these displays in specialized professional applications. Introducing wide-gamut displays to specialized, closed industries presents challenges, but these challenges are tractable.

Consumer-electronics manufacturers have an incentive to extend the gamut of consumer display devices because more colorful displays will have an advantage in the marketplace! However, introducing wide gamut to an open-systems environment is an order of magnitude more difficult than for a closed system. The challenge is to decide where and how gamut mapping should be performed.

In entertainment film and video, gamut-mapping decisions are made very early in the production process. Color decisions are approved on monitors having a reasonably close approximation of eventual consumer gamut. Early commercially available xvYCC equipment achieves wide gamut by mapping legacy colors into it; gamut mapping lies outside the control of the creative community. Proprietary gamut-mapping strategies are built into the consumer-electronics equipment, and different manufacturers are likely to produce their own gamut maps. Such developments will almost certainly be viewed as disastrous by the creative community. I call this approach wild gamut, not wide gamut!

The consumer wide-gamut marketplace will be enabled by professionally created wide-gamut imagery. An end-to-end gamut-mapping strategy needs to be established to enable the xvYCC standard to display the colors that were intended by the content creators. Without such a strategy, content creators will lack confidence in display of xvYCC-encoded material: They can be expected to shun xvYCC encoding – and refuse to originate true wide-gamut material – until they believe that xvYCC equipment will display their intended colors reasonably accurately.

References

1. ITU-R Recommendation BT.709-5, Basic parameter values for the HDTV standard for the studio and for international program exchange.
2. IEC 61966-2-1, Multimedia systems and equipment – Colour measurement and management – Part 2-1: Colour management – Default RGB colour space – sRGB.
3. CIE TC8-05 Committee on Gamut Mapping, Color Encoding Criteria, available at <www.colour.org/tc8-05/Docs/colorspace/Metrics06.pdf>.
4. S. Sluyterman, "The NTSC Color Triangle Is Obsolete, but No One Seems to Know," Information Display 22, No. 5, 84–85 (May 2006).
5. C. Poynton, Digital Video and HDTV Algorithms and Interfaces (Morgan Kaufmann, San Francisco, CA, 2003).
6. IEC 61966-2-4, Multimedia systems and equipment – Colour measurement and management – Part 2-4: Colour management – Extended-gamut YCC colour space for video applications – xvYCC.