This Note is a gap analysis document. It identifies the next steps for enabling Wide Color Gamut (WCG) and High Dynamic Range (HDR) on the Open Web Platform.

Introduction

The initial commercial application of Color Science (principally colorimetry, rather than color appearance modelling) to the reproduction of color was concerned with surface colors such as painted or printed objects, lit by a single illuminant. The achievable luminance range was thus constrained to the luminance of a perfect diffuse reflector (practically, an unprinted sheet of good paper) at the high end, and the densest printed or painted black that could be achieved without the paper tearing or deforming, at the darkest end. This produced luminance ranges from as low as 20:1 to as high as 90:1 (Fairchild, p.399 [[Fairchild-CAM]]). The achievable Chroma range was also limited by the vibrancy of mass-produced paints and inks, and the prohibitive cost of using additional spot colors to extend the gamut. Self-luminous displays, in that commercial environment, were primarily used at low luminances for soft proofing of an eventual printed product, and were intended to replicate the appearance of a viewing booth.

Over time, self-luminous displays were seen as a worthy color management target in their own right: for computer displays, for the display of digital photography, for digital cinema, and for the consumption of media such as television and movies in the home. The luminance range increased: media consumed in near-dark environments such as the cinema could have much deeper blacks, while media consumed in dim to normal viewing environments enjoyed an increased peak white luminance of 200 to 300 cd/m2. The darkest blacks in such environments were constrained by viewing flare, which is why the sRGB standard mandated a viewing flare of 5%, typical for glossy glass CRT displays of the time; coupled with a peak luminance of only 80 cd/m2, the luminance range was still relatively modest. The introduction of matte LCD and OLED screens, and the commercial success of HDTV, increased the dynamic range somewhat, but it was still comfortably encoded in 8 bits per gamma-corrected component. This is referred to as Standard Dynamic Range (SDR). The achievable Chroma range for sRGB and Rec.709 HDTV, although extending beyond that of printed material at the primaries, was also relatively modest. Wider gamut displays were available professionally, covering most of the Adobe RGB (1998) color space, but were typically not found in a home or office environment.

Commercial availability of displays covering the Display P3 gamut (a derivative of DCI P3 used for digital cinema, but altered to account for a non-dark viewing environment, with an sRGB transfer curve, and D65 white point) meant that Wide Color Gamut (WCG) displays became commonplace on laptops, external monitors, mobile phones, and even watches. WCG requires a modest increase in the number of bits per gamma-corrected component, from 8 to 10 or 12 (for the widest gamut in common use for content delivery, Rec.2020).

The deployment of 4K and then 8K television, accompanied by digital content delivery, brought not only a similarly increased color gamut to the media space, but also a greatly increased dynamic range of 4000:1 or more. Increased phosphor efficiency and purity, together with bright and modulatable backlights, brought High Dynamic Range (HDR) into widespread use. Computers, however, remained limited to SDR for the most part.

The human visual system can, with appropriate adaptation, function over an enormous luminance range — from starlight and moonlight at 0.01 cd/m2 and 0.1 cd/m2 respectively, through the dim lighting at dawn and dusk (10 cd/m2), office lighting (100 cd/m2), modern displays (800 cd/m2), overcast daylight (1,000 cd/m2), bright daylight (10,000 cd/m2) and full direct sunlight (100,000 cd/m2). However, the full range cannot be perceived simultaneously within a single scene.

There are two main systems defined for HDR video: Hybrid Log Gamma (HLG), developed by BBC and NHK, and Dolby Perceptual Quantizer (PQ). While improvement in video quality has driven the innovation of HDR, support for content on the web more generally, e.g., for static images, the <canvas> element, and in CSS in general, is still needed.

TODO: Add a brief description of the PQ and HLG approaches, including: use of metadata, absolute vs relative brightness, proprietary vs open standard, etc.

The BBC has published a frequently-asked questions document [[hdr-hlg-faq]] that gives a high level introduction to HDR, and the PQ and HLG solutions.

Fredrik Hubinette from Google has written a useful document [[hdr-chrome]] that discusses the issues with presenting, and in particular compositing, SDR and HDR content. It considers both PQ and HLG, and was presented at TPAC 2017 (see the minutes of the TPAC 2017 meeting).

On the web, SDR and HDR content (the latter using either HLG or PQ) are expected to coexist, potentially within the same page, as authors can include arbitrary content in their web pages. An example is a page that contains multiple embedded videos, or their poster images. This raises the question of how to composite content with different color spaces and dynamic range encodings.

Goals

Support for HDR and WCG on the web is important to allow color and luminance matching between HDR video content and surrounding or overlaid graphic and textual content in web pages.

Some specific goals include:

There are a number of specifications potentially impacted by HDR. One purpose of this Note is to identify all specifications that may be affected, so that we can review them and determine any changes needed.

Specifying Colors in Web Pages

CSS Color Module

CSS defines several ways of specifying the colors to use in web pages, in the CSS Color Module specifications. The current Recommendation is Level 3 [[css-color-3]], and its successor, Level 4, is becoming stable and starting to be implemented [[css-color-4]]. The various methods in Level 4 cover WCG but not HDR, and are:

  • named colors
  • hex colors, and the rgb() and rgba() functions
  • hsl() and hsla()
  • hwb() (*)
  • lab() and lch() (*)
  • the color() function, with predefined color spaces such as display-p3, a98-rgb, prophoto-rgb, and rec2020 (*)

Items marked (*) are new in Level 4. rgb(), named colors, HSL and HWB all resolve to sRGB. The others are WCG. To date, all of these are SDR color spaces.
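
For orientation, here is a minimal sketch of these forms used from script, assuming a browser that implements CSS Color 4 parsing; the specific color values are arbitrary:

    // Illustrative only; assumes the browser parses CSS Color 4 syntax.
    const swatch = document.createElement('div');

    // Level 3 forms resolve to sRGB.
    swatch.style.color = 'rgb(255, 0, 0)';

    // Level 4 WCG forms: lab()/lch(), and color() with a predefined space.
    swatch.style.borderColor = 'lch(52% 100 40)';               // high-chroma red, likely outside sRGB
    swatch.style.backgroundColor = 'color(display-p3 1 0 0)';   // fully saturated P3 red

    // Feature-test before relying on the newer syntax.
    if (!CSS.supports('color', 'color(display-p3 1 0 0)')) {
      swatch.style.backgroundColor = 'rgb(255, 0, 0)'; // nearest sRGB fallback
    }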

Section 12 in [[css-color-4]] is titled "Working Color Space" and is currently empty. It is intended to cover compositing of multiple color spaces. There is as yet no consensus on whether this should be a per-document or a per-element property, nor on whether a single value is appropriate for all operations. We need to define how compositing should work when HLG, PQ, and SDR content are all present. The [[hdr-chrome]] discussion document provides useful input. This will require defining where black, media/paper white, and peak whites (full-screen and small-area highlights) map in the various spaces.

The draft CSS Color Module Level 4 [[css-color-4]] adds WCG support. Rec. 2020 is covered, but only for SDR. The range of CIE Lightness (in Lab and LCH) is not constrained to 100, and a figure of 400 is mentioned because Lab is known to be a fair model for up to four times the luminance of diffuse white (Fairchild, pp.403-413 [[Fairchild-CAM]]); however, no meaning is yet ascribed to values greater than 100.

Rec. 2100, with both the HLG and Dolby PQ transfer characteristics, is added to CSS in an Unofficial Draft [[css-color-hdr]]. It also adds Jzazbz (which uses the PQ transfer characteristic), JzCzhz (the polar form), and Dolby ICtCp. Lastly, it adds a (normative) section "Compositing SDR and HDR content".
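
As a sketch only, the draft syntax looks like the following; it is not implemented in browsers, the component values are arbitrary signal values, and the syntax may change:

    // Draft [[css-color-hdr]] syntax; subject to change.
    // Rec. 2100 with the PQ transfer characteristic (signal values in 0..1).
    document.body.style.backgroundColor = 'color(rec2100-pq 0.34 0.34 0.34)';

    // Rec. 2100 with the HLG transfer characteristic.
    document.body.style.color = 'color(rec2100-hlg 0.75 0.75 0.75)';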

Other CSS specifications assume that all colors are sRGB. For example, CSS Images (which defines linear, radial, and conic gradients) assumes that all color stops are sRGB colors and mandates that interpolation happens on premultiplied, gamma-encoded sRGB values.
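
A short illustration of the consequence, as a sketch: even if the two stops below are wide-gamut colors, a conforming implementation computes the ramp between them as premultiplied, gamma-encoded sRGB.

    // The stop colors are Display P3, but the gradient ramp between them
    // is interpolated in premultiplied, gamma-encoded sRGB per CSS Images.
    const banner = document.createElement('div');
    banner.style.background =
      'linear-gradient(to right, color(display-p3 0 1 0), color(display-p3 1 0 1))';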

Open Issues

There are a few open issues that relate to HDR and WCG support:

Canvas API

The HTML <canvas> element provides web applications with a resolution-dependent bitmap canvas, which can be used for rendering graphs, game graphics, art, or other visual images on the fly. The <canvas> element supports either 2D or WebGL rendering.

The 2D API [[canvas-2d]] defines primitive operations for drawing graphics into a <canvas> element, such as drawing lines, shapes, and text.

The "Color managing canvas contents" proposal [[canvas-colorspace]] attempts to address the following use cases:

  • Content displayed through a <canvas> element should be color managed in order to minimize differences in appearance across browsers and display devices. Improving color fidelity matters a lot for artistic uses (e.g., photo and paint apps) and for e-commerce (product presentation).
  • Canvases should be able to take advantage of the full color gamut and dynamic range of the display device.
  • Creative apps that do image manipulation generally prefer compositing, filtering and interpolation calculations to be performed in a linear color space.

The proposal [[canvas-colorspace]] does not currently make <canvas> HDR capable; it adds WCG support. See the Blink Intent to Ship, the TAG review, and this issue against the HTML standard.
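
A minimal sketch of the proposal's shape, assuming an implementation; display-p3 here is one of the CSS Color 4 predefined spaces:

    // Request a Display P3 backing store; implementations without the
    // proposal ignore the option and fall back to sRGB.
    const canvas = document.createElement('canvas');
    const ctx = canvas.getContext('2d', { colorSpace: 'display-p3' });
    if (ctx) {
      ctx.fillStyle = 'color(display-p3 1 0 0)'; // stored without clipping to sRGB
      ctx.fillRect(0, 0, canvas.width, canvas.height);

      // Pixels can be read back in a requested color space.
      const pixel = ctx.getImageData(0, 0, 1, 1, { colorSpace: 'display-p3' });
      console.log(pixel.data);
    }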

Open Issues

Merging of a recent pull request indicates the canvas-colorspace proposal might move forward again. In particular, color spaces are now defined relative to CSS Color 4 predefined RGB spaces, instead of being assumed to be sRGB.

Filter Effects

The CSS Filter Effects Module [[filter-effects]], which generalizes to CSS and HTML the filter effects already available in SVG, performs all operations in the linear-light sRGB color space. For example, the luminanceToAlpha definition hard-codes the matrix operation for converting linear-light sRGB to luminance. Operations are defined in terms of an equivalent SVG filter function.
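
For reference, a sketch of that hard-coded weighting: the constants are the Rec. 709-derived luminance coefficients applied to linear-light sRGB (see the specification for the exact matrix); extending the operation to another space such as display-p3 would need a different matrix.

    // Relative luminance of a linear-light sRGB color, as hard-coded by luminanceToAlpha.
    function linearSRGBLuminance(r: number, g: number, b: number): number {
      // r, g, b are linear-light components in the range 0..1.
      return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // A saturated linear-light green contributes far more luminance than blue.
    console.log(linearSRGBLuminance(0, 1, 0)); // 0.7152
    console.log(linearSRGBLuminance(0, 0, 1)); // 0.0722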

It would be desirable to extend the filter operations to at least other RGB spaces, such as display-p3; provided this can be done in an opt-in manner, without Web-incompatible changes to existing content. Currently, the SVG WG is focussed on documenting existing implementations rather than extending SVG.

Open Issues

HTML color input

The HTML input element has a color type, which allows the user to pick or otherwise indicate a color. The picker is constrained to produce what HTML calls a 'simple color' such as '#123456'. This is limited to sRGB, and 8 bits per component precision.

A simple color consists of three 8-bit numbers in the range 0..255, representing the red, green, and blue components of the color respectively, in the sRGB color space.
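
A minimal sketch of consuming such a value from script; the parsing helper is illustrative, not part of any API:

    // input.value is always '#' followed by six lowercase hex digits:
    // sRGB, 8 bits per component.
    const picker = document.querySelector<HTMLInputElement>('input[type="color"]')!;

    function parseSimpleColor(value: string): [number, number, number] {
      return [
        parseInt(value.slice(1, 3), 16),
        parseInt(value.slice(3, 5), 16),
        parseInt(value.slice(5, 7), 16),
      ];
    }

    picker.addEventListener('input', () => {
      const [r, g, b] = parseSimpleColor(picker.value); // '#123456' -> [18, 52, 86]
      console.log(r, g, b);
    });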

Open Issues

Device Capabilities and APIs

Media Capabilities API

Media Capabilities [[media-capabilities]] is a draft specification being developed by the Media Working Group. It intends to provide APIs that allow websites to make an optimal decision when picking audiovisual media content for the user. The APIs will expose information about the decoding and encoding capabilities for a given format, but also about output capabilities, to find the best match based on the device's display.

The API is a replacement for the existing canPlayType() function in HTML [canplaytype], and isTypeSupported() in Media Source Extensions [[media-source]].
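
A sketch of the kind of query the API enables; the dictionary members shown for HDR (hdrMetadataType, colorGamut, transferFunction) are those in the current draft and may change:

    // Ask whether a UHD HDR10-style stream would play smoothly and power-efficiently.
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: 'media-source',
      video: {
        contentType: 'video/mp4; codecs="hev1.2.4.L153.B0"', // illustrative codec string
        width: 3840,
        height: 2160,
        bitrate: 25000000,
        framerate: 60,
        hdrMetadataType: 'smpteSt2086',
        colorGamut: 'rec2020',
        transferFunction: 'pq',
      },
    });
    console.log(info.supported, info.smooth, info.powerEfficient);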

The Introduction document [media-capabilities-intro] gives a good explanation of the problem that this API is trying to solve.

Open Issues

There are a number of open issues where the Color on the Web CG could usefully provide input.

CSS Media Queries

Media Queries 4 [[mediaqueries-4]] allows authors to test and query values or features of the browser or display device, independent of the document being rendered. Media queries are used in the CSS @media rule to conditionally apply styles to a document, and in various other contexts and languages, such as HTML and JavaScript. Section 6.4 describes the color-gamut feature, which describes the approximate range of colors that are supported by the browser and output device.

Media Queries 5 [[mediaqueries-5]] also allows authors to test and query the ambient light level (dim | normal | washed), although the assumption seems to be that dim means "excessive contrast and brightness would be distracting or uncomfortable" rather than "standard viewing conditions for HDR content". The enumeration is, deliberately, not tied to specific light levels, although ambient light sensors are mentioned.

Media Queries 5 also adds dynamic-range and video-dynamic-range features, both with the values standard and high, for SDR and HDR respectively. The two queries can return different results in the case of, for example, a 4K TV with HDR video plane and SDR content overlay plane.
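
These features can be used in @media rules or queried from script, as in this sketch:

    // Each of these can equally be written as a CSS @media rule.
    const wideGamut = matchMedia('(color-gamut: p3)').matches;           // Media Queries 4
    const hdrPage   = matchMedia('(dynamic-range: high)').matches;       // Media Queries 5
    const hdrVideo  = matchMedia('(video-dynamic-range: high)').matches; // may differ from hdrPage
    console.log({ wideGamut, hdrPage, hdrVideo });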

Open Issues

CSSOM View Module

The CSS Object Model View Module [[cssom-view-1]] has a Screen interface with a colorDepth attribute, which typically returns the value 24 for an 8 bits-per-component display. An example in the specification combines colorDepth (with a value of 48!) with a Media Query for the p3 gamut and unspecified "other checks" to probe for an HDR screen.
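
The specification's example amounts to roughly the following sketch; the unspecified "other checks" are left as a comment:

    // Rough probe for a deep-color, wide-gamut (possibly HDR) screen, along the
    // lines of the CSSOM View example. Note that many implementations report
    // colorDepth as 24 regardless of the actual panel depth.
    const deepColor = screen.colorDepth >= 48;
    const wideGamut = matchMedia('(color-gamut: p3)').matches;
    const maybeHDR  = deepColor && wideGamut; // plus unspecified "other checks"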

Open Issues

No open WCG or HDR issues. However, CSSWG decided that the color-related aspects of the CSS OM should migrate from that specification to CSS Color 4. That migration is in progress, with CSS Color 4 defining serialization of WCG color spaces.

CSS Typed OM Level 1

The CSS Typed OM Level 1 [[css-typed-om]] is an extensible API for the CSSOM that reduces the burden of manipulating a CSS property's value via string manipulation. It does so by exposing CSS values as typed JavaScript objects rather than strings. There is an explainer document.
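
A small sketch of the Typed OM as currently specified: numeric values come back as typed objects, whereas a color is only exposed as a generic CSSStyleValue with no access to its color space or channels, which is what the discussion noted below aims to improve.

    // Numeric properties are exposed as typed objects rather than strings.
    document.body.attributeStyleMap.set('opacity', CSS.number(0.5));
    document.body.attributeStyleMap.set('padding-left', CSS.px(16));

    // A color, by contrast, currently round-trips as an untyped CSSStyleValue.
    const color = document.body.computedStyleMap().get('color');
    console.log(color?.toString()); // e.g. "rgb(0, 0, 0)"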

Open Issues

There is current discussion on extending the CSS Typed OM for WCG and perhaps HDR color. Helper functions to calculate luminance contrast (for WCAG) are also in scope.

There is initial prototyping work for an object-oriented, typed, CSS Color model.

Timed Text

Timed Text Markup Language [[TTML2]] has a tts:color attribute whose type is defined to be a value in sRGB, with 8-bit per-component precision. Subtitles in TTML are thus SDR.

However, subtitles are composited onto video content, and TTML does provide a (non-normative) Appendix Q, High Dynamic Range Compositing, which gives equations for converting the sRGB values into either PQ or HLG for compositing onto HDR video.

Realizing that sRGB is defined to have a peak luminance of 80 cd/m2 in theory, and a somewhat higher value in practice, TTML also provides a tts:luminanceGain attribute to boost that value for systems (such as PQ) which require an absolute luminance value.
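
For orientation only, the following sketch shows the general idea rather than the normative Appendix Q equations: a relative sRGB luminance is scaled to an absolute value (80 cd/m2 boosted by tts:luminanceGain) and then encoded with the SMPTE ST 2084 (PQ) inverse EOTF.

    // SMPTE ST 2084 constants.
    const m1 = 0.1593017578125, m2 = 78.84375;
    const c1 = 0.8359375, c2 = 18.8515625, c3 = 18.6875;

    // Absolute luminance in cd/m2 -> PQ signal value in 0..1.
    function pqEncode(cdm2: number): number {
      const y = Math.min(cdm2, 10000) / 10000;
      const yp = Math.pow(y, m1);
      return Math.pow((c1 + c2 * yp) / (1 + c3 * yp), m2);
    }

    // sRGB reference white (80 cd/m2) boosted by a luminanceGain of 2.5
    // lands at 200 cd/m2 on the absolute PQ scale.
    console.log(pqEncode(80 * 2.5)); // ≈ 0.58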

Open Issues

Static Image Formats

Requirements

A static image format is required to store graphics (both camera-captured and artificially generated). Suggested use cases include, but are not limited to:

Any such graphics format would need to:

Knowledge of the image primaries allows correct display of color on all displays, including displays which would need to apply a gamut reduction to the image.

ICC profiles store data which allows captured data to be mapped to and from an all-encompassing color space, and from that color space to a display device. Version 4 supports only a D50-based color space and lookup-table based transforms. Version 5 (iccMAX) allows different color working spaces and an algorithmic calculator for the transforms. Version 5 is not well supported.

Open questions: How can we define an ICC / ICCMax profile for HLG? If we define an ICC / ICCMax profile for HLG, how can we test it? Lars Borg has written an HLG profile.

Netflix have published a blog post [[netflix-hdr]] that describes their approach for static images.

HDR-oriented image formats comparison

Candidates

PNG

Portable Network Graphics (PNG) can store graphics using 16 bits per channel, and it is possible to store the primaries and white point. Two methods of storing the transfer function exist: the storage of a value for gamma, and the storage of an ICC profile. The first method, storage of gamma, would allow a backwards-compatible image to be displayed on non-HDR monitors. The second would allow the correct display of the image on an HDR display. A new, HDR-specific tag (with values "HLG" and "PQ") could be a third option.

The Timed Text Working Group has published a draft specification [[png-hdr-pq]] that extends the PNG format to include an iCCP chunk with the name "ITUR_2100_PQ_FULL". The chunk contains a specific ICC profile, linked from the specification; this profile is not actually used, and only its presence serves to enable external rendering of the PQ-encoded image by the platform. See this discussion on the Color on the Web CG mailing list.

AVIF

There is an open Mozilla bug, "AVIF (AV1 Image File Format): ICC profile support".

WebGL

TODO: (might be moot, see WebGPU)

Open Issues

WebGPU

The GPU on the Web Working Group develops the WebGPU API [[webgpu]], a more modern successor to WebGL. The draft spec does not mention color spaces. It has an RGBA color object, which consists of four double-precision floats. It therefore has the potential to cover HDR as well as WCG, but the meaning of the values — their range, transfer function, and color space — is currently undefined. However, a passing mention of the naming convention for texture formats seems to indicate that both gamma-corrected and linear sRGB are supported, with on-the-fly conversion.
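
A sketch of the two points mentioned above; the literal values are arbitrary, and the texture format names are from the current draft:

    // GPUColor: four doubles with no declared range, transfer function, or color space.
    const clearValue = { r: 0.2, g: 0.4, b: 0.8, a: 1.0 };

    // The texture format naming convention distinguishes gamma-encoded from linear
    // storage; the '-srgb' variants convert to and from linear light on write and read.
    const gammaEncodedFormat = 'bgra8unorm-srgb';
    const linearFormat = 'bgra8unorm';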

Open Issues

Recommendations

TODO: ...