Well-deployed technologies

Few media-based services can usefully function without rendering audio or video content; the HTML5 specification provides widely deployed support for this essential feature. Video content can be rendered in any Web page via the <video> element.

Likewise, audio content can be rendered in any Web page via the <audio> element.
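For instance, a page can offer video and audio playback declaratively, listing multiple sources so that the browser picks a format it supports (file names below are placeholders):

```html
<!-- Hypothetical file names; any supported container/codec combination works -->
<video controls width="640" poster="poster.jpg">
  <source src="movie.webm" type="video/webm">
  <source src="movie.mp4" type="video/mp4">
  Fallback text shown by browsers without video support.
</video>

<audio controls>
  <source src="clip.ogg" type="audio/ogg">
  <source src="clip.mp3" type="audio/mpeg">
</audio>
```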

Beyond the declarative approach enabled by the <audio> element, the Web Audio API provides a full-fledged audio processing API, which includes support for low-latency playback of audio content.
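A minimal sketch of low-latency playback with the Web Audio API; the URL is a placeholder and error handling is omitted:

```javascript
// Fetch, decode and play an audio clip with the Web Audio API (browser-only).
async function playClip(url, audioContext) {
  const ctx = audioContext || new AudioContext();
  const response = await fetch(url);
  const encoded = await response.arrayBuffer();
  // Decode the compressed audio into raw PCM samples
  const buffer = await ctx.decodeAudioData(encoded);
  // An AudioBufferSourceNode is a one-shot player for a decoded buffer
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start(); // starts playback with minimal latency
  return source;
}
```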

WebVTT is a file format for captions and subtitles. The specification is still a Working Draft, but the format is already supported at various levels among browsers, making it possible to render text tracks through a <video> element. The Timed Text Markup Language (TTML) specification provides a richer language for describing timed text. It is used both as an interchange format among authoring systems and for the delivery of subtitles and captions worldwide, in particular through profiles such as the IMSC1 (Internet Media Subtitles and Captions) profile. Some browsers may not support IMSC1 natively, but Web applications can still take advantage of IMSC1 through libraries such as imscJS, a complete implementation of the IMSC1 profile in JavaScript that renders IMSC1 documents to HTML5.
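A WebVTT file is a plain-text list of timed cues; a minimal, hypothetical "captions.vtt", referenced from a video with <track kind="captions" src="captions.vtt" srclang="en">, could look like:

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Hello, and welcome to the demo.

00:00:04.500 --> 00:00:08.000
The browser renders these cues over the video.
```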

For the distribution of media whose content needs specific protection from copy, Encrypted Media Extensions (EME) enables Web applications to render encrypted media streams based on Content Decryption Modules (CDM).
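A sketch of the typical EME handshake, assuming a hypothetical license server and leaving the choice of key system to the caller:

```javascript
// Negotiate a key system, then answer the CDM's license requests (browser-only).
// keySystem and licenseServerUrl are placeholders supplied by the application.
async function setupDecryption(video, keySystem, licenseServerUrl) {
  const access = await navigator.requestMediaKeySystemAccess(keySystem, [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }]
  }]);
  const mediaKeys = await access.createMediaKeys();
  await video.setMediaKeys(mediaKeys);

  // Fired when the browser encounters encrypted media data
  video.addEventListener('encrypted', async (event) => {
    const session = mediaKeys.createSession();
    session.addEventListener('message', async (msg) => {
      // Forward the CDM's license request to the license server
      const license = await fetch(licenseServerUrl, {
        method: 'POST',
        body: msg.message
      }).then((r) => r.arrayBuffer());
      await session.update(license);
    });
    await session.generateRequest(event.initDataType, event.initData);
  });
}
```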

Users often want to share a pointer to a specific position within the timeline of a video (or audio) stream with friends on social networks, and expect media players to jump to the requested position right away. The Media Fragments URI specification defines a syntax for constructing media fragment URIs and explains how Web browsers can use this information to render the media fragment.
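For illustration, a deliberately simplified parser for the temporal ("t=") dimension; it only handles plain seconds, while the full syntax also allows "npt:" prefixes, clock values, and other dimensions such as "xywh", "track" and "id":

```javascript
// Parse the temporal dimension of a media fragment URI, e.g. "#t=30,60".
// Simplified: plain seconds only; the full Media Fragments URI grammar
// is richer.
function parseTemporalFragment(url) {
  const hash = new URL(url).hash;
  const match = /[#&]t=([\d.]*)(?:,([\d.]+))?/.exec(hash);
  if (!match) return null;
  return {
    start: match[1] ? parseFloat(match[1]) : 0,  // omitted start means 0
    end: match[2] ? parseFloat(match[2]) : null  // null: play to the end
  };
}

// A player can then apply the fragment:
//   const frag = parseTemporalFragment(video.src);
//   if (frag) video.currentTime = frag.start;
```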

Feature: Specification (Group), Maturity

  • Video rendering: video element in HTML Standard (WHATWG), Living Standard
  • Audio rendering: audio element in HTML Standard (WHATWG), Living Standard; Web Audio API (Audio Working Group), Candidate Recommendation
  • Rendering of captions: WebVTT: The Web Video Text Tracks Format (Timed Text Working Group), Candidate Recommendation; Timed Text Markup Language 1 (TTML1) (Third Edition) (Timed Text Working Group), Recommendation; TTML Profiles for Internet Media Subtitles and Captions 1.0.1 (IMSC1) (Timed Text Working Group), Recommendation, implemented via polyfills
  • Rendering of protected media: Encrypted Media Extensions (HTML Media Extensions Working Group), Recommendation
  • Rendering of media fragments: Media Fragments URI 1.0 (basic) (Media Fragments Working Group), Recommendation

Specifications in progress

The Timed Text Markup Language 2 (TTML2) specification extends TTML1 with advanced features for animations, styling, embedded content and metadata. The IMSC1.1 profile, backwards compatible with the IMSC1 profile, is based on TTML2.

As users own more and more connected devices, the need to make these devices work together increases as well:

  • The Presentation API lets a Web page open and control a page located on another screen, paving the way for multi-screen Web applications.
  • The Remote Playback API focuses more specifically on controlling the rendering of media on a separate device.
  • The Audio Output Devices API offers similar functionality for audio streams, enabling a Web application to choose the audio output device on which a given sound should be played.
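Minimal sketches of these three APIs (browser-only; URLs and device IDs are placeholders):

```javascript
// Presentation API: open a page on a second screen.
async function presentOnSecondScreen(url) {
  const request = new PresentationRequest([url]);
  return request.start(); // prompts the user to pick a display
}

// Remote Playback API: send a media element to a remote playback device.
async function playRemotely(video) {
  await video.remote.prompt(); // prompts the user to pick a device
}

// Audio Output Devices API: route a media element to a chosen audio output.
async function routeAudio(mediaElement, deviceId) {
  await mediaElement.setSinkId(deviceId);
}
```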

Wide-gamut displays are becoming more and more common; the CSS Media Queries Level 4 specification includes means to detect wide-gamut displays and adapt the rendering of the application to these improved color spaces.
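A minimal sketch of the color-gamut media query (class names and asset URLs are hypothetical):

```css
/* Default assets target the classical sRGB gamut */
.hero { background-image: url("hero-srgb.png"); }

/* Swap in an asset authored for a wider gamut when the display
   (roughly) covers the P3 color space */
@media (color-gamut: p3) {
  .hero { background-image: url("hero-p3.png"); }
}
```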

The WebXR Device API specification is a low-level API that allows applications to access and control head-mounted displays (HMD) using JavaScript and create compelling Virtual Reality (VR) / Augmented Reality (AR) experiences. It is a critical enabler to render 360° video content in Virtual Reality headsets.
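A sketch of entering an immersive VR session with the WebXR Device API (browser-only; the specification is a Working Draft, so details may still change):

```javascript
// Request an immersive VR session and start the per-frame render loop.
async function enterVR(glCanvas) {
  if (!navigator.xr) throw new Error('WebXR not supported');
  const session = await navigator.xr.requestSession('immersive-vr');
  const gl = glCanvas.getContext('webgl', { xrCompatible: true });
  // Bind the WebGL context to the headset's compositor
  await session.updateRenderState({
    baseLayer: new XRWebGLLayer(session, gl)
  });
  const refSpace = await session.requestReferenceSpace('local');
  session.requestAnimationFrame(function onFrame(time, frame) {
    const pose = frame.getViewerPose(refSpace);
    // ... draw one view per eye using pose.views ...
    session.requestAnimationFrame(onFrame);
  });
  return session;
}
```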

Feature: Specification (Group), Maturity

  • Rendering of captions: Timed Text Markup Language 2 (TTML2) (Timed Text Working Group), Recommendation; TTML Profiles for Internet Media Subtitles and Captions 1.1 (Timed Text Working Group), Recommendation
  • Distributed rendering: Presentation API (Second Screen Working Group), Candidate Recommendation; Remote Playback API (Second Screen Working Group), Candidate Recommendation; Audio Output Devices API (WebRTC Working Group), Candidate Recommendation
  • Rendering in different color spaces: color-gamut media query in Media Queries Level 4 (CSS Working Group), Candidate Recommendation
  • Rendering in VR/AR headsets: WebXR Device API (Immersive Web Working Group), Working Draft

Exploratory work

Providing an alternative transcript to media content is a well-known best practice; a transcript extension to HTML has been proposed to make an explicit link between media content and its transcript, and thus facilitate discovery and consumption.

The Multi-Device Timing Community Group is exploring another aspect of multi-device media rendering: its Timing Object specification makes it possible to keep video, audio and other data streams in close synchrony, across devices and independently of the network topology. This effort needs support from interested parties to progress.
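The heart of the timing object concept is a deterministic motion vector (position, velocity, timestamp) that every device can extrapolate against a shared clock; a minimal, hypothetical sketch of that model (class and method names are illustrative, not the specified API):

```javascript
// Hypothetical sketch of the motion model behind the Timing Object:
// every device holding the same vector and a synchronized clock
// computes the same media position, enabling cross-device synchrony.
class TimingMotion {
  constructor(position, velocity, timestamp) {
    this.position = position;   // media position in seconds
    this.velocity = velocity;   // playback rate (1 = normal speed)
    this.timestamp = timestamp; // clock reading when the vector was set
  }
  query(now) {
    // Extrapolate: p(t) = p0 + v * (t - t0)
    return this.position + this.velocity * (now - this.timestamp);
  }
}

// Two devices sharing the vector (10, 1, 100) agree that at clock
// time 105 the media position is 15 seconds.
const motion = new TimingMotion(10, 1, 100);
// motion.query(105) === 15
```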

The Picture-in-Picture proposal would allow applications to initiate and control the rendering of a video in a separate miniature window that is viewable above all other activities.
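A sketch of the proposed API, as currently discussed (a user gesture is typically required):

```javascript
// Toggle a video in and out of a floating Picture-in-Picture window
// (browser-only; API still an incubation-stage proposal).
async function togglePictureInPicture(video) {
  if (document.pictureInPictureElement) {
    await document.exitPictureInPicture();
  } else {
    const pipWindow = await video.requestPictureInPicture();
    // The returned window exposes the size of the floating player
    console.log(`PiP size: ${pipWindow.width}x${pipWindow.height}`);
  }
}
```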

To improve the interoperability of implementations of the Presentation API and Remote Playback API, in particular between the first and second screen, the Second Screen Community Group is discussing requirements for an Open Screen Protocol.

To take advantage of wide-gamut displays, all the graphical systems of the Web will need to support these broader color spaces. CSS Color Module Level 4 proposes to define CSS colors in color spaces beyond the classical sRGB. Similarly, work on making canvas color-managed should enhance the support for colors in HTML Canvas.
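For instance, CSS Color Module Level 4 proposes a color() function to express colors in predefined color spaces such as Display P3 (a sketch; the exact syntax may evolve while the specification is in progress):

```css
.accent {
  color: rgb(255, 0, 85);            /* sRGB fallback for today's browsers */
  color: color(display-p3 1 0 0.33); /* wider-gamut override where supported */
}
```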

Feature: Specification (Group)

  • Rendering of captions: A transcript extension for HTML (HTML Working Group Accessibility Task Force)
  • Distributed rendering: Timing Object (Multi-Device Timing Community Group); Picture-in-Picture (Web Platform Incubator Community Group); Open Screen Protocol (Second Screen Community Group)
  • Rendering in different color spaces: profiled device-dependent colors in CSS Color Module Level 4 (CSS Working Group); Color managing canvas contents

Features not covered by ongoing work

Color Management
To ensure the proper rendering of videos with high dynamic range (HDR) and wide-gamut colors, content providers would need a way to determine whether the underlying device and browser properly support these features. Similarly, content providers need a mechanism to match colors when mixing HDR content with Standard Dynamic Range (SDR) content. The Color on the Web Community Group allows color experts from various fields to share ideas and discuss technical solutions to improve the state of color on the Web.
Native support for 360° video rendering
While it is already possible to render 360° videos within a <video> element, integrated support for the rendering of 360° videos would hide the complexity of the underlying adaptive streaming logic from applications, letting Web browsers optimize streaming and rendering on their own.
Extensions to Encrypted Media Extensions (EME)
Various extensions to the Encrypted Media Extensions specification have been proposed, including defining a virtual environment in which CDMs can run to improve CDM portability across operating systems, support for continuous key rotation, mappings between EME and underlying DRM-specific security levels, and protection of media content when played in a VR headset. The HTML Media Extensions Working Group will maintain the specification but not develop new features, which should instead be incubated in the Web Platform Incubator Community Group (WICG).

Discontinued features

Network service discovery
The Network Service Discovery API was to offer a lower-level approach to the establishment of multi-device operations, by providing integration with local network-based media renderers, such as those enabled by DLNA, UPnP, etc. This effort was discontinued over privacy concerns and a lack of interest from implementers. The current approach is to let the user agent handle network discovery under the hood, as done in the Presentation API and the Remote Playback API.
WebVR
Development of the WebVR specification that allowed access and control of Virtual Reality (VR) devices, and which is supported in some browsers, has halted in favor of the WebXR Device API, which extends the scope of the work to Augmented Reality (AR) devices.