Well-deployed technologies
Touch-based interactions
An increasing share of mobile devices relies on touch-based interactions. While the traditional input mechanisms recognized in the Web platform (keyboard and mouse) can still be applied in this context, handling touch-based input specifically is a critical part of creating well-adapted user experiences, which the Touch Events specification enables in the DOM (Document Object Model).
The Pointer Events specification defines a single model for mouse, touch and pen events, providing a complementary and more unified approach to Touch Events. It includes the CSS touch-action property, which lets developers filter gesture events on elements and is widely implemented across browsers.
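As an illustration, here is a minimal sketch, assuming an element with a made-up id "map", that turns off the browser's own panning and zooming on that element via touch-action and handles mouse, touch and pen uniformly through Pointer Events:

  // The "map" element id is a made-up example.
  const map = document.getElementById('map') as HTMLElement;

  // Ask the browser not to handle panning/zooming gestures itself,
  // so pointermove events keep firing during a drag.
  map.style.touchAction = 'none';

  map.addEventListener('pointerdown', (event: PointerEvent) => {
    // pointerType is "mouse", "touch" or "pen"
    map.setPointerCapture(event.pointerId);
  });

  map.addEventListener('pointermove', (event: PointerEvent) => {
    // Drive the application's own panning logic from here.
  });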
Vibration
The Vibration API lets mobile developers take advantage of haptic feedback to create new forms of interactions (e.g. in games).
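For instance, a minimal sketch of triggering haptic feedback from a game event (the pattern values are arbitrary):

  // Vibrate 100ms, pause 50ms, vibrate 200ms; a single number also works.
  // navigator.vibrate() returns false when the request is rejected.
  if ('vibrate' in navigator) {
    navigator.vibrate([100, 50, 200]);
  }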
Notification
Mobile devices follow their users everywhere, and many mobile users rely on them to remind them or notify them of events, such as messages: the Web Notifications specification enables that feature in the Web environment.
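A minimal sketch of displaying a notification once the user has granted permission (the title and body are made-up example content):

  Notification.requestPermission().then((permission) => {
    if (permission === 'granted') {
      new Notification('New message', {
        body: 'Alice: see you at 8?',   // example content
        tag: 'chat'                     // replaces earlier notifications with the same tag
      });
    }
  });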
Accessibility
The User Agent Accessibility Guidelines (UAAG) 2.0 note defines principles and guidelines for user agents to design an accessible user agent interface and communicate with assistive technologies. The supporting document UAAG 2.0 Reference explains the intent and best practices of UAAG 2.0 success criteria, and lists numerous examples for each of them. Examples that are directly targeted at mobile devices are summarized in the Mobile Accessibility Examples from UAAG 2.0 Reference.
Following the Web Content Accessibility Guidelines (WCAG) 2.1 will make content accessible to a wider range of people with disabilities. The 2.1 revision adds new success criteria and guidelines to version 2.0, including new criteria that have a specific resonance in mobile contexts, such as the Pointer Gestures, Target Size and Orientation criteria.
Web content developers may benefit from authoring tools that follow the Authoring Tool Accessibility Guidelines (ATAG) 2.0 standard, which provides guidelines for designing Web content authoring tools that are both more accessible to authors with disabilities and that help design content that conforms to WCAG.
The Mobile Accessibility note explains how WCAG and other accessibility guidelines apply to mobile Web applications, as well as to native applications and hybrid applications that use Web components inside native applications.
The Accessible Rich Internet Applications (WAI-ARIA) 1.1 standard provides an ontology of roles, states, and properties that define the semantics of user interface elements and that can be used to improve the accessibility and interoperability of Web content and applications. The Core Accessibility API Mappings 1.1 standard describes how user agents should expose these semantics to accessibility APIs — unfortunately, mobile platforms do not yet have fully comprehensive accessibility API mappings and these mappings are only meaningful on desktop platforms for now.
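For instance, a script-driven custom control can expose its semantics to assistive technologies through ARIA roles and states; the sketch below assumes a made-up "mute-toggle" element:

  const toggle = document.getElementById('mute-toggle') as HTMLElement;
  toggle.setAttribute('role', 'switch');          // announce it as a switch
  toggle.setAttribute('aria-checked', 'false');   // expose its state
  toggle.setAttribute('tabindex', '0');           // make it keyboard focusable

  toggle.addEventListener('click', () => {
    const checked = toggle.getAttribute('aria-checked') === 'true';
    toggle.setAttribute('aria-checked', String(!checked));
  });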
Technologies in progress
Game controllers
The Gamepad specification defines a low-level interface that exposes gamepad devices attached to the browsing device, such as those paired with a smartphone via Bluetooth.
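A minimal sketch of detecting a connected gamepad and reading its state (gamepad state is polled rather than delivered through events):

  window.addEventListener('gamepadconnected', (event: GamepadEvent) => {
    console.log(`Gamepad connected: ${event.gamepad.id}`);
  });

  function pollGamepads() {
    for (const pad of navigator.getGamepads()) {
      if (pad && pad.buttons[0]?.pressed) {
        console.log(`Button 0 pressed on ${pad.id}`);
      }
    }
    requestAnimationFrame(pollGamepads);
  }
  requestAnimationFrame(pollGamepads);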
Smooth scrolling
As more and more content gets rendered as long scrollable lists, more and more logic is attached to scrolling events, and the quality of the user experience of these interactions is highly dependent on their performance. The CSSOM View Module determines when scroll events get fired and lets developers specify the type of scrolling behavior they want.
The proposed work on CSS Scroll Snap Points adds greater ability to control the behavior of panning and scrolling by defining points to which an app view would snap when the user moves through the page.
The CSS will-change property is also available to indicate to browsers that a given part of the page will soon be scrolled into view and should be pre-rendered.
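A minimal sketch combining these features; the "comments" and "carousel" element ids are made up, and the style properties would normally be set in a stylesheet rather than from script:

  // Programmatic smooth scrolling, as defined by CSSOM View.
  document.getElementById('comments')?.scrollIntoView({ behavior: 'smooth' });

  // Scroll snapping and will-change, set from script for illustration only.
  const carousel = document.getElementById('carousel') as HTMLElement;
  carousel.style.scrollSnapType = 'x mandatory';
  carousel.style.willChange = 'scroll-position';
  for (const slide of Array.from(carousel.children) as HTMLElement[]) {
    slide.style.scrollSnapAlign = 'start';
  }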
Notification
The Push API makes it possible for server-side notifications to alert the user, even if the browser is not running.
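A minimal sketch of subscribing to push messages, assuming a service worker is already registered; the application server key and the "/push/register" endpoint are placeholders:

  async function subscribeToPush() {
    const registration = await navigator.serviceWorker.ready;
    const subscription = await registration.pushManager.subscribe({
      userVisibleOnly: true,
      applicationServerKey: '<application-server-public-key>'   // placeholder
    });
    // The subscription is sent to the application server, which can then
    // push messages that wake the service worker even when the page is closed.
    await fetch('/push/register', { method: 'POST', body: JSON.stringify(subscription) });
  }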
The Badging API defines a more subtle notification mechanism than Web Notifications, allowing Web applications that have been installed on the device (e.g. through a manifest file) to set an application-wide badge, typically shown next to the application's icon on the home screen, to notify the user when the state of the application has changed and might require their attention (e.g. a new message has arrived).
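A minimal sketch of setting a badge; the methods are cast through any since they are not yet part of the standard TypeScript DOM typings:

  if ('setAppBadge' in navigator) {
    (navigator as any).setAppBadge(3);     // e.g. three unread messages
    // ...and once everything has been read:
    // (navigator as any).clearAppBadge();
  }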
Screen wake
Whether users are speaking commands to their apps or working with them through non-haptic interactions, they risk seeing the screen turned off automatically by their device's screen saver. An early proposal for a Screen Wake Lock API would let developers signal the need to keep the screen on in these circumstances.
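A minimal sketch of requesting a wake lock under that proposal; navigator.wakeLock is only present in browsers implementing it, hence the cast:

  async function keepScreenOn() {
    try {
      const lock = await (navigator as any).wakeLock.request('screen');
      lock.addEventListener('release', () => console.log('Wake lock released'));
    } catch (err) {
      // The request can be refused, e.g. when battery saving is active.
      console.warn('Wake lock request failed', err);
    }
  }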
Exploratory work
Speech-based interactions
Mobile devices, and mobile phones in particular, are also in many cases well-suited to voice interactions; the Speech API Community Group developed a JavaScript API to enable interactions with a Web page through spoken commands. Speech synthesis is well supported across browsers, while support for speech recognition is still underway.
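A minimal sketch of both sides of the API; speech recognition is accessed through the webkit prefix where it is available at all:

  // Speech synthesis: read a short sentence aloud.
  speechSynthesis.speak(new SpeechSynthesisUtterance('You have one new message'));

  // Speech recognition: log the first transcription result.
  const SpeechRecognition =
    (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
  if (SpeechRecognition) {
    const recognizer = new SpeechRecognition();
    recognizer.lang = 'en-US';
    recognizer.onresult = (event: any) => {
      console.log('Heard:', event.results[0][0].transcript);
    };
    recognizer.start();
  }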
Input method
The Input Method Editor (IME) API provides Web applications with scripted access to an IME associated with the hosting user agent. Editorial support is required for this specification to move forward.
Touch-based interactions
The proposal for an InputDeviceCapabilities API would provide information about whether a given “mouse” event comes from a touch-capable device.
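A minimal sketch of what that would look like; sourceCapabilities is only present in browsers implementing the proposal, hence the cast:

  document.addEventListener('mousedown', (event: MouseEvent) => {
    const capabilities = (event as any).sourceCapabilities;
    if (capabilities && capabilities.firesTouchEvents) {
      // This "mouse" event was synthesized from a touch interaction.
    }
  });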
Game controllers
Gamepads exist in a variety of flavors, from usual console gamepads to custom devices such as guitars, pedals, dance pads, scratching gamepads, magic wands, or VR/AR controllers. Each of these devices has its own inputs that need to be mapped into native input APIs for them to appear to Web applications, and outputs (LEDs, vibration, etc.) that are mostly unavailable on the Web. WebHID proposes to expose the HID protocol that most of these devices use under the hood to Web applications, allowing them to support the long tail of HID devices through JavaScript-based logic when the browser lacks support. Looking forward, this proposal could also ease integration between smartphones and all sorts of tangible user interfaces.
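A minimal sketch of pairing with such a device; navigator.hid is only exposed in browsers implementing the proposal, requestDevice() must be called from a user gesture, and the vendor/product ids are placeholders:

  async function connectHidDevice() {
    const [device] = await (navigator as any).hid.requestDevice({
      filters: [{ vendorId: 0x1234, productId: 0x5678 }]   // placeholder ids
    });
    if (!device) return;
    await device.open();
    device.addEventListener('inputreport', (event: any) => {
      // Decode the device-specific report bytes from event.data (a DataView).
    });
  }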
Responsiveness
Applications that need to run long tasks (e.g. when pages are loaded) need to make a tradeoff between loading pages quickly (or reducing task execution duration) and responding to input quickly. The Early detection of input events specification proposes an isInputPending method that long-running scripts can call synchronously, without losing time yielding to other scripts and event processing, to detect whether there are pending input events that their execution might delay from firing.
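A minimal sketch of how a long task might use it; the chunks of work are a hypothetical stand-in for application logic, and navigator.scheduling only exists in browsers implementing the proposal:

  function processLongTask(chunks: Array<() => void>) {
    const scheduling = (navigator as any).scheduling;
    while (chunks.length > 0) {
      if (scheduling && scheduling.isInputPending()) {
        // An input event is waiting: yield now and resume in a new task.
        setTimeout(() => processLongTask(chunks), 0);
        return;
      }
      chunks.shift()!();
    }
  }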
Input latency
Today, all DOM events need to go through the main thread, which is the only thread with access to DOM elements. The Input for Workers and Worklets specification proposes an event delegation scheme to workers that assumes no DOM access. This mechanism would enable latency-sensitive, event-dependent logic that is no longer blocked by the main thread. Use cases include drawing on an OffscreenCanvas, cloud scenarios (e.g. cloud gaming) where input events need to be forwarded to a server with minimal latency for processing, interactive animations driven by user input in conjunction with CSS Animation Worklet, and interactive audio in conjunction with AudioWorklet.
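The proposed delegation mechanism has no stable API shape yet, but the OffscreenCanvas part of these scenarios is already available; in the sketch below (the worker script name and message format are made up), input still has to be relayed from the main thread, which is precisely the latency the proposal aims to remove:

  // Main thread: hand rendering of a canvas over to a worker.
  const canvas = document.querySelector('canvas') as HTMLCanvasElement;
  const offscreen = canvas.transferControlToOffscreen();
  const worker = new Worker('draw-worker.js');
  worker.postMessage({ canvas: offscreen }, [offscreen]);

  // Input relayed through the main thread for now.
  canvas.addEventListener('pointermove', (event: PointerEvent) => {
    worker.postMessage({ x: event.offsetX, y: event.offsetY });
  });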
Features not covered by ongoing work
- Gesture events
- As mentioned above, touch-based interaction is common on mobile devices and available to Web applications through Touch events. Gesture-based interaction, which includes pinching, rotating and swiping, is also a common interaction paradigm on mobile devices. Web developers may derive gesture events from touch events to some extent (as sketched below), but may have to develop multiple versions for different browsers. Native support for these interactions would reduce fragmentation and improve performance. Early discussions to define Gesture events have started in the Merging of Web and Mobile APP Community Group.
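For instance, a minimal sketch of deriving a horizontal swipe from Touch Events (the 30px threshold is an arbitrary illustrative value):

  let startX = 0;
  document.addEventListener('touchstart', (event: TouchEvent) => {
    startX = event.changedTouches[0].clientX;
  });
  document.addEventListener('touchend', (event: TouchEvent) => {
    const deltaX = event.changedTouches[0].clientX - startX;
    if (Math.abs(deltaX) > 30) {
      console.log(deltaX > 0 ? 'swipe right' : 'swipe left');
    }
  });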
Discontinued features
- Intent-based events
- As the Web reaches new devices, and as devices gain new user interaction mechanisms, it seems useful to allow Web developers to react to a more abstract set of user interactions: instead of having to work in terms of “click”, “key press”, or “touch event”, they could react to an “undo” command or a “next page” command independently of how the user conveyed it to the device. The IndieUI Events specification was an attempt to address this need. The work has been discontinued for now, due to lack of support from would-be implementers.