Abstract

The Touch and Pointer Guideline provides new success criteria, techniques, and failures that supplement the Web Content Accessibility Guidelines (WCAG) 2.0. This does not replace WCAG 2.0.

Status of This Document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.

Status

This document is the internal working draft used by the Mobile Accessibility Task Force and is updated continuously and without notice. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This is a draft of the Touch and Pointer Guideline, which catalogs a new mobile guideline on Touch and Pointer, its success criteria, Understanding entries, and Techniques proposed by the Mobile A11y Task Force of the WCAG Working Group.

When complete, this document is intended to become a normative extension of the Web Content Accessibility Guidelines (WCAG) 2.0 [WCAG20]. This is part of a series of technical and educational documents published by the W3C Web Accessibility Initiative (WAI).

The Mobile Accessibility Task Force would particularly appreciate feedback on the following success criteria:

  1. Success Criteria 2.5.4 Target Size. Is it appropriate to have two different sizes for touch vs. pointer? Are the minimum sizes being proposed sufficient for people with dexterity or fine motor coordination disabilities?

This document was published by the Mobile Accessibility Task Force as an Editor's Draft. If you wish to make comments regarding this document, please send them to public-mobile-a11y-tf@w3.org (subscribe, archives) or create a Github issue at https://github.com/w3c/Mobile-A11y-Extension/issues. All comments are welcome.

Publication as an Editor's Draft does not imply endorsement by the W3C Membership. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the 5 February 2004 W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 1 September 2015 W3C Process Document.

Introduction

This document provides guidance to improve accessibility for people with disabilities. While it generally applies to traditional mobile devices, it also applies to touch-enabled desktop devices, kiosks, tablets, and other platforms that use technology beyond the traditional mouse and keyboard. While it is primarily oriented toward web and hybrid content, the guidelines and success criteria may also apply to native mobile applications.

Structure of this Document

This version of the Touch and Pointer Accessibility Guideline is closely tied with WCAG. The early work of the task force focused on writing Techniques for WCAG that applied to mobile. Many existing WCAG techniques apply to mobile in current form and are listed in the Appendix of the W3C Working Group Note: Mobile Accessibility: How WCAG 2.0 and Other W3C/WAI Guidelines Apply to Mobile.

This new guideline and its success criteria and techniques are not in WCAG 2.0; the WCAG Working Group is still determining how they will be incorporated into WCAG 2.0. For convenience of discussion, the proposed Touch and Pointer Guideline has been tentatively numbered as 2.5, since the Task Force proposes that it belongs with WCAG Principle 2: Operable.

Guideline 2.5: Touch and Pointer: Make it easier for users to operate touch and pointer functionality.

Intent of Guideline 2.5

[Proposed text for Understanding] Platforms today can be operated through a number of different input mechanisms, including touch, stylus, and pen, in addition to mouse and keyboard. Some platforms, such as mobile devices, are designed to be operated primarily via gestures made on a touchscreen. Others can be operated by a number of different devices, such as a pen, stylus, or mouse, which may be generically referred to as a pointer. This section also applies to pointer events on non-mobile platforms.

Mobile device design has evolved away from built-in physical keyboards (e.g. fixed, slide-out) towards devices that maximize touchscreen area and display an on-screen keyboard only when the user has selected a user interface control that accepts text input (e.g. a textbox). Pointer devices such as a stylus, pen, or pencil have also gained popularity by providing more precise targeting than a finger. The mouse has been popular on desktop computers for decades.

Although the definition of "pointer" includes touch, we say "touch and pointer" for clarity. When we use the term "touch" alone, we mean touch only.

2.5.1 Touch with Assistive Technology: All functions available by touch are still available by touch after platform assistive technology that remaps touch gestures is turned on. (Level A)

[Proposed text for Understanding] Intent of this Success Criterion

The intent of this Success Criterion is to ensure that content can be operated using gestures on a touch screen with platform assistive technology. Some assistive technology, such as a screen reader, will change the gestures that are used to interact with content when it is turned on. For example, on both iOS and Android platforms a single tap will normally activate the element under the finger. When the system screen reader is turned on, a single tap will instead move focus to that element and a double tap will activate it. All functions available by touch when the platform assistive technology is not turned on must still be available when the platform assistive technology is turned on.

Be familiar with your platform's system controls and standards for assistive technology. Use the system controls supported by the platform first, and don't override the standard gestures of the platform. Don't use a common gesture for a purpose that is not common.

Specific Benefits of Success Criterion 2.5.1
  • People who are blind who rely on the use of a screen reader while interacting with the touch screen
  • People with low vision who may also need speech turned on while interacting with the touch screen
Examples of Success Criterion 2.5.1
  • If a developer assigns a double tap as a custom gesture and it is the only way to complete an action, a user who is blind using VoiceOver will not have access to that action, because VoiceOver reserves the double tap to activate the selected item.
  • If a developer assigns a swipe right as the only way to open a menu, the VoiceOver user will not be able to do that action, because VoiceOver takes over the right swipe as a way to move from element to element. To avoid this problem, the developer could ensure there is a mobile menu button that works with touch as another way to bring up the menu.
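The menu example above can be sketched in code. This is a minimal, hypothetical sketch (the function and handler names are illustrative, not from any API): the same menu-opening logic is wired to both the custom gesture and a visible button, so the action remains available when a screen reader intercepts the swipe.

```javascript
// Sketch: wire the same action to both a custom swipe gesture and a
// plain button activation, so the action survives when VoiceOver or
// TalkBack reserves the swipe for sequential navigation.
function createMenu() {
  let open = false;
  const openMenu = () => { open = true; };
  const closeMenu = () => { open = false; };
  return {
    // intended to be called by a custom swipe-right recognizer; a system
    // screen reader may never deliver this gesture to the page
    onSwipeRight: openMenu,
    // called on a generic activation (click/tap) of a visible menu button;
    // screen readers still deliver generic activation events
    onMenuButtonActivate: openMenu,
    onCloseButtonActivate: closeMenu,
    isOpen: () => open,
  };
}
```

Because both paths call the same function, the behavior is identical regardless of which input route the user (or their assistive technology) can actually use.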


Resources are for information purposes only; no endorsement is implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.5.1
  • Techniques
    • M028 Using standard one-touch controls
    • M027 Providing touch access for custom controls
  • Failures
    • FM002 Infinite scroll gesture is not available with system screen reader
    • FM003 Component can be opened but cannot be closed with touch when a system screen reader is running

2.5.2 No Touch Trap: When touch input behavior is modified by platform assistive technology and focus can be moved to a component, then focus can be moved away from the component using sequential navigation gestures of assistive technology or the user is advised of the method for moving focus away in the sequential focus order. (Level A)

[Proposed text for Understanding] Intent of this Success Criterion

Swipe gestures are useful for displaying dynamic content. Giving focus to dynamic content via a swipe gesture also requires a gesture or method to move focus back to prior content, either by swiping to return or by informing the user of the method needed to return. These methods must work with assistive technology. Explore-by-touch is not a valid solution, because the user can then miss content without knowing it was missed. This success criterion is similar to WCAG 2.0 Success Criterion 2.1.2 No Keyboard Trap, but it extends to all sequential navigation methods and addresses touch-specific failures. Relying on the explore-by-touch features of mobile assistive technologies to escape such a trap is still a failure under this criterion, because the next sequential item may be offscreen, an explore-by-touch gesture may cause users to get lost on the page, or the user may be relying on other means of sequential navigation such as a keyboard or switch control.
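The trap described above can be modeled abstractly. This is a hypothetical sketch (the `nextFocus` helper and the item names are illustrative): sequential "next" navigation over a declared focus order that includes an exit point after the dynamic widget, so the user can always reach the content that follows it.

```javascript
// Model of sequential swipe navigation over a focus order. A widget such
// as a carousel avoids trapping focus by ensuring the item after its last
// focusable element is the content that follows it (here, a footer link).
function nextFocus(order, current) {
  const i = order.indexOf(current);
  // move to the next item if one exists; otherwise stay put
  return i >= 0 && i < order.length - 1 ? order[i + 1] : current;
}

// Hypothetical focus order: the carousel exposes an exit control, and
// sequential navigation continues past it to the footer.
const focusOrder = ['main-link', 'carousel-slide', 'carousel-exit', 'footer-link'];
```

A page fails this criterion when the equivalent of `focusOrder` loops inside the widget forever (e.g. infinite scroll keeps appending slides), so no sequence of "next" gestures ever reaches `'footer-link'`.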

Specific Benefits of Success Criterion 2.5.2
  • Content that is after or outside of infinite scrolling content, or off the visible screen, can be accessed by screen reader users.
Examples of Success Criterion 2.5.2
  • Infinite scroll of content, where there is additional content in the footer, but the user with assistive technology (e.g. a screen reader) cannot move focus to the footer and therefore cannot read the footer content, and may not even know that the footer content exists.
  • An infinite carousel advances with a swipe gesture. The instructions indicate that a touch outside the carousel will exit the carousel. The user can touch outside the carousel with assistive technology (e.g. a screen reader) turned on.
  • Popup dialog that cannot be closed when assistive technology is turned on.

Resources are for information purposes only; no endorsement is implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.5.2
  • Failures
    • FM003 Component can be opened but cannot be closed with touch when a system screen reader is running

2.5.3 Accidental Activation: For single touch and pointer activation, at least one of the following is true: (Level A)

  1. Activation is on the up-event, either explicitly or implicitly as a platform's generic activation/click event;
  2. A mechanism is available that allows the user to choose the up-event as an option;
  3. Confirmation is provided, which can dismiss activation;
  4. Activation is reversible; or
  5. Timing of activation is essential and waiting for the up-event would invalidate the activity.

Note: This success criterion applies when platform assistive technology (e.g. a screen reader) that remaps touch gestures is not turned on.

[Proposed text for Understanding] Intent of this Success Criterion

People with various disabilities can inadvertently initiate touch or mouse events with unwanted results. Up-event activation refers to the activation of a component when the trigger stimulus is released. For example, for touchscreen interaction the event is triggered when a finger is lifted from the touchscreen at the end of a tap. There is a distinction between a finger initially touching a place on the screen and a finger being lifted from that place on the screen. With a mouse there is a similar difference between mousedown (pressing the button, initiating a click) and mouseup (releasing the button). Authors can reduce the problem of users inadvertently triggering an action by activating on the up-event. This gives the user the opportunity to move a finger (or mouse pointer) away from a wrong target that has been hit. If down-event activation is necessary, there are several options:

  • A confirmation alert allows the user to change their mind
  • An undo button or other mechanism allows the user to reverse the action.
  • A setting in preferences allows the user to choose whether activation happens on the down or up event.

Generic platform activation/click events generally trigger on the up-event, and when they do, they are also allowed. For example, in the case of mouse interactions, the "click" event in JavaScript triggers on release of the primary mouse button and is an example of an implicit up-event. An exception would be an activity that would be invalid if activation waited for the up-event. Examples include a piano application or a skeet-shooting game, where waiting for the up-event would invalidate the activation. Long-press activation and 3D touch can be used as long as one of the alternatives listed above is present and there is another conforming way to perform the action provided by the control.

Anywhere where we say "touch and pointer" we recognize that touch is included in the definition of pointer, but we include touch for clarity and ease of reading.

On different platforms the up-event may be called different things, such as touchend, onclick or mouseup.
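The up-event pattern can be sketched as a small state machine. This is a hypothetical sketch (the `createUpEventButton` helper and its method names are illustrative, not a platform API): the action fires only on release inside the target, so sliding a finger off a wrongly-hit control cancels the activation.

```javascript
// Sketch of up-event activation: arm on the down-event, cancel when the
// pointer leaves the target, and activate only on the up-event while
// still armed. This mirrors how a native "click" behaves.
function createUpEventButton(action) {
  let armed = false;
  return {
    pointerDown() { armed = true; },    // down-event: arm only, never activate
    pointerLeave() { armed = false; },  // moving off the target cancels
    pointerUp() {                       // up-event: activate if still armed
      if (armed) action();
      armed = false;
    },
  };
}
```

In browser code the same effect usually comes for free by handling the generic `click` event rather than `touchstart`, `pointerdown`, or `mousedown`.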

Specific Benefits of Success Criterion 2.5.3
  • Makes it easier for all users to recover from hitting the wrong target.
  • Helps people with visual disabilities, cognitive limitations, and motor impairments by reducing the chance that a control will be accidentally activated or an action will occur unexpectedly.
  • Individuals who are unable to detect changes of context are less likely to become disoriented while navigating a site.
Examples of Success Criterion 2.5.3
  • Interface elements that require a single tap or a long press as input will only trigger the corresponding event when the finger is lifted inside that element.
  • The user interface control performs an action when the user lifts the finger away from the control rather than when the user first touches the control.
  • A phone-dialing application has number keys that are activated on touchdown. A user can undo an unwanted number by hitting the backspace button to delete a mistaken digit.

Resources are for information purposes only; no endorsement is implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.5.3
  • M029 @@wiki link@@ Touch events are only triggered when touch is removed from a control
  • FM001 Failure of SC 2.5.3 due to activating a button on initial touch location rather than the final touch location
  • Failure @@to be written@@: Actions are only available through long press or 3D touch

2.5.4 Target Size: The size of the target in relation to the visible display at the default viewport size is at least: (Level AA)
  • 44px by 44px for touch inputs
  • 20px by 20px for mouse/pointer inputs
where px is a CSS pixel.

Note: In situations where both touch and pointer/mouse input mechanisms are present, without manual or automatic input detection, controls must follow the larger minimum dimensions for touch input.

Note: This success criterion applies when platform assistive technology (e.g. magnification) is not turned on.

Editor's Note: We are researching the 20px value for mouse/pointer and the 44px value for touch. We are seeking research on this and outside input. We also need to define the difference between a touch event and a mouse event, particularly in HTML and responsive environments.

[Proposed text for Understanding] Intent of this Success Criterion

The intent of this success criterion is to help users who may have trouble activating a small target because of hand tremors, limited dexterity, or other reasons. If the target is too small, it may be difficult to aim at, and mice and other pointing devices can be hard for these users to operate precisely; a larger target greatly improves their chances of success on the web page. This can be further complicated on responsive sites that double as mobile content, where the same control will also be used with touch. A finger is larger than a mouse pointer and needs a larger target. Touch screens are a primary method of user input on mobile devices, so user interface controls must be big enough to capture finger touch actions. The minimum recommended touch target size is 44px by 44px, but a larger touch target is recommended to reduce the possibility of unintentional actions. This is particularly important if any of the following are true:

  • The control is used frequently
  • The result of the touch cannot be easily undone
  • The control is positioned where it will be difficult to reach, or is near the edge of the screen
  • The control is part of a sequential task
Specific Benefits of Success Criterion 2.5.4
  • Users with mobility impairments, such as hand tremors
  • Users who find fine motor movements difficult
  • Users who access a device using one hand
  • Users with large fingers
  • Users who have low vision may better see the target
Examples of Success Criterion 2.5.4
  • Three buttons are on-screen and the visible portion of each button is 44px by 44px
Techniques and Failures for Success Criterion 2.5.4
  • M030 Multiple Elements: When multiple elements perform the same action or go to the same destination (e.g. a link icon with link text), these should be contained within the same actionable element. This increases the touch target size for all users and benefits people with dexterity impairments. It also reduces the number of redundant focus targets, which benefits people using screen readers and keyboard/switch control.
  • M002 Touch Target: Ensuring that touch targets are at least 44px by 44px.
  • FM005 Failure: Touch target is less than 44px by 44px at the default viewport size
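The proposed minimums can be expressed as a simple check. This is a hypothetical sketch (the `meetsTargetSize` helper is illustrative, and the 44px/20px values are still under discussion per the Editor's Note above): when both input mechanisms are possible and input detection is unavailable, the larger touch minimum applies, so the check defaults to the stricter value.

```javascript
// Proposed minimum target dimensions in CSS pixels (per this draft;
// the values are subject to change).
const MIN_SIZE = { touch: 44, pointer: 20 };

// Returns true when a target of the given rendered size meets the
// proposed minimum for the stated input type. With no (or unknown)
// input type, the stricter touch minimum is applied.
function meetsTargetSize(widthPx, heightPx, inputType = 'touch') {
  const min = MIN_SIZE[inputType] ?? MIN_SIZE.touch;
  return widthPx >= min && heightPx >= min;
}
```

In a browser, the rendered size of a control could be obtained with `element.getBoundingClientRect()` at the default viewport size and passed to a check like this.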


A. Glossary

device manipulation
Moving or controlling the device with hands, body, or machine. Device manipulation includes methods of controlling input to the mobile device other than using the touch screen. This includes pressing a physical button on the device, shaking, holding, proximity, touch, walking, angle of holding, and input via the accelerometer. Gestures to the camera and voice input to the microphone are addressed separately.
pixel
A CSS pixel based on the ideal viewport device-width. [editor note: we need a better definition of CSS pixel].
platform assistive technology
Platform assistive technology is built into the operating system and is generally updated through OS updates. Examples include VoiceOver on iOS and TalkBack on Android.
pointer
A pointer is a hardware-agnostic representation of input devices that can target a specific coordinate (or set of coordinates) on a screen. [Pointer Events] https://www.w3.org/TR/pointerevents/#dfn-pointer
target
Region of the display that will accept a touch action. If a portion of a touch target is overlapped by another touch target such that it cannot receive touch actions, then that portion is not considered a touch target for purposes of touch target measurements.
touchend
see "up event"
Up event, TouchEnd event
The event that occurs when the trigger stimulus is released. Example: For touchscreen interaction, the event is triggered when a finger is lifted from the touchscreen at the end of a tap.

B. Acknowledgements

Thanks to current members of the Task Force

Allan, Jim
Avila, Jonathan
Babinszki, Tom
Brough, Matthew
Cooper, Michael
Fischer, Detlev
Foliot, John
Garrison, Alistair
Johlic, Marc
Kirkpatrick, Andrew
Lauke, Patrick
MacDonald, David
McMeeking, Chris
Patch, Kimberly
Pluke, Mike
Richards, Jan
Smith, Alan
Spellman, Jeanne
Vaishnav, Jatin
Velleman, Eric
Wahlbin, Kathleen

Thanks to prior members of the Task Force

Anderson, Kathleen
Evans, Gavin
Kaja, Kiran
LaHart, Andrew
McGrane, Karen
Shebanek, Mike
Shiver, Brent
Thiessen, Peter
Wu, Wei
Zehe, Marco

C. References

C.1 Informative references

[WCAG20]
Ben Caldwell; Michael Cooper; Loretta Guarino Reid; Gregg Vanderheiden et al. W3C. Web Content Accessibility Guidelines (WCAG) 2.0. 11 December 2008. W3C Recommendation. URL: http://www.w3.org/TR/WCAG20/