Copyright © 2016 W3C® (MIT, ERCIM, Keio, Beihang). W3C liability, trademark and document use rules apply.
The Touch and Pointer Guideline provides new success criteria, techniques, and failures that supplement the Web Content Accessibility Guidelines (WCAG) 2.0. It does not replace WCAG 2.0.
This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at http://www.w3.org/TR/.
This document is the internal working draft used by the Mobile Accessibility Task Force and is updated continuously and without notice. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.
This is a draft of the Touch and Pointer Guideline, which catalogs a new mobile guideline on Touch and Pointer, its success criteria, Understanding entries, and Techniques proposed by the Mobile A11y Task Force of the WCAG Working Group.
When complete, this document is intended to become a normative extension of the Web Content Accessibility Guidelines 2.0 (WCAG 2.0) [WCAG20]. This is part of a series of technical and educational documents published by the W3C Web Accessibility Initiative (WAI).
The Mobile Accessibility Task Force would particularly appreciate feedback on the following success criteria:
This document was published by the Mobile Accessibility Task Force as an Editor's Draft. If you wish to make comments regarding this document, please send them to public-mobile-a11y-tf@w3.org (subscribe, archives) or create a GitHub issue at https://github.com/w3c/Mobile-A11y-Extension/issues. All comments are welcome.
Publication as an Editor's Draft does not imply endorsement by the W3C Membership.
This document was produced by a group operating under the 5 February 2004 W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.
This document is governed by the 1 September 2015 W3C Process Document.
This document provides guidance to improve accessibility for people with disabilities. While it primarily addresses traditional mobile devices, it also applies to touch-enabled desktop devices, kiosks, tablets, and other platforms that use input technology beyond the traditional mouse and keyboard. While it is primarily oriented toward web and hybrid content, the guideline and success criteria may also apply to native mobile applications.
This version of the Touch and Pointer Accessibility Guideline is closely tied with WCAG. The early work of the task force focused on writing Techniques for WCAG that applied to mobile. Many existing WCAG techniques apply to mobile in current form and are listed in the Appendix of the W3C Working Group Note: Mobile Accessibility: How WCAG 2.0 and Other W3C/WAI Guidelines Apply to Mobile.
The guideline, success criteria, and techniques in this document are not part of WCAG 2.0. The WCAG Working Group is determining how they will be incorporated into WCAG 2.0. For convenience of discussion, the proposed Touch and Pointer Guideline has been tentatively numbered 2.5, since the Task Force proposes that it belongs with WCAG Principle 2: Operable.
[Proposed text for Understanding] Platforms today can be operated through a number of different input mechanisms, including touch, stylus, and pen, in addition to mouse and keyboard. Some platforms, such as mobile devices, are designed to be operated primarily via gestures made on a touchscreen. Other platforms can be operated with a number of different input devices, such as a pen, stylus, or mouse, which may be generically referred to as pointers. This section also applies to pointer events on non-mobile platforms.
Mobile device design has evolved away from built-in physical keyboards (e.g. fixed, slide-out) towards devices that maximize touchscreen area and display an on-screen keyboard only when the user has selected a user interface control that accepts text input (e.g. a textbox). Pointer devices such as the stylus, pen, or pencil have also gained popularity because they provide more precise input than a fingertip. The mouse has been popular on desktop computers for decades.
Although the definition of "pointer" includes touch, we write "touch and pointer" for clarity. When we use the term "touch" alone, we mean touch specifically.
The intent of this Success Criterion is to ensure that content can be operated using gestures on a touch screen with platform assistive technology. Some assistive technology, such as a screen reader, changes the gestures used to interact with content when it is turned on. For example, on both the iOS and Android platforms, a single tap on the touch screen activates the element under the user's finger. When the system screen reader is turned on, a single tap instead moves focus to that element, and a double tap activates the element. All functions available by touch when the platform assistive technology is not turned on must still be available when the platform assistive technology is turned on.
Be familiar with your platform's system controls and standards for assistive technology. Use the system controls supported by the platform first, and do not override the platform's standard gestures. Do not use a common gesture for a purpose that is not common.
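One way to follow this advice is to bind activation to the standard `click` event rather than to raw touch gestures. This is a minimal sketch; the helper name is an assumption, not a platform API. The `click` event is synthesized by mouse input, by keyboard activation of native buttons, and by screen readers' double-tap gesture, so a single handler keeps the control operable when platform assistive technology remaps touch gestures.

```javascript
// Sketch: prefer the standard 'click' event over raw gesture handlers.
// A handler bound to 'touchstart' may never fire when a screen reader is
// on, because the screen reader intercepts the underlying touches.
function bindAccessibleActivation(target, onActivate) {
  target.addEventListener('click', onActivate);
}

// Browser usage (hypothetical element and handler):
// bindAccessibleActivation(document.querySelector('#action'), doAction);
```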
Resources are for information purposes only, no endorsement implied.
(none currently documented)
Swipe gestures are useful for displaying dynamic content. When a swipe gesture gives focus to dynamic content, there must also be a gesture or method to move focus back to the prior content, either by swiping to return or by informing the user of the method needed to return. These methods must work with assistive technology. Explore-by-touch is not a valid solution, because the user can miss content without knowing it was missed. This success criterion is similar to WCAG 2.1.2 No Keyboard Trap, but it extends to all sequential navigation methods and addresses touch-specific failure modes. Relying on explore-by-touch features of mobile assistive technologies to escape such a trap is still a failure under this criterion, because the next sequential item may be offscreen, an explore-by-touch gesture may cause users to get lost on the page, or the user may be relying on other means of sequential navigation such as a keyboard or switch control.
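The focus-management pattern described above can be sketched as a small helper that remembers where the user was before focus moved into the revealed content, and restores that position on dismissal. This is a minimal sketch under assumed markup; the function and element names are hypothetical.

```javascript
// Sketch: remember the previously focused element before moving focus into
// dynamic content, and restore it on dismissal so sequential navigation
// (swipe, keyboard, or switch control) resumes where the user left off.
function createFocusReturn() {
  let previous = null;
  return {
    // Call before moving focus into the newly revealed content.
    capture(element) { previous = element; },
    // Call when the content is dismissed.
    restore() {
      if (previous && typeof previous.focus === 'function') previous.focus();
      previous = null;
    }
  };
}

// Browser usage (hypothetical panel markup):
// const focusReturn = createFocusReturn();
// function showPanel(panel) {
//   focusReturn.capture(document.activeElement);
//   panel.hidden = false;
//   panel.querySelector('[data-close]').focus();
// }
// function closePanel(panel) {
//   panel.hidden = true;
//   focusReturn.restore();
// }
```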
Resources are for information purposes only, no endorsement implied.
(none currently documented)
Note: This success criterion applies when platform assistive technology (e.g. a screen reader) that remaps touch gestures is not turned on.
People with various disabilities can inadvertently initiate touch or mouse events with unwanted results. Up-event activation refers to the activation of a component when the trigger stimulus is released. For example, for touchscreen interaction the event is triggered when a finger is lifted from the touchscreen at the end of a tap. There is a distinction between a finger initially touching a place on the screen and a finger being lifted from that place on the screen. With a mouse there is a similar difference between mousedown (pressing the button) and mouseup (releasing it). Authors can reduce the problem of users inadvertently triggering an action by activating on the up-event. This gives the user the opportunity to move the finger (or mouse pointer) away from a wrong target that has been hit. If activation on the down-event is necessary, there are several options:
Generic platform activation/click events generally trigger on the up-event, and when they do, they are also allowed. For example, for mouse interaction the JavaScript "click" event triggers on release of the primary mouse button and is an example of an implicit up-event. An exception is an activity that would be invalid if activation waited for the up-event; examples include a piano application or a skeet-shooting game, where waiting for the up-event would invalidate the activation. Long-press activation and 3D touch can be used as long as one of the alternatives listed above is present and there is another conforming way to perform the action provided by the control.
Wherever we say "touch and pointer," we recognize that touch is included in the definition of pointer, but we include touch for clarity and ease of reading.
On different platforms the up-event may have different names, such as touchend, click, or mouseup.
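The up-event pattern described above can be sketched with pointer events (the same pattern works with mouseup/touchend). This is a minimal sketch, not a normative technique; the helper and handler names are hypothetical.

```javascript
// Sketch: arm the control on the down-event, activate only on the up-event,
// and cancel if the user slides off the target before releasing.
function bindUpActivation(target, onActivate) {
  let armed = false;
  // Pressing down only arms the control; nothing is triggered yet.
  target.addEventListener('pointerdown', () => { armed = true; });
  // Sliding off the control before release cancels the activation.
  target.addEventListener('pointerleave', () => { armed = false; });
  // Activation happens on release, giving the user a chance to back out.
  target.addEventListener('pointerup', () => {
    if (armed) { armed = false; onActivate(); }
  });
}

// Browser usage (hypothetical element and handler):
// bindUpActivation(document.querySelector('#submit'), submitForm);
```

In most cases the built-in `click` event already provides this behavior; an explicit handler like this is only needed for custom pointer interactions.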
Resources are for information purposes only, no endorsement implied.
(none currently documented)
Note: In situations where both touch and pointer/mouse input mechanisms are present, without manual or automatic input detection, controls must follow the larger minimum dimensions for touch input.
Note: This success criterion applies when platform assistive technology (e.g. magnification) is not turned on.
Editor's Note: We are researching the 20px value for mouse/pointer and the 44px value for touch, and we are seeking research and outside input on these values. We also need to define the difference between a touch event and a mouse event, particularly in HTML and responsive environments.
The intent of this success criterion is to help users who may have trouble activating a small target because of hand tremors, limited dexterity, or other reasons. Mice and pointing devices can be hard for these users to operate, and a larger target greatly improves their chances of a positive outcome on the web page. If the target is too small, it may be difficult to aim at. This can be further complicated on responsive sites that double as mobile content, where the same control will also be used with touch. A finger is larger than a mouse pointer and needs a larger target. Touch screens are a primary method of user input on mobile devices, so user interface controls must be big enough to capture finger touch actions. The minimum recommended touch target size is 44px by 44px, but a larger touch target is recommended to reduce the possibility of unintentional actions. This is particularly important if any of the following are true:
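The size check discussed above can be sketched as a simple development-time audit. The 44px threshold reflects the value under discussion in this draft, and the selector list of interactive elements is an assumption, not a normative set.

```javascript
// Sketch: flag rendered controls whose dimensions fall below the proposed
// 44px minimum touch target. MIN_TARGET_PX is the value under discussion.
const MIN_TARGET_PX = 44;

function isTargetTooSmall(width, height, min = MIN_TARGET_PX) {
  return width < min || height < min;
}

// Browser usage: warn about interactive elements below the minimum.
if (typeof document !== 'undefined') {
  document.querySelectorAll('a, button, input, [role="button"]').forEach((el) => {
    const rect = el.getBoundingClientRect();
    if (isTargetTooSmall(rect.width, rect.height)) {
      console.warn('Touch target below 44px minimum:', el);
    }
  });
}
```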
Thanks to current members of the Task Force
Allan, Jim
Avila, Jonathan
Babinszki, Tom
Brough, Matthew
Cooper, Michael
Fischer, Detlev
Foliot, John
Garrison, Alistair
Johlic, Marc
Kirkpatrick, Andrew
Lauke, Patrick
MacDonald, David
McMeeking, Chris
Patch, Kimberly
Pluke, Mike
Richards, Jan
Smith, Alan
Spellman, Jeanne
Vaishnav, Jatin
Velleman, Eric
Wahlbin, Kathleen
Thanks to prior members of the Task Force
Anderson, Kathleen
Evans, Gavin
Kaja, Kiran
LaHart, Andrew
McGrane, Karen
Shebanek, Mike
Shiver, Brent
Thiessen, Peter
Wu, Wei
Zehe, Marco