The Mobile Accessibility Extension to WCAG 2.0 provides new guidelines, success criteria, and mobile Techniques that supplement the Web Content Accessibility Guidelines (WCAG) 2.0. This extension does not replace WCAG 2.0.


This document is the internal working draft used by the Mobile Accessibility Task Force and is updated continuously and without notice. This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This is a draft of a document that will become a First Public Working Draft (FPWD) of the WCAG Extension on Mobile Accessibility. This document catalogs new mobile guidelines, success criteria, and techniques proposed by the Mobile Accessibility Task Force of the WCAG Working Group.

This document is intended to become a normative extension of the Web Content Accessibility Guidelines (WCAG) 2.0 [[WCAG20]] and is part of a series of technical and educational documents published by the W3C Web Accessibility Initiative (WAI).


The Mobile Accessibility Extension to WCAG 2.0 provides guidance to improve accessibility for people with disabilities. While it generally applies to traditional mobile devices, it also applies to touch-enabled desktop devices, kiosks, tablets and other platforms that use technology beyond the traditional mouse and keyboard. While it is primarily oriented toward web and hybrid content, the guidelines and success criteria may also apply to native mobile applications.

Structure of this Document

This version of the Mobile Accessibility Extension to WCAG is closely tied with WCAG. The early work of the task force focused on writing Techniques for WCAG that applied to mobile. Many existing WCAG techniques apply to mobile in current form and are listed in the Appendix of the W3C Working Group Note: Mobile Accessibility: How WCAG 2.0 and Other W3C/WAI Guidelines Apply to Mobile. The task force has written new Techniques for existing WCAG success criteria that specifically apply to mobile devices. To make it easier to understand the relationship between these new mobile Techniques and WCAG, a bare outline of the WCAG success criteria is included, with the appropriate Mobile Techniques listed below the success criterion. Techniques that are proposed but not written are marked as [Mobile]. Completed or in process mobile Techniques are numbered starting with the letter "M".

New guidelines and success criteria that are not in WCAG 2.0 at the time of publication are marked as [MOBILE]. WCAG Guidelines and success criteria are prefixed with WCAG.

WCAG Principle 1: Perceivable - Information and user interface components must be presentable to users in ways they can perceive.

WCAG Guideline 1.1 Text Alternatives: Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols or simpler language.

1.1.1 Non-text Content

Mobile Technique proposed for WCAG 1.1.1

WCAG Guideline 1.2 Time-based Media: Provide alternatives for time-based media.

1.2.1 Audio-only and Video-only (Prerecorded)

1.2.2 Captions (Prerecorded)

1.2.3 Audio Description or Media Alternative (Prerecorded)

1.2.4 Captions (Live)

1.2.5 Audio Description (Prerecorded)

1.2.6 Sign Language (Prerecorded)

1.2.7 Extended Audio Description (Prerecorded)

1.2.8 Media Alternative (Prerecorded)

1.2.9 Audio-only (Live)

WCAG Guideline 1.3 Adaptable: Create content that can be presented in different ways (for example simpler layout) without losing information or structure.

1.3.1 Info and Relationships

Mobile Techniques proposed for WCAG 1.3.1

1.3.2 Meaningful Sequence

Mobile Technique proposed for WCAG 1.3.2

1.3.3 Sensory Characteristics

Mobile Techniques proposed for WCAG 1.3.3

WCAG Guideline 1.4 Distinguishable: Make it easier for users to see and hear content including separating foreground from background.

1.4.1 Use of Color

1.4.2 Audio Control

1.4.3 Contrast (Minimum)

1.4.4 Resize text

Mobile Techniques proposed for WCAG 1.4.4

1.4.5 Images of Text

1.4.6 Contrast (Enhanced)

1.4.7 Low or No Background Audio

1.4.8 Visual Presentation

Mobile Technique proposed for WCAG 1.4.8

1.4.9 Images of Text (No Exception)

WCAG Principle 2: Operable - User interface components and navigation must be operable.

WCAG Guideline 2.1 Keyboard Accessible: Make all functionality available from a keyboard.

2.1.1 Keyboard

Mobile Techniques proposed for WCAG 2.1.1

  • M011: Ensuring that the interface can be used with a physical keyboard
  • M014: Ensuring that navigation works on different screen sizes

2.1.2 No Keyboard Trap

2.1.3 Keyboard (No Exception)

WCAG Guideline 2.2 Enough Time: Provide users enough time to read and use content.

2.2.1 Timing Adjustable

2.2.2 Pause, Stop, Hide

2.2.3 No Timing

2.2.4 Interruptions

2.2.5 Re-authenticating

WCAG Guideline 2.3 Seizures: Do not design content in a way that is known to cause seizures.

2.3.1 Three Flashes or Below Threshold

2.3.2 Three Flashes

WCAG Guideline 2.4 Navigable: Provide ways to help users navigate, find content, and determine where they are.

2.4.1 Bypass Blocks

2.4.2 Page Titled

2.4.3 Focus Order

2.4.4 Link Purpose (In Context)

2.4.5 Multiple Ways

Mobile Techniques proposed for WCAG 2.4.5

  • M012: Including shortcuts to allow users to jump to sections of the page

2.4.6 Headings and Labels

2.4.7 Focus Visible

Mobile Techniques proposed for WCAG 2.4.7

  • M001 Touch Focus: Defining the hover, focus, selected and touch (regular, long) states

2.4.8 Location

Mobile Techniques proposed for WCAG 2.4.8

  • M015: Providing a way for users to see what page they are on

2.4.9 Link Purpose (Link Only)

2.4.10 Section Headings

[Proposed New MOBILE Guideline] Guideline 2.5: Touch and Pointer: Make it easier for users to operate touch and pointer functionality.

Intent of Guideline 2.5

[Proposed text for Understanding] Platforms today can be operated through a number of different input methods, including touch, stylus, and pen, in addition to mouse and keyboard. Some platforms, such as mobile devices, are designed to be operated primarily via gestures made on a touchscreen. Other devices can be operated by a number of different input devices, such as a pen, stylus, or mouse, which may be generically referred to as pointer inputs. This section also applies to pointer events on non-mobile platforms.

Mobile device design has evolved away from built-in physical keyboards (e.g. fixed, slide-out) towards devices that maximize touchscreen area and display an on-screen keyboard only when the user has selected a user interface control that accepts text input (e.g. a textbox). Pointer devices such as stylus, pen or pencil have also gained popularity for providing more precise touch. The mouse has been popular on desktop computers for decades.

Although the definition of "pointer inputs" includes touch, we refer to both touch and pointer for clarity. When we use the term touch alone, we mean touch specifically.

[Proposed New MOBILE Success Criteria] Touch with Assistive Technology: All functions available by touch are still available by touch after platform assistive technology that remaps touch gestures is turned on. (Level A)

[Proposed text for Understanding] Intent of this Success Criterion

The intent of this Success Criterion is to ensure that content can be operated using gestures on a touch screen with platform assistive technology.

Generally, assistive technology such as a screen reader on a touch screen device will change the gestures that are used to interact with content when it is turned on.

For example, on both iOS and Android platforms, when the platform's screen reader (VoiceOver and TalkBack, respectively) is enabled, users will move their focus to the previous/next element using single swipe left/right gestures; using "touch to explore" functionality, a single tap on the touch screen will set focus to the element at that particular location on the screen; a double tap will activate the element.

While content may provide its own gesture-based controls, all functions available by touch when the platform assistive technology is not turned on must still be available when the platform assistive technology is turned on.

Be familiar with your platform's system controls and standards for assistive technology. Use the system controls supported by the platform first, and do not override the standard gestures of the platform. Do not use a common gesture for a purpose that is not common.

Specific Benefits of Success Criterion 2.5.1
  • People who are blind who rely on the use of a screen reader while interacting with the touch screen
  • People with low vision who may also need speech turned on while interacting with the touch screen
Examples of Success Criterion 2.5.1
  • If a developer assigns a double tap as a custom gesture that is the only way to complete an action, a user who is blind using VoiceOver will not have access to that action, because VoiceOver reserves the double tap to activate an item.
  • If a developer assigns a swipe right as the only way to open a menu, the VoiceOver user will not be able to do that action, because VoiceOver takes over the right swipe as a way to move from element to element. To avoid this problem, the developer could ensure there is a mobile menu button that works with touch as another way to bring up the menu.
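The redundant-control approach in the second example can be sketched as follows. This is an illustrative sketch only, not proposed normative text; `openMenu`, `wireMenu`, and `interpretSwipe` are hypothetical names, and the swipe threshold is an assumption.

```javascript
// Sketch: the same openMenu() action is reachable two ways, so a user
// whose screen reader remaps the custom swipe gesture can still
// activate it via a plain, focusable button (which assistive technology
// activates through the normal click event).

function interpretSwipe(dx, dy, threshold = 50) {
  // Map a predominantly horizontal right swipe to the "open-menu" action.
  if (dx > threshold && Math.abs(dy) < threshold) return 'open-menu';
  return null;
}

function wireMenu(button, surface, openMenu) {
  // Path 1: standard activation, which works with VoiceOver/TalkBack.
  button.addEventListener('click', openMenu);

  // Path 2: custom swipe gesture for touch users without AT running.
  let startX = 0, startY = 0;
  surface.addEventListener('touchstart', (e) => {
    startX = e.touches[0].clientX;
    startY = e.touches[0].clientY;
  });
  surface.addEventListener('touchend', (e) => {
    const t = e.changedTouches[0];
    if (interpretSwipe(t.clientX - startX, t.clientY - startY) === 'open-menu') {
      openMenu();
    }
  });
}
```

The key design point is that the gesture is an enhancement layered over a conventional control, never the sole path to the function.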
Related Resources


Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.5.1
  • Techniques
    • M028 Using standard one touch controls
    • M027 Providing touch access for custom controls
  • Failures
    • FM002 Infinite scroll gesture is not available with system screen reader
    • FM003 Component can be opened but cannot be closed with touch when a system screen reader is running

[Proposed New MOBILE Success Criteria] No Touch Trap: When touch input behavior is modified by platform assistive technology and focus can be moved to a component, then focus can be moved away from the component using sequential navigation gestures of the assistive technology, or the user is advised of the method for moving focus away in the sequential focus order. (Level A)

[Proposed text for Understanding] Intent of this Success Criterion

Swipe gestures are useful for displaying dynamic content. Giving focus to dynamic content via a swipe gesture also requires a gesture or method to move focus back to prior content, either by swiping to return or by informing the user of the method needed to return. These methods must work with assistive technology. Explore by touch is not a valid solution, because the user can miss content without knowing it was missed. This success criterion is similar to WCAG 2.1.2 No Keyboard Trap; however, it expands to all sequential navigation methods and addresses touch-specific failure modes. Relying on touch-to-explore features of mobile assistive technologies to escape such a trap is still a failure under this criterion, because the next sequential item may be offscreen, an explore-by-touch gesture may cause users to get lost on the page, or the user may be relying on other means of sequential navigation such as a keyboard or switch control.

Specific Benefits of Success Criterion 2.5.2
  • Content that is after or outside of infinite scrolling content, or off the visible screen, can be accessed by screen reader users.
Examples of Success Criterion 2.5.2
  • Infinite scroll of content where there is additional content in the footer, but the user with assistive technology (e.g. a screen reader) cannot move focus to the footer, and therefore cannot read the footer content and may not even know that it exists.
  • An infinite carousel advances with a swipe gesture. The instructions indicate that a touch outside the carousel will exit the carousel. The user can touch outside the carousel with assistive technology (e.g. a screen reader) turned on.
  • Popup dialog that cannot be closed when assistive technology is turned on.
Related Resources

Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.5.2
  •  FM003 Component can be opened but cannot be closed with touch when a system screen reader is running

[Proposed New MOBILE Success Criteria] Accidental Activation:

For single touch and pointer activation, at least one of the following is true: (Level A)

  1. Activation is on the up-event, either explicitly or implicitly as a platform's generic activation/click event;
  2. A mechanism is available that allows the user to choose the up-event as an option;
  3. Confirmation is provided, which can dismiss activation;
  4. Activation is reversible; or
  5. Timing of activation is essential and waiting for the up-event would invalidate the activity.

Note: This success criterion applies when platform assistive technology (e.g. a screen reader) that remaps touch gestures is not turned on.

[Proposed text for Understanding] Intent of this Success Criterion

People with various disabilities can inadvertently initiate touch or mouse events with unwanted results. Up-event activation refers to the activation of a component when the trigger stimulus is released. For example, for touchscreen interaction the event would be triggered when a finger is lifted from the touchscreen at the end of a tap. There is a distinction between when someone touches a screen and when they remove their finger. On a mouse, there is a difference between mouse down (initiating a click) and mouse up (releasing the button). Authors can reduce the problem of users inadvertently triggering an action by making activation occur on the up-event. This gives users the opportunity to move their finger (or mouse/pointer) away from the wrong target once they hit it. If touch-down activation is necessary, there are several options:

  • A confirmation alert allows the user to change their mind
  • An undo button or other mechanism allows the user to reverse the action.
  • A setting in preferences allows the user to choose whether activation happens on the down or up event.

Generic platform activation/click events generally trigger on the up-event, and when they do, they are also allowed. For example, in the case of mouse interactions, the "click" event in JavaScript triggers on release of the primary mouse button and is an example of an implicit up-event. An exception would be an activity that would be invalid if activation waited for the up-event. Examples include a piano program or a skeet-shooting game, where waiting for the up-event would invalidate the activation. Long-press activation and 3D touch can be used as long as one of the alternatives listed above is present, and there is another conforming way to perform the action provided by the control.

Anywhere we say "touch and pointer," we recognize that touch is included in the definition of pointer, but we include touch for clarity and ease of reading.

On different platforms the up-event may be called different things, such as "touchend", "onclick" or "mouseup".
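As a non-normative sketch of the up-event pattern, the handler below activates only when the pointer is released inside the target, so a user can slide away from a wrong target to cancel. `attachUpEventActivation` and `isWithin` are hypothetical helper names, not part of any platform API.

```javascript
// Sketch of up-event activation: activate only when the pointer is
// released inside the target's bounds, giving the user a chance to
// slide away to cancel. isWithin() is the testable core; the DOM
// wiring in attachUpEventActivation() is illustrative only.

function isWithin(point, rect) {
  return point.x >= rect.left && point.x <= rect.right &&
         point.y >= rect.top && point.y <= rect.bottom;
}

function attachUpEventActivation(element, onActivate) {
  let tracking = false;
  element.addEventListener('pointerdown', () => { tracking = true; });
  element.addEventListener('pointerup', (e) => {
    const rect = element.getBoundingClientRect();
    // Activate only if the release point is still inside the target.
    if (tracking && isWithin({ x: e.clientX, y: e.clientY }, rect)) {
      onActivate();
    }
    tracking = false;
  });
}
```

In most cases simply using the platform's generic click event achieves the same result, since it already fires on release; the explicit version above is only needed for fully custom controls.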

Specific Benefits of Success Criterion 2.5.3
  • Makes it easier for all users to recover from hitting the wrong target.
  • Helps people with visual disabilities, cognitive limitations, and motor impairments by reducing the chance that a control will be accidentally activated or action will occur unexpectedly.
  • Individuals who are unable to detect changes of context are less likely to become disoriented while navigating a site
Examples of Success Criterion 2.5.3
  • Interface elements that require a single tap or a long press as input will only trigger the corresponding event when the finger is lifted inside that element.
  • The user interface control performs an action when the user lifts the finger away from the control rather than when the user first touches the control.
  • A phone dialing application has number keys that are activated on touch down. A user can undo an unwanted number by hitting the backspace button to delete a mistaken digit.
Related Resources

Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.5.3
  • M029 @@wiki link@@ Touch events are only triggered when touch is removed from a control
  • FM001 Failure of SC 2.5.3 due to activating a button on initial touch location rather than the final touch location
  • Failure @@to be written@@: Actions are only available through long press or 3D touch

[Proposed New MOBILE Success Criteria] Target Size: The size of the target in relation to the visible display at the default viewport size is at least: (Level AA)
  • 44px by 44px for pointer inputs with coarse pointing accuracy (such as a touchscreen)
  • 20px by 20px for pointer inputs with fine pointing accuracy (such as a mouse, trackpad or stylus)
where px is a CSS pixel.

Note: In situations where multiple input mechanisms with both coarse and fine pointing accuracy are present, and where no mechanism exists to determine the user's current input (either manual, e.g. providing the user with an explicit toggle to switch to a "touch-optimized" interface, or automatic, e.g. an application switching dynamically based on the type of input the user is currently using), targets must follow the larger minimum dimensions for coarse pointer inputs.

Note: This success criterion applies when platform assistive technology (e.g. magnification) is not turned on.

Editor's Note: We are researching the 20px value for mouse/pointer and 44px for touch. We are seeking research on this and outside input. We also have to define the difference between a touch event and a mouse event, particularly in html and responsive environments.

Editor's Note: this criterion borrows the distinction of "coarse" and "fine" pointing devices from Media Queries Level 4 - Pointing Device Quality: the pointer feature
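As a non-normative sketch of the notes above, scripted detection could use the `(pointer: coarse)` media feature from Media Queries Level 4, falling back to the larger coarse minimum whenever the input type cannot be determined. `minTargetPx` and `detectMinTargetPx` are hypothetical names.

```javascript
// Sketch: choose the minimum target dimension (in CSS px) based on
// pointer accuracy, per the proposed 44px (coarse) / 20px (fine) values.

function minTargetPx(hasCoarsePointer) {
  return hasCoarsePointer ? 44 : 20;
}

// Illustrative browser wiring. When the environment or the pointer
// type is unknown, the safe fallback is the larger coarse minimum,
// mirroring the note about mixed coarse/fine input situations.
function detectMinTargetPx() {
  const coarse = typeof window === 'undefined' ||
    !window.matchMedia ||
    window.matchMedia('(pointer: coarse)').matches ||
    !window.matchMedia('(pointer: fine)').matches;
  return minTargetPx(coarse);
}
```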

[Proposed text for Understanding] Intent of this Success Criterion

The intent of this success criterion is to help users who may have trouble activating a small target because of hand tremors, limited dexterity or other reasons. If the target is too small, it may be difficult to aim at the target. Mice and similar pointing devices can be hard to use for these users, and a larger target will help them greatly in having positive outcomes on the web page.

Touch is particularly problematic as it is an input mechanism with coarse precision. Users lack the same level of fine control as on inputs such as a mouse or stylus. A finger is larger than a mouse pointer, and generally obstructs the user's view of the precise location on the screen that is being touched/activated.


The issue can be further complicated for responsive/mobile sites, which need to accommodate different types of fine and coarse inputs (e.g. a site that can be accessed both on a traditional desktop/laptop with a mouse and on a tablet or mobile phone with a touch screen).

While this criterion defines a minimum target size, even larger sizes are recommended to reduce the possibility of unintentional actions. This is particularly relevant if any of the following are true:

  • the control is used frequently;
  • the result of the interaction cannot be easily undone;
  • the control is positioned where it will be difficult to reach, or is near the edge of the screen;
  • the control is part of a sequential task.
Specific Benefits of Success Criterion 2.5.4
  • Users with mobility impairments, such as hand tremors
  • Users who find fine motor movements difficult
  • Users who access a device using one hand
  • Users with large fingers, or who are operating the device with only a part of their finger or knuckle
  • Users who have low vision may better see the target
Examples of Success Criterion 2.5.4
  • Three buttons are on-screen and the visible portion of each button is 44px by 44px
Related Resources
Techniques and Failures for Success Criterion 2.5.4
  • M030 Multiple Elements: When multiple elements perform the same action or go to the same destination (e.g. link icon with link text), these should be contained within the same actionable element. This increases the touch target size for all users and benefits people with dexterity impairments. It also reduces the number of redundant focus targets, which benefits people using screen readers and keyboard/switch control.
  • M002 Touch Target: Ensuring that touch targets are at least 44px by 44px. 
  • FM005 Failure: touch target is less than 44px x 44px at the default viewport size

[Proposed New MOBILE Guideline] Guideline 2.6: Make it easier to use the physical features of the phone.

Intent of Guideline 2.6

[Proposed text for Understanding]

[Proposed New MOBILE Success Criteria] Device manipulation: When device manipulation gestures are provided, touch and keyboard operable alternative control options are available. (Level AA)

[Proposed text for Understanding] Intent of this Success Criterion

While the device's operating system is responsible for providing alternatives for using the device buttons and gestures (e.g. shaking, holding, proximity, touch, voice, walking, looking at, angle of holding, etc.), when the content makes use of these gestures in a customized manner not recognized by the operating system, touch or keyboard alternatives must be provided. For example, a shaking motion to undo or cancel may be unavailable to a person whose mobile device is secured to a wheelchair.

Specific Benefits of Success Criterion 2.6.1
Examples of Success Criterion 2.6.1
  • Example 1
Related Resources

Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.6.1
  • M010: Allowing users to interact using device buttons (e.g. arrow keys, ok button)
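A non-normative sketch of pairing a custom device-manipulation gesture with a conventional alternative control follows; `isShake`, `wireUndo`, and the acceleration threshold are hypothetical, and the shake heuristic is a deliberate simplification.

```javascript
// Sketch: a custom "shake to undo" gesture paired with a touch/keyboard
// alternative (an ordinary Undo button), so users whose device is
// mounted in a fixed position are not excluded.

function isShake(accel, threshold = 15) {
  // Treat a large total acceleration magnitude as a shake.
  const magnitude = Math.sqrt(accel.x ** 2 + accel.y ** 2 + accel.z ** 2);
  return magnitude > threshold;
}

function wireUndo(undoButton, undo) {
  // Alternative control: an ordinary button, operable by touch,
  // keyboard, and assistive technology.
  undoButton.addEventListener('click', undo);

  // Custom gesture: shake the device to undo.
  if (typeof window !== 'undefined' && 'DeviceMotionEvent' in window) {
    window.addEventListener('devicemotion', (e) => {
      const a = e.accelerationIncludingGravity;
      if (a && isShake(a)) undo();
    });
  }
}
```

The gesture here is a convenience layered on top of the button, not the only way to reach the undo function.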

[Proposed New MOBILE Success Criteria] Visual gestures to the camera

[Proposed New MOBILE Success Criteria] Voice control using microphone

[Proposed New MOBILE Guideline] Guideline 2.7: Make it practical for speech input users to operate all functionality

[Proposed text for Understanding]

[Proposed New MOBILE Success Criteria] Speech Input: All functionality of the content (including touch and gesture) is operable through the keyboard, and does not obstruct a user's ability to access the keyboard commands through speech input. (Level A)

[Proposed text for Understanding]

Intent of this Success Criterion

One means of speech input is speaking keyboard controls. Users can also write custom speech commands that can call keyboard controls. This means that, in general, anything that is accessible by keyboard is accessible by speech.

Specific Benefits of Success Criterion 2.7.1
Examples of Success Criterion 2.7.1
  • Example 1
Related Resources

Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.7.1

[Proposed New MOBILE Success Criteria] Single key shortcut alternative: The user can change any single-key shortcut to an alternative control consisting of a string of symbols and letters.

[Proposed text for Understanding]

Intent of this Success Criterion

While using single letter keys as controls might be appropriate and efficient for keyboard users, single-key shortcuts are disastrous for speech users, who can inadvertently set off multiple controls by speaking a single phrase. To avoid excluding speech users, single-key shortcuts must be accompanied by a mechanism that allows the user to customize them. For example, the user could change the single-key shortcut "r" for reply to "+r" or to "This Reply".
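A non-normative sketch of such a remapping mechanism: shortcuts live in a remappable table rather than being hard-coded. `createShortcuts` and its methods are hypothetical names.

```javascript
// Sketch: shortcuts stored in a remappable table, so a speech input
// user can change a single-key shortcut like "r" into a multi-character
// sequence like "+r" that will not be triggered by ordinary speech.

function createShortcuts() {
  const map = new Map();
  return {
    // Associate a key string with an action.
    bind(keys, action) { map.set(keys, action); },
    // Move an existing binding to a new key string.
    remap(oldKeys, newKeys) {
      const action = map.get(oldKeys);
      if (action === undefined) return false;
      map.delete(oldKeys);
      map.set(newKeys, action);
      return true;
    },
    // Fire the action for a key string, if bound.
    trigger(keys) {
      const action = map.get(keys);
      if (action) { action(); return true; }
      return false;
    },
  };
}
```

For example, after `remap('r', '+r')`, the bare key "r" no longer fires the reply action, so dictated words containing "r" become harmless.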

Specific Benefits of Success Criterion 2.7.2
Examples of Success Criterion 2.7.2
  • Example 1
Related Resources

Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 2.7.2

WCAG Principle 3: Understandable - Information and the operation of user interface must be understandable.

WCAG Guideline 3.1 Readable: Make text content readable and understandable.

3.1.1 Language of Page

3.1.2 Language of Parts

3.1.3 Unusual Words

3.1.4 Abbreviations

3.1.5 Reading Level

3.1.6 Pronunciation

WCAG Guideline 3.2 Predictable: Make Web pages appear and operate in predictable ways.

3.2.1 On Focus

3.2.2 On Input

3.2.3 Consistent Navigation

WCAG Guideline 3.3 Input Assistance: Help users avoid and correct mistakes.

3.3.1 Error Identification

3.3.2 Labels or Instructions

Mobile Technique proposed for WCAG 3.3.2

3.3.3 Error Suggestion

3.3.4 Error Prevention (Legal, Financial, Data)

3.3.5 Help

3.3.6 Error Prevention (All)

[Proposed New MOBILE Guideline] Guideline 3.4 Make content usable in all device orientations.

3.4.1 Orientation: Orientation of the content is not locked to landscape or portrait, except where orientation is essential.

[Proposed text for Understanding] Intent of this Success Criterion

Some mobile applications automatically set the screen to a particular display orientation (landscape or portrait) and expect that users will respond by rotating the mobile device to match. However, some users have their mobile devices mounted in a fixed orientation (e.g. on the arm of a power wheelchair).

Therefore, mobile application developers should try to support both orientations.
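A non-normative sketch of adapting layout to the current orientation rather than locking it; `chooseColumns` and `watchOrientation` are hypothetical names, and the two-column rule is an arbitrary illustration.

```javascript
// Sketch: respond to orientation changes instead of locking the screen,
// so a user with a device mounted in a fixed orientation still gets a
// usable layout.

function chooseColumns(width, height) {
  // Landscape gets two columns, portrait a single column.
  return width > height ? 2 : 1;
}

function watchOrientation(applyColumns) {
  if (typeof window === 'undefined' || !window.matchMedia) return;
  const mq = window.matchMedia('(orientation: landscape)');
  const apply = () =>
    applyColumns(chooseColumns(window.innerWidth, window.innerHeight));
  mq.addEventListener('change', apply);
  apply();
}
```

In many cases the same effect is achieved declaratively with CSS orientation media queries; the scripted version is only needed when layout decisions cannot be expressed in CSS alone.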

Specific Benefits of Success Criterion 3.4.1
Examples of problems
Examples of essential

Related Resources

Resources are for information purposes only, no endorsement implied.

(none currently documented)

Techniques and Failures for Success Criterion 3.4.1


WCAG Principle 4: Robust - Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, including assistive technologies.

WCAG Guideline 4.1 Compatible: Maximize compatibility with current and future user agents, including assistive technologies.

4.1.1 Parsing

4.1.2 Name, Role, Value

Mobile Technique proposed for WCAG 4.1.2

[Proposed New MOBILE Success Criteria] 4.1.3 Non-interference of AT: Content does not interfere with the default functionality of platform-level assistive technology.

Mobile Technique proposed for WCAG 4.1.3

Techniques with No Home

M003 Touch Activation: Activating elements via the touchend event



Glossary

device manipulation
Moving or controlling the device with hands, body, or machine. Device manipulation includes methods of controlling input to the mobile device other than the touch screen, such as pressing a physical button on the device, shaking, holding, proximity, touch, walking, angle of holding, input via the accelerometer, etc. Gestures to the camera and voice input to the microphone are addressed separately.
CSS pixel (px)
A CSS pixel based on the ideal viewport device-width. [editor note: we need a better definition of CSS pixel]
platform assistive technology
Platform assistive technology is built into the operating system and is generally updated through OS updates. Examples include VoiceOver on iOS and TalkBack on Android.
pointer input
Generic term for an input device (such as a mouse, stylus, or touchscreen) that can target a specific coordinate (or set of coordinates) on a screen. See also [Pointer Events].
touch target
Region of the display that will accept a touch action. If a portion of a touch target is overlapped by another touch target such that it cannot receive touch actions, then that portion is not considered a touch target for purposes of touch target measurements.
up event, touchend event
The activation of a component when the trigger stimulus is released. Example: for touchscreen interaction, the event is triggered when a finger is lifted from the touchscreen at the end of a tap.


Thanks to current members of the Task Force

Allan, Jim
Avila, Jonathan
Babinszki, Tom
Brough, Matthew
Cooper, Michael
Fischer, Detlev
Foliot, John
Garrison, Alistair
Johlic, Marc
Kirkpatrick, Andrew
Lauke, Patrick
MacDonald, David
McMeeking, Chris
Patch, Kimberly
Pluke, Mike
Richards, Jan
Smith, Alan
Spellman, Jeanne
Vaishnav, Jatin
Velleman, Eric
Wahlbin, Kathleen

Thanks to prior members of the Task Force

Anderson, Kathleen
Evans, Gavin
Kaja, Kiran
LaHart, Andrew
McGrane, Karen
Shebanek, Mike
Shiver, Brent
Thiessen, Peter
Wu, Wei
Zehe, Marco