Generic Sensor API

Editor’s Draft,

This version:
https://w3c.github.io/sensors/
Latest published version:
https://www.w3.org/TR/generic-sensor/
Previous Versions:
Feedback:
public-device-apis@w3.org with subject line “[generic-sensor] … message topic …” (archives)
GitHub (new issue, level 1 issues, all issues)
Editors:
Rick Waldron (JS Foundation)
Mikhail Pozdnyakov (Intel Corporation)
Alexander Shalamov (Intel Corporation)
Former Editor:
Tobie Langel (Codespeaks, formerly on behalf of Intel Corporation)
Other:
Test suite, latest version history, previous version history

Abstract

This specification defines a framework for exposing sensor data to the Open Web Platform in a consistent way. It does so by defining a blueprint for writing specifications of concrete sensors along with an abstract Sensor interface that can be extended to accommodate different sensor types.

Status of this document

This is a public copy of the editors’ draft. It is provided for discussion only and may change at any moment. Its publication here does not imply endorsement of its contents by W3C. Don’t cite this document other than as work in progress.

If you wish to make comments regarding this document, please send them to public-device-apis@w3.org (subscribe, archives). When sending e-mail, please put the text “generic-sensor” in the subject, preferably like this: “[generic-sensor] …summary of comment…”. All comments are welcome.

This document was produced by the Device and Sensors Working Group.

This document was produced by a group operating under the 5 February 2004 W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 1 March 2017 W3C Process Document.

1. Introduction

Increasingly, sensor data is used in application development to enable new use cases such as geolocation, counting steps or head-tracking. This is especially true on mobile devices where new sensors are added regularly.

Exposing sensor data to the Web has so far been both slow-paced and ad-hoc. Few sensors are already exposed to the Web. When they are, it is often in ways that limit their possible use cases (for example by exposing abstractions that are too high-level and which don’t perform well enough). APIs also vary greatly from one sensor to the next which increases the cognitive burden of Web application developers and slows development.

The goal of the Generic Sensor API is to promote consistency across sensor APIs, enable advanced use cases thanks to performant low-level APIs, and increase the pace at which new sensors can be exposed to the Web by simplifying the specification and implementation processes.

This specification lacks an informative section with examples for developers. It should demonstrate different uses of the API, including use in conjunction with requestAnimationFrame.
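As a hedged sketch of what such an example could look like (the Gyroscope class, its z attribute, and the 60 Hz frequency are assumptions drawn from extension specifications), the pattern below stores the latest sample in the 'reading' handler but consumes it once per animation frame:

```javascript
// Pure helper: angular velocity (rad/s) to a per-frame rotation step in
// degrees, assuming a given frame duration. Testable without a browser.
function rotationStepDeg(angularVelocityRadPerSec, frameSeconds = 1 / 60) {
  return angularVelocityRadPerSec * frameSeconds * 180 / Math.PI;
}

// Browser-only part: sample the sensor in its 'reading' handler, but only
// consume the latest value inside requestAnimationFrame, so rendering stays
// in sync with the display refresh rate rather than the sensor rate.
function startHeadTracking(element) {
  const gyro = new Gyroscope({ frequency: 60 });
  let latestZ = 0;
  let angleDeg = 0;
  gyro.addEventListener('reading', () => { latestZ = gyro.z; });
  gyro.start();
  function frame() {
    angleDeg += rotationStepDeg(latestZ);
    element.style.transform = `rotate(${angleDeg}deg)`;
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```

Decoupling rendering from the sensor's delivery rate avoids redundant layout work when the sensor fires faster than the display refreshes.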

2. Scope

This section is non-normative.

The scope of this specification is currently limited to specifying primitives which enable exposing data from local sensors.

Exposing remote sensors or sensors found on personal area networks (e.g. Bluetooth) is out of scope. As work in these areas matures, common lower-level primitives may be found, in which case this specification will be updated accordingly. This should have little to no effect on implementations, however.

This specification also does not currently expose a sensor discovery API. This is because the limited number of sensors currently available to user agents does not warrant such an API. Using feature detection, such as described in §4 A note on Feature Detection of Hardware Features, is good enough for now. A subsequent version of this specification might specify such an API, and the current API has been designed with this in mind.

3. Background

This section is non-normative.

This section is ill-named. It principally covers default sensors and explains the reasoning behind them. It should be renamed accordingly and moved, either to another section of the spec or to an external explainer document.

The Generic Sensor API is designed to make the most common use cases straightforward while still enabling more complex use cases.

Most devices deployed today do not carry more than one sensor of each sensor type. This shouldn’t come as a surprise, since use cases for more than one sensor of a given type are rare and generally limited to specific sensor types, such as proximity sensors.

The API therefore makes it easy to interact with the device’s default (and often unique) sensor for each type simply by instantiating the corresponding Sensor subclass.

Indeed, without specific information identifying a particular sensor of a given type, the default sensor is chosen.

Listening to geolocation changes:
let sensor = new GeolocationSensor({ accuracy: "high" });

sensor.onreading = function(event) {
    var coords = [sensor.latitude, sensor.longitude];
    updateMap(null, coords, sensor.accuracy);
};

sensor.onerror = function(error) {
    updateMap(error);
};
sensor.start();

Note: extensions to this specification may choose not to define a default sensor when doing so wouldn’t make sense. For example, it might be difficult to agree on an obvious default sensor for proximity sensors.

In cases where multiple sensors of the same type may coexist on the same device, specification extensions will have to define ways to uniquely identify each one.

For example, checking the pressure of the left rear tire:
var sensor = new DirectTirePressureSensor({ position: "rear", side: "left" });
sensor.onreading = _ => console.log(sensor.pressure);
sensor.start();

4. A note on Feature Detection of Hardware Features

This section is non-normative.

Feature detection is an established Web development best practice. Resources on the topic are plentiful on and offline and the purpose of this section is not to discuss it further, but rather to put it in the context of detecting hardware-dependent features.

Consider the below feature detection examples:

if (typeof Gyroscope === "function") {
    // run in circles...
}

if ("ProximitySensor" in window) {
    // watch out!
}

if (window.AmbientLightSensor) {
    // go dark...
}

// etc.

All of these tell you something about the presence and possible characteristics of an API. They do not tell you anything, however, about whether that API is actually connected to a real hardware sensor, whether that sensor works, whether it’s still connected, or even whether the user is going to allow you to access it. Note you can check the latter using the Permissions API [PERMISSIONS].

In an ideal world, information about the underlying status would be available upfront. The problem with this is twofold. First, getting this information out of the hardware is costly, in both performance and battery time, and would sit in the critical path. Secondly, the status of the underlying hardware can evolve over time: the user can revoke permission, the connection to the sensor may be severed, the operating system may decide to limit sensor usage below a certain battery threshold, etc.

Therefore, an effective strategy is to combine feature detection, which checks whether an API for the sought-after sensor actually exists, and defensive programming which includes:

  1. checking for errors thrown when instantiating a Sensor object,

  2. listening to errors emitted by it,

  3. handling all of the above gracefully so that the user’s experience is enhanced by the possible usage of a sensor, not degraded by its absence.

try { // No need to feature detect thanks to try..catch block.
    var sensor = new GeolocationSensor();
    sensor.onerror = error => gracefullyDegrade(error);
    sensor.onreading = _ => updatePosition(sensor.latitude, sensor.longitude);
    sensor.start();
} catch(error) {
    gracefullyDegrade(error);
}

5. Security and privacy considerations

Sensor readings are sensitive data and could become the target of various attacks from malicious Web pages. Before discussing mitigation strategies, we briefly enumerate the main types of privacy and security threats posed by sensors. [MOBILESENSORS] categorizes the main threats into location tracking, eavesdropping, keystroke monitoring, device fingerprinting, and user identification. This categorization is a good fit for this specification.

The risk of successful attack can increase when sensors are used with each other, in combination with other functionality, or when used over time, specifically with the risk of correlation of data and user identification through fingerprinting. Web application developers using these JavaScript APIs should consider how this information might be correlated with other information and the privacy risks that might be created. The potential risks of collection of such data over a longer period of time should also be considered.

Variations in sensor readings as well as event firing rates offer the possibility of fingerprinting to identify users. User agents may reduce the risk by limiting event rates available to web application developers.
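The rate limit mentioned above can be illustrated with a minimal sketch (illustrative only; this specification does not mandate any particular shape): an event is delivered only if enough time has elapsed since the previous one.

```javascript
// Illustrative rate limiter: deliver an event only if at least one period
// of the capped rate has passed since the last delivered event.
// lastFiredAtMs is null when no event has been delivered yet.
function shouldDeliver(lastFiredAtMs, nowMs, cappedRateHz) {
  if (lastFiredAtMs === null) return true;
  return nowMs - lastFiredAtMs >= 1000 / cappedRateHz;
}
```

For example, with a 60 Hz cap (a period of roughly 16.7 ms), a reading arriving 10 ms after the previous delivery would be dropped, while one arriving 20 ms after would be delivered.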

Note: do we really want this mitigation strategy?

Frequency polling in periodic reporting mode might allow the fingerprinting of hardware or implementation types, by probing which actual frequencies are supported by the platform.

Minimizing the accuracy of a sensor’s readout generally decreases the risk of fingerprinting. User agents should not provide unnecessarily verbose readouts of sensors data. Each sensor type should be assessed individually.

If the same JavaScript code using the API can be used simultaneously in different window contexts on the same device, it may be possible for that code to correlate the user across those contexts, creating unanticipated tracking mechanisms.

User agents should consider providing the user an indication of when the sensor is used and allowing the user to disable it. Additionally, user agents may consider allowing the user to verify past and current sensor use patterns.

Web application developers that use sensors should perform a privacy impact assessment of their application taking all aspects of their application into consideration.

The ability to detect the full working set of sensors on a device can form an identifier and could be used for fingerprinting.

A combination of selected sensors can potentially be used to form an out of band communication channel between devices.

Sensors can potentially be used in cross-device linking and tracking of a user.

5.1. Types of privacy and security threats

This section is non-normative.

5.1.1. Location Tracking

Under this type of threat, attacks use sensor readings to locate the device without using GPS or any other location sensors. For example, accelerometer data can be used to infer the location of smartphones: statistical models yield an estimated trajectory, and map-matching algorithms can then be used to obtain predicted location points (within a 200 m radius) [MOBILESENSORS].

5.1.2. Eavesdropping

Recovering speech from gyroscope readings is an example of an eavesdropping attack. See [GYROSPEECHRECOGNITION].

5.1.3. Keystroke Monitoring

Many user inputs can be inferred from sensor readings; this includes a wide range of attacks on user PINs, passwords, and lock patterns (and even touch actions such as click, scroll, and zoom) using motion sensors. These attacks normally train a machine learning algorithm to discover such information about the users. See [STEALINGPINSVIASENSORS].

5.1.4. Device Fingerprinting

Sensors can provide information that can uniquely identify the device using those sensors. Every concrete sensor model has minor manufacturing imperfections and differences that are unique to it. These manufacturing variations and imperfections can be used to fingerprint the device [MOBILESENSORS].

5.1.5. User Identifying

Sensor readings can be used to identify the user, for example via inferring individual walking patterns from smartphone or wearable device motion sensors' data.

5.2. Mitigation Strategies

This section is non-normative.

This section gives a high-level presentation of some of the mitigation strategies specified in the normative sections of this specification.

5.2.1. Secure Context

Sensor readings are explicitly flagged by the Secure Contexts specification [POWERFUL-FEATURES] as a high-value target for network attackers. Thus all interfaces defined by this specification or extension specifications are only available within a secure context.

5.2.2. Top-Level Browsing Context

To avoid the privacy risk of sharing sensor readings with contexts unfamiliar to the user, sensor readings are only available in the top-level browsing context.

Note: [FEATURE-POLICY] will allow securely relaxing those restrictions once it matures.

5.2.3. Losing Focus

When the top-level browsing context loses focus, or when a nested browsing context of a different origin gains focus (for example when the user carries out an in-game purchase using a third party payment service from within an iframe), the top-level browsing context suddenly finds itself in a position to carry out a skimming attack against the browsing context that has gained focus.

To mitigate this threat, readings of sensors running in a top-level browsing context are not delivered in such cases. A security check is run before sensor readings are delivered to ensure this.

5.2.4. Visibility State

Sensor readings are only available in browsing contexts that are visible to the user, that is, whose visibility state is "visible". A security check is run before sensor readings are delivered to ensure that.
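An application can mirror this rule in its own code, releasing the sensor when it would not receive readings anyway. This is a sketch of a pattern, not anything mandated by the specification; the sensor object is assumed to be any Sensor subclass instance.

```javascript
// Pure helper mirroring the rule above: readings are delivered only while
// the browsing context's visibility state is "visible".
function canReceiveReadings(visibilityState) {
  return visibilityState === 'visible';
}

// Browser-only part: stop the sensor when the page is hidden (it would not
// receive readings anyway) and restart it when the page becomes visible.
function manageSensorVisibility(sensor) {
  document.addEventListener('visibilitychange', () => {
    if (canReceiveReadings(document.visibilityState)) {
      sensor.start();
    } else {
      sensor.stop();
    }
  });
  if (canReceiveReadings(document.visibilityState)) sensor.start();
}
```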

Certain use cases require sensors to have background access. Using a more complex PermissionDescriptor (e.g. with a boolean allowBackgroundUsage = false dictionary member) might be the solution to relax this restriction.

5.2.5. Permissions API

Access to sensor readings is controlled by the Permissions API [PERMISSIONS]. User agents use a number of criteria to grant access to the readings. Note that access can be granted without prompting the user.
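A sketch of checking permission before constructing a sensor could look like the following. The permission name 'accelerometer' and the Accelerometer class are assumptions; extension specifications define the actual PermissionName for each sensor type.

```javascript
// Pure helper: map a PermissionState to an application action.
function actionForPermission(state) {
  return state === 'denied' ? 'degrade' : 'try-start';
}

// Browser-only part: query the Permissions API, then construct the sensor
// only if access may succeed. A 'prompt' state still allows trying, since
// starting the sensor may itself trigger a permission prompt.
async function startIfAllowed() {
  const status = await navigator.permissions.query({ name: 'accelerometer' });
  if (actionForPermission(status.state) === 'degrade') return null;
  // Construction and start() can still fail asynchronously, so an error
  // handler is attached regardless of the permission state.
  const sensor = new Accelerometer();
  sensor.onerror = event => console.log('Error: ' + event.error.name);
  sensor.start();
  return sensor;
}
```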

5.3. Mitigation strategies applied on a case by case basis

Each sensor type will need to be assessed individually, taking into account the use cases it enables and its particular threat profile. While some of the below mitigation strategies are effective for certain sensors, they might also hinder or altogether prevent certain use cases.

Note: These mitigation strategies can be applied constantly or temporarily, for example when the user is carrying out specific actions, when other APIs which are known to amplify the level of the threat are in use, etc.

5.3.1. Limit maximum sampling frequency

User agents may mitigate certain threats by limiting the maximum sampling frequency. What upper limit to choose depends on the sensor type, the kind of threats the user agent is trying to protect against, the expected resources of the attacker, etc.

Limiting the maximum sampling frequency prevents use cases which rely on low latency or high data density.

5.3.2. Stop the sensor altogether

This is obviously a last-resort solution, but it can be extremely effective if it is temporary, for example to prevent password skimming attempts when the user is entering credentials on a different origin ([rfc6454]) or in a different application.

5.3.3. Limit number of delivered readings

An alternative to limiting the maximum sampling frequency is to limit the number of sensor readings delivered to Web application developers, regardless of what frequency the sensor is polled at. This allows use cases which have low latency requirements to increase sampling frequency without increasing the amount of data provided.

Discarding intermediary readings prevents certain use cases, such as those relying on certain kinds of filters.

5.3.4. Reduce accuracy

Reducing the accuracy of sensor readings or sensor reading timestamps might also help mitigate certain threats, thus user agents should not provide unnecessarily verbose readouts of sensors data.

However, certain use cases require highly accurate readings, especially when operations carried out on the readings, or time deltas calculated from the timestamps, increase inaccuracies exponentially.

Note: while adding random bias to sensor readings has similar effects, it shouldn’t be used in practice as it is easy to filter out the added noise.

5.3.5. Keep the user informed about API use

User agents may choose to keep the user informed about current and past use of the API.

Note: this does not imply keeping a log of the actual sensor readings which would have issues of its own.

6. Concepts

6.1. Sensors

A sensor measures different physical quantities and provides corresponding raw sensor readings, which are a source of information about the user and their environment.

Each reading is composed of the values of the different physical quantities measured by the sensor at time tn.

Known, predictable discrepancies between raw sensor readings and the corresponding physical quantities being measured are corrected through calibration.

Known but unpredictable discrepancies need to be addressed dynamically through a process called sensor fusion.

Calibrated raw sensor readings are referred to as sensor readings, whether or not they have undergone sensor fusion.

6.2. Sensor Types

Different sensor types measure different physical quantities such as temperature, air pressure, heart-rate, or luminosity.

For the purpose of this specification we distinguish between high-level and low-level sensor types.

Sensor types which are characterized by their implementation are referred to as low-level sensors. For example a Gyroscope is a low-level sensor type.

Sensors named after their readings, regardless of the implementation, are said to be high-level sensors. For instance, geolocation sensors provide information about the user’s location, but the precise means by which this data is obtained is purposefully left opaque (it could come from a GPS chip, network cell triangulation, wifi networks, etc. or any combination of the above) and depends on various, implementation-specific heuristics. High-level sensors are generally the fruits of applying algorithms to low-level sensors—for example, a pedometer can be built using only the output of a gyroscope—or of sensor fusion.

That said, the distinction between high-level and low-level sensor types is somewhat arbitrary and the line between the two is often blurred. For instance, a barometer, which measures air pressure, would be considered low-level for most common purposes, even though it is the product of the sensor fusion of resistive piezo-electric pressure and temperature sensors. Exposing the sensors that compose it would serve no practical purpose; who cares about the temperature of a piezo-electric sensor? A pressure-altimeter would probably fall in the same category, while a nondescript altimeter—which could get its data from either a barometer or a GPS signal—would clearly be categorized as a high-level sensor type.

Because the distinction is somewhat blurry, extensions to this specification (see §10 Extensibility) are encouraged to provide domain-specific definitions of high-level and low-level sensors for the given sensor types they are targeting.

Sensor readings from different sensor types can be combined together through a process called sensor fusion. This process provides higher-level or more accurate data (often at the cost of increased latency). For example, the readings of a three-axis magnetometer needs to be combined with the readings of an accelerometer to provide a correct bearing.

Smart sensors and sensor hubs have built-in compute resources which allow them to carry out calibration and sensor fusion at the hardware level, freeing up CPU resources and lowering battery consumption in the process.

But sensor fusion can also be carried out in software. This is particularly useful when performance requirements can only be met by relying on application-specific data. For example, head tracking for virtual or augmented reality applications requires extremely low latency to avoid causing motion sickness. That low latency is best provided by using the raw output of a gyroscope and waiting for quick rotational movements of the head to compensate for drift.

Note: sensors created through sensor fusion are sometimes called virtual or synthetic sensors. However, this specification doesn’t make any practical difference between them, preferring instead to differentiate sensors by whether they describe the kind of readings produced (high-level sensors) or how the sensor is implemented (low-level sensors).

6.3. Reporting Modes

This feature is at risk. It is not clear whether there is value in splitting up sensor types between those that fire events at regular intervals and those which don’t.

Sensors have different reporting modes. When sensor readings are reported at regular intervals, at an adjustable frequency measured in hertz (Hz), the reporting mode is said to be periodic. On sensor types with support for periodic reporting mode, periodic reporting mode is triggered by requesting a specific frequency.

Sensor types which do not support periodic reporting mode are said to operate in an implementation specific way. When the reporting mode is implementation specific, sensor readings may be provided at regular intervals, irregularly, or only when a reading change is observed. This allows user agents more latitude to carry out power- or CPU-saving strategies, and support multiple hardware configurations. Periodic reporting mode, on the other hand, allows a much more fine-grained approach and is essential for use cases with, for example, low latency requirements.

Sensors which support periodic reporting mode fall back to implementation specific reporting mode when no requirements are made as to what frequency they should operate at.
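The fallback rule can be made concrete with a small sketch. The Magnetometer class is an assumption; the pure helper mirrors the rule that requesting a frequency selects periodic reporting mode.

```javascript
// Pure helper: which reporting mode a given options object requests,
// mirroring the fallback rule above.
function reportingMode(options = {}) {
  return options.frequency != null ? 'periodic' : 'implementation-specific';
}

// Browser-only usage (Magnetometer is an assumed extension sensor):
function makeSensor(options) {
  const sensor = new Magnetometer(options);
  console.log('Requested mode: ' + reportingMode(options));
  return sensor;
}
```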

Note: reporting mode is distinct from, but related to, sensor readings acquisition. If sensors are polled at regular intervals, as is generally the case, reporting mode can be either periodic or implementation specific. However, when the underlying implementation itself only provides sensor readings when it measures change, perhaps because it relies on smart sensors or sensor hubs, the reporting mode cannot be periodic, as that would require data inference.

This section lacks a description of the different data acquisition modes, notably polling vs. on change, both at the platform and hardware layers.

It would be useful to describe the process of sensor sampling and how increased sensor sampling frequency decreases latency.

A definition of sensor accuracy and how it affects threshold, and thus "on change" sensors would be useful.

6.4. Sampling Frequency

For the purpose of this specification, sampling frequency for a sensor is defined as a frequency at which the UA obtains sensor readings from the underlying platform.

7. Model

A diagram would really help here.

7.1. Sensor Type

A sensor type has an associated interface whose inherited interfaces contains Sensor.

A sensor type has a set of associated sensors.

If a sensor type has more than one sensor, it must have a set of associated identifying parameters to select the right sensor to associate with each new Sensor object.

A sensor type may have a default sensor.

A sensor type has an associated PermissionName.

Note: multiple sensor types may share the same PermissionName.

A sensor type has a permission revocation algorithm.

To invoke the permission revocation algorithm with PermissionName permission_name, run the following steps:

  1. For each sensor_type which has an associated PermissionName permission_name:

    1. For each sensor in sensor_type’s set of associated sensors,

      1. Invoke revoke sensor permission with sensor as argument.

7.2. Sensor

A sensor has an associated set of activated sensor objects. This set is initially empty.

The current browsing context's sensor has an associated latest reading map which holds the latest available sensor readings.

Note: User agents can share the latest reading map between different contexts only if the origins of these contexts' active documents are same origin-domain.

Any time the user agent obtains a new sensor reading for a sensor from the underlying platform, it invokes update latest reading with the sensor and the sensor reading as arguments.

The latest reading map contains an entry whose key is "timestamp" and whose value is a high resolution timestamp of the time at which the latest reading was obtained expressed in milliseconds that passed since the time origin. latest reading["timestamp"] is initially set to null, unless the latest reading map caches a previous reading.

The other entries of the latest reading map hold the values of the different quantities measured by the sensor. The keys of these entries must match the attribute identifier defined by the sensor type's associated interface. The return value of the attribute getter is easily obtained by invoking get value from latest reading with the object implementing the sensor type's associated interface and the attribute identifier as arguments.

The value of all latest reading entries is initially set to null.
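A non-normative model sketch of the latest reading map as a plain Map, with update and lookup operations mirroring the prose above (the attribute names are whatever the sensor type's associated interface defines):

```javascript
// Create a latest reading map: a "timestamp" entry plus one entry per
// attribute identifier, all initially null.
function createLatestReading(attributeNames) {
  const map = new Map([['timestamp', null]]);
  for (const name of attributeNames) map.set(name, null);
  return map;
}

// "Update latest reading": store the new reading's values and timestamp.
function updateLatestReading(map, reading, timestamp) {
  map.set('timestamp', timestamp);
  for (const [key, value] of Object.entries(reading)) map.set(key, value);
}

// "Get value from latest reading": the attribute getters' return values.
function getValueFromLatestReading(map, attribute) {
  return map.get(attribute);
}
```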

A sensor has an associated current sampling frequency which is initially null.

For a nonempty set of activated sensor objects the current sampling frequency is equal to optimal sampling frequency, which is estimated by the UA taking into account desired sampling frequencies of activated Sensors and sampling frequency bounds defined by the underlying platform.

Note: For example, the UA may estimate optimal sampling frequency as a Least Common Denominator (LCD) for a set of desired sampling frequencies capped by sampling frequency bounds defined by the underlying platform.
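One way to read the note's "Least Common Denominator" strategy, for integer frequencies, is as a least common multiple capped by the platform's bounds. This is an illustrative sketch only; user agents are free to estimate the optimal sampling frequency however they like.

```javascript
// Least common multiple via the Euclidean algorithm.
function gcd(a, b) { return b === 0 ? a : gcd(b, a % b); }
function lcm(a, b) { return a / gcd(a, b) * b; }

// One illustrative estimate: combine the desired integer frequencies with
// LCM so every consumer's rate divides the polling rate, then clamp to the
// platform's sampling frequency bounds.
function optimalSamplingFrequency(desiredHz, minHz, maxHz) {
  const combined = desiredHz.reduce(lcm, 1);
  return Math.min(Math.max(combined, minHz), maxHz);
}
```

For instance, sensors requesting 10 Hz and 15 Hz could both be served from a single 30 Hz polling loop, unless the platform caps the rate lower.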

8. API

8.1. The Sensor Interface

[SecureContext, Exposed=Window]
interface Sensor : EventTarget {
  readonly attribute boolean activated;
  readonly attribute DOMHighResTimeStamp? timestamp;
  void start();
  void stop();
  attribute EventHandler onreading;
  attribute EventHandler onactivate;
  attribute EventHandler onerror;
};

dictionary SensorOptions {
  double? frequency;
};

A Sensor object has an associated sensor.

The task source for the tasks mentioned in this specification is the sensor task source.

In the following example, we construct an accelerometer sensor and add event listeners to get events for sensor activation, error conditions, and notifications about newly available sensor readings. The example measures and logs the maximum total acceleration of the device hosting the sensor.

The event handler event types for the corresponding Sensor interface's event handler attributes are defined in the Event handlers section.

let acl = new Accelerometer({frequency: 30});
let max_magnitude = 0;
acl.addEventListener('activate', () => console.log('Ready to measure.'));
acl.addEventListener('error', error => console.log('Error: ' + error.name));
acl.addEventListener('reading', () => {
    let magnitude = Math.hypot(acl.x, acl.y, acl.z);
    if (magnitude > max_magnitude) {
        max_magnitude = magnitude;
        console.log(`Max magnitude: ${max_magnitude} m/s2`);
    }
});
acl.start();

8.1.1. Sensor lifecycle

[Figure: Sensor lifecycle state diagram. Constructing a Sensor object places it in the "idle" state. Calling start() transitions it from "idle" to "activating", and successful activation transitions "activating" to "activated". An error (onerror) returns it from either "activating" or "activated" to "idle"; stop() likewise returns "activated" to "idle".]

When a Sensor object is eligible for garbage collection, the user agent must invoke deactivate a sensor object with this object as argument.

8.1.2. Sensor internal slots

Instances of Sensor are created with the internal slots described in the following table:

Internal Slot | Description (non-normative)
[[state]] | The current state of the Sensor object, which is one of "idle", "activating", or "activated". It is initially "idle".
[[desiredSamplingFrequency]] | The requested sampling frequency. It is initially unset.
[[lastEventFiredAt]] | The high resolution timestamp of the latest sensor reading that was sent to observers of the Sensor object, expressed in milliseconds that passed since the time origin. It is initially null.
[[pendingReadingNotification]] | A boolean which indicates whether the observers need to be notified after a new sensor reading was reported. It is initially false.
[[identifyingParameters]] | A sensor type-specific group of dictionary members used to select the correct sensor to associate with this Sensor object.

8.1.3. Sensor.activated

The getter of the activated attribute must run these steps:

  1. If this.[[state]] is "activated", return true.

  2. Otherwise, return false.

8.1.4. Sensor.timestamp

The getter of the timestamp attribute returns the result of invoking get value from latest reading with this and "timestamp" as arguments.

8.1.5. Sensor.start()

The start() method must run these steps:

  1. Let sensor_state be the value of sensor_instance.[[state]].

  2. If sensor_state is either "activating" or "activated", then return.

  3. Set sensor_instance.[[state]] to "activating".

  4. Run these sub-steps in parallel:

    1. Let connected be the result of invoking connect to sensor.

    2. If connected is false, then

      1. Let e be the result of creating a "NotReadableError" DOMException.

      2. Queue a task to run notify error with e and sensor_instance as arguments.

      3. Return.

    3. Let permission_state be the result of invoking request sensor access with sensor_instance as argument.

    4. If permission_state is "granted",

      1. Invoke activate a sensor object with sensor_instance as argument.

    5. Otherwise, if permission_state is "denied",

      1. Let e be the result of creating a "NotAllowedError" DOMException.

      2. Queue a task to run notify error with e and sensor_instance as arguments.
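The two failure paths of the algorithm above surface asynchronously as 'error' events. A usage sketch (the sensor argument is any Sensor subclass instance; the error names come from the steps above):

```javascript
// Pure helper: map the DOMException names produced by start() to messages.
function describeSensorError(name) {
  switch (name) {
    case 'NotReadableError':
      return 'No matching sensor could be connected.';
    case 'NotAllowedError':
      return 'Permission to access the sensor was denied.';
    default:
      return 'Unexpected sensor error: ' + name;
  }
}

// Browser-only usage: both failure paths arrive via the 'error' event.
function startWithDiagnostics(sensor) {
  sensor.onerror = event => console.log(describeSensorError(event.error.name));
  sensor.start();
  // Calling start() again while "activating" or "activated" returns
  // immediately (step 2), so a repeated call is harmless.
  sensor.start();
}
```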

8.1.6. Sensor.stop()

The stop() method must run these steps:

  1. If sensor_instance.[[state]] is "idle", then return.

  2. Set sensor_instance.[[state]] to "idle".

  3. Run these sub-steps in parallel:

    1. Invoke deactivate a sensor object with sensor_instance as argument.

8.1.7. Sensor.onreading

onreading is an EventHandler which is called to notify that a new reading is available.

8.1.8. Sensor.onactivate

onactivate is an EventHandler which is called when this.[[state]] transitions from "activating" to "activated".

8.1.9. Sensor.onerror

onerror is an EventHandler which is called whenever an exception cannot be handled synchronously.

8.1.10. Event handlers

The following are the event handlers (and their corresponding event handler event types) that must be supported as attributes by the objects implementing the Sensor interface:

event handler | event handler event type
onreading reading
onactivate activate
onerror error

8.2. The SensorErrorEvent Interface

[Constructor(DOMString type, SensorErrorEventInit errorEventInitDict),
 SecureContext, Exposed=Window]
interface SensorErrorEvent : Event {
  readonly attribute Error error;
};

dictionary SensorErrorEventInit : EventInit {
  required Error error;
};

8.2.1. SensorErrorEvent.error

Gets the Error object passed to SensorErrorEventInit.
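A sketch of working with this interface, e.g. to exercise an application's error handler in tests (the pure helper is duck-typed, so it works on any event-shaped object):

```javascript
// Pure helper: extract the error name from an event-shaped object.
function errorName(event) {
  return event.error && event.error.name;
}

// Browser-only part: constructing a SensorErrorEvent by hand. The required
// 'error' member comes from SensorErrorEventInit.
function makeErrorEvent(message) {
  return new SensorErrorEvent('error', {
    error: new DOMException(message, 'NotReadableError'),
  });
}
```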

9. Abstract Operations

9.1. Construct sensor object

input

options, a SensorOptions object.

output

A Sensor object.

  1. If the current settings object is not a secure context, then:

    1. throw a SecurityError.

  2. If the browsing context is not a top-level browsing context, then:

    1. throw a SecurityError.

  3. Let sensor_instance be a new Sensor object.

  4. If options.frequency is present, then

    1. Set sensor_instance.[[desiredSamplingFrequency]] to options.frequency.

    Note: there is no guarantee that the requested options.frequency can be respected. The actual frequency can be calculated using the Sensor timestamp attributes.

  5. If identifying parameters in options are set, then:

    1. Set sensor_instance.[[identifyingParameters]] to identifying parameters.

  6. Set sensor_instance.[[state]] to "idle".

  7. Return sensor_instance.
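The steps above can be sketched as follows. This is a non-normative simulation: the env parameter is a hypothetical stand-in for the current settings object and browsing context, the returned plain object stands in for a real Sensor instance, and the identifying-parameters step is omitted because it is sensor-type specific.

```javascript
// Non-normative sketch of "construct sensor object".
function constructSensorObject(options = {}, env = {}) {
  // Steps 1-2: throw a SecurityError outside a secure, top-level context
  if (!env.isSecureContext || !env.isTopLevelBrowsingContext) {
    const err = new Error("construction not allowed in this context");
    err.name = "SecurityError";
    throw err;
  }
  // Step 3: a plain object stands in for the new Sensor instance
  const sensorInstance = { desiredSamplingFrequency: null, state: null };
  // Step 4: record the requested frequency; it is a hint, not a guarantee
  if (options.frequency !== undefined) {
    sensorInstance.desiredSamplingFrequency = options.frequency;
  }
  // Step 5 (identifying parameters) omitted: sensor-type specific.
  // Step 6
  sensorInstance.state = "idle";
  // Step 7
  return sensorInstance;
}
```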

9.2. Connect to sensor

input

sensor_instance, a Sensor object.

output

True if sensor_instance was associated with a sensor, false otherwise.

  1. If sensor_instance.[[identifyingParameters]] is set and sensor_instance.[[identifyingParameters]] allows a unique sensor to be identified, then:

    1. let sensor be that sensor,

    2. associate sensor_instance with sensor.

    3. Return true.

  2. If the sensor type of sensor_instance has an associated default sensor and there is a corresponding sensor on the device, then

    1. associate sensor_instance with default sensor.

    2. Return true.

  3. Return false.
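The lookup order above (unique match by identifying parameters first, then the type's default sensor) can be sketched as follows. The device object and its two lookup methods are hypothetical stand-ins for platform sensor discovery; they are not part of any API.

```javascript
// Non-normative sketch of "connect to sensor".
function connectToSensor(sensorInstance, device) {
  // Step 1: identifying parameters that pick out a unique sensor win
  if (sensorInstance.identifyingParameters) {
    const sensor = device.findUnique(sensorInstance.identifyingParameters);
    if (sensor) {
      sensorInstance.sensor = sensor; // associate sensor_instance with sensor
      return true;
    }
  }
  // Step 2: otherwise fall back to the sensor type's default sensor, if any
  const defaultSensor = device.defaultSensorFor(sensorInstance.type);
  if (defaultSensor) {
    sensorInstance.sensor = defaultSensor;
    return true;
  }
  // Step 3: no association could be made
  return false;
}
```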

9.3. Activate a sensor object

input

sensor_instance, a Sensor object.

output

None

  1. Let sensor be the sensor associated with sensor_instance.

  2. Append sensor_instance to sensor’s set of activated sensor objects.

  3. Invoke set sensor settings with sensor as argument.

  4. Queue a task to run notify activated state with sensor_instance as an argument.

9.4. Deactivate a sensor object

input

sensor_instance, a Sensor object.

output

None

  1. Remove all tasks associated with sensor_instance from the task queue associated with sensor task source.

  2. Let sensor be the sensor associated with sensor_instance.

  3. If sensor’s set of activated sensor objects contains sensor_instance,

    1. Remove sensor_instance from sensor’s set of activated sensor objects.

    2. Invoke set sensor settings with sensor as argument.

    3. Set sensor_instance.[[pendingReadingNotification]] to false.

    4. Set sensor_instance.[[lastEventFiredAt]] to null.

9.5. Revoke sensor permission

input

sensor, a sensor.

output

None

  1. let activated_sensors be sensor’s associated set of activated sensor objects.

  2. For each s of activated_sensors,

    1. Invoke deactivate a sensor object with s as argument.

    2. let e be the result of creating a "NotAllowedError" DOMException.

    3. Queue a task to run notify error with e and s as arguments.

9.6. Set sensor settings

input

sensor, a sensor.

output

None

  1. If sensor’s set of activated sensor objects is empty,

    1. Set current sampling frequency to null.

    2. For each key → value of latest reading,

      1. Set latest reading[key] to null.

    3. Update the user-agent-specific way in which sensor readings are obtained from sensor to no longer provide readings.

    4. Return.

  2. Set current sampling frequency to optimal sampling frequency.
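The tear-down/re-tune behavior above can be sketched as follows. This is a non-normative simulation; the spec leaves the choice of optimal sampling frequency to the sensor type and the user agent, so picking the highest requested frequency here is merely this sketch's assumption.

```javascript
// Non-normative sketch of "set sensor settings".
function setSensorSettings(sensor) {
  // Step 1: with no activated sensor objects, tear the sensor down
  if (sensor.activatedSensorObjects.size === 0) {
    sensor.currentSamplingFrequency = null;
    for (const key of Object.keys(sensor.latestReading)) {
      sensor.latestReading[key] = null; // clear every entry of latest reading
    }
    sensor.providingReadings = false; // stop obtaining readings
    return;
  }
  // Step 2: otherwise pick the "optimal" sampling frequency — here, the
  // highest frequency requested by any activated sensor object (an
  // assumption; the spec does not mandate this policy)
  sensor.currentSamplingFrequency = Math.max(
    ...[...sensor.activatedSensorObjects].map(s => s.desiredSamplingFrequency || 0)
  );
}
```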

9.7. Update latest reading

input

sensor, a sensor.

reading, a sensor reading.

The timestamp needs to be specified more precisely, see issue #155.

output

None

  1. If the result of invoking the security check is "insecure", then return.

    Issue #223 on GitHub: “Should a "suspended" state be added”

    Sensors would be in this state when the security check would return "insecure".

    Additionally, or alternatively, a "suspend" event could be fired in such cases.

    Follow-up questions:

    • What should the "activated" attribute getter return in such cases?
    • Do we need a "state" or "status" attribute instead?
    • Should an "activate" event be fired once the security check returns "secure" again?
    • Etc., etc.
  2. For each key → value of latest reading,

    1. Set latest reading[key] to the corresponding value of reading.

    Maybe compare value with corresponding value of reading to see if there’s a change that needs to be propagated.

  3. let activated_sensors be sensor’s associated set of activated sensor objects.

  4. Run these sub-steps in parallel:

    1. For each s in activated_sensors,

      1. Invoke report latest reading updated with s as an argument.

9.8. Security check

input

None

output

A string whose value is either "secure" or "insecure".

  1. Let document be the top-level browsing context's active document.

  2. Let current_visibility_state be the result of running the steps to determine the visibility state of document.

  3. If current_visibility_state is not "visible", then return "insecure".

  4. If the currently focused area of the current top-level browsing context is a nested browsing context whose active document's origin is not same origin-domain as document’s origin, then return "insecure".

  5. Let has_focus be the result of running the has focus steps with document as argument.

  6. If has_focus is false, then return "insecure".

  7. If the user agent itself has lost focus, then return "insecure".

    Issue #2716 on GitHub: “Focus state is unspecified when user agent itself loses focus.”

    This makes it difficult to normatively stop certain behaviors (such as collecting sensor readings) when the app is unfocused but stays visible.

    This can notably allow PIN skimming attacks using sensors.

  8. Return "secure".

Note: user agents are encouraged to stop sensor sampling when security check would return "insecure" in order to reduce resource consumption, notably battery usage.
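The checks above compose into a simple decision function. The following non-normative sketch bundles the document state the algorithm consults into a hypothetical env object (its property names are invented for illustration).

```javascript
// Non-normative sketch of the security check.
function securityCheck(env) {
  if (env.visibilityState !== "visible") return "insecure";  // steps 1-3
  if (env.focusedAreaIsCrossOriginFrame) return "insecure";  // step 4
  if (!env.hasFocus) return "insecure";                      // steps 5-6
  if (env.userAgentLostFocus) return "insecure";             // step 7
  return "secure";                                           // step 8
}
```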

9.9. Find the reporting frequency of a sensor object

input

sensor_instance, a Sensor object.

output

A frequency.

  1. Let frequency be null.

  2. Let f be sensor_instance.[[desiredSamplingFrequency]].

    1. If f is set,

      1. Set frequency to f capped by the upper and lower current sampling frequency bounds for the associated sensor.

    2. Otherwise,

      1. The user agent can assign frequency to an appropriate value.

  3. Return frequency.
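The capping step is a plain clamp. In the following non-normative sketch, the bounds parameter is a hypothetical stand-in for the sensor's sampling frequency bounds, and defaulting to the upper bound when no frequency was requested is just this sketch's choice (the spec leaves that value to the user agent).

```javascript
// Non-normative sketch of "find the reporting frequency of a sensor object".
function findReportingFrequency(sensorInstance, bounds) {
  let frequency = null;
  const f = sensorInstance.desiredSamplingFrequency;
  if (f != null) {
    // Cap f by the upper and lower sampling frequency bounds
    frequency = Math.min(Math.max(f, bounds.min), bounds.max);
  } else {
    // Otherwise the user agent may pick any appropriate value;
    // using the upper bound here is an assumption of this sketch
    frequency = bounds.max;
  }
  return frequency;
}
```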

9.10. Report latest reading updated

input

sensor_instance, a Sensor object.

output

None

  1. If sensor_instance.[[pendingReadingNotification]] is true,

    1. Return.

  2. Set sensor_instance.[[pendingReadingNotification]] to true.

  3. Let lastReportedTimestamp be the value of sensor_instance.[[lastEventFiredAt]].

  4. If lastReportedTimestamp is not set

    1. Queue a task to run notify new reading with sensor_instance as an argument.

    2. Return.

  5. Let reportingFrequency be the result of invoking find the reporting frequency of a sensor object with sensor_instance as argument.

  6. If reportingFrequency is null

    1. Queue a task to run notify new reading with sensor_instance as an argument.

    2. Return.

  7. Let reportingInterval be the result of 1 / reportingFrequency.

  8. Let timestampDelta be the result of latest reading["timestamp"] - lastReportedTimestamp.

  9. If timestampDelta is greater than or equal to reportingInterval

    1. Queue a task to run notify new reading with sensor_instance as an argument.

    2. Return.

  10. Let deferUpdateTime be the result of reportingInterval - timestampDelta.

  11. Spin the event loop for a period of time equal to deferUpdateTime.

  12. If sensor_instance.[[pendingReadingNotification]] is true,

    1. Queue a task to run notify new reading with sensor_instance as an argument.
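The rate-limiting logic above reduces to a decision: notify immediately, defer by the remainder of the reporting interval, or do nothing because a notification is already pending. The following non-normative sketch returns that decision rather than queuing tasks or spinning the event loop; timestamps and delays are assumed to share the same unit.

```javascript
// Non-normative sketch of "report latest reading updated".
// Returns { action: "none" }, { action: "notify" }, or
// { action: "defer", delay } per the steps above.
function reportLatestReadingUpdated(sensorInstance, latestTimestamp, reportingFrequency) {
  if (sensorInstance.pendingReadingNotification) return { action: "none" }; // step 1
  sensorInstance.pendingReadingNotification = true;                         // step 2
  const last = sensorInstance.lastEventFiredAt;                             // step 3
  if (last == null) return { action: "notify" };                            // step 4
  if (reportingFrequency == null) return { action: "notify" };              // steps 5-6
  const reportingInterval = 1 / reportingFrequency;                         // step 7
  const timestampDelta = latestTimestamp - last;                            // step 8
  if (timestampDelta >= reportingInterval) return { action: "notify" };     // step 9
  // Steps 10-12: too soon — wait out the rest of the interval first
  return { action: "defer", delay: reportingInterval - timestampDelta };
}
```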

9.11. Notify new reading

input

sensor_instance, a Sensor object.

output

None

  1. Set sensor_instance.[[pendingReadingNotification]] to false.

  2. Set sensor_instance.[[lastEventFiredAt]] to latest reading["timestamp"].

  3. Fire an event named "reading" at sensor_instance.

9.12. Notify activated state

input

sensor_instance, a Sensor object.

output

None

  1. Set sensor_instance.[[state]] to "activated".

  2. Fire an event named "activate" at sensor_instance.

  3. Let sensor be the sensor associated with sensor_instance.

  4. If sensor’s latest reading["timestamp"] is not null,

    1. Queue a task to run notify new reading with sensor_instance as an argument.

9.13. Notify error

input

sensor_instance, a Sensor object.

error, an exception.

output

None

  1. Set sensor_instance.[[state]] to "idle".

  2. Fire an event named "error" at sensor_instance using SensorErrorEvent with its error attribute initialized to error.

9.14. Get value from latest reading

input

sensor_instance, a Sensor object.

key, a string representing the name of the value.

output

A sensor reading value or null.

  1. If sensor_instance.[[state]] is "activated",

    1. Let readings be the latest reading of sensor_instance’s related sensor.

    2. Return readings[key].

  2. Otherwise, return null.
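This operation is what concrete sensor attribute getters (such as a hypothetical illuminance getter) delegate to; it can be sketched as follows, with plain objects standing in for the Sensor object and its related sensor.

```javascript
// Non-normative sketch of "get value from latest reading".
function getValueFromLatestReading(sensorInstance, key) {
  // Only an activated sensor object exposes readings
  if (sensorInstance.state === "activated") {
    const readings = sensorInstance.sensor.latestReading;
    return readings[key];
  }
  // Idle or activating sensor objects report null
  return null;
}
```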

9.15. Request sensor access

input

sensor_instance, a Sensor object.

output

A permission state.

  1. Let sensor be the sensor associated with sensor_instance.

  2. Let permission_name be the PermissionName associated with sensor.

  3. Let state be the result of requesting permission to use permission_name.

  4. Return state.

10. Extensibility

This section is non-normative.

Its purpose is to describe how this specification can be extended to specify APIs for different sensor types.

Extension specifications are encouraged to focus on a single sensor type, exposing both high-level and low-level interfaces as appropriate.

10.1. Security

All interfaces defined by extension specifications should only be available within a secure context.

10.2. Naming

Sensor interfaces for low-level sensors should be named after their associated sensor. For example, the interface associated with a gyroscope should simply be named Gyroscope. Sensor interfaces for high-level sensors should be named by combining the physical quantity the sensor measures with the "Sensor" suffix. For example, the interface associated with a sensor that measures the distance to a nearby object may be named ProximitySensor.

Attributes of the Sensor subclass that hold sensor reading values should be named after the full name of those values. For example, the Thermometer interface should hold the sensor reading's value in a temperature attribute (and not a value or temp attribute). A good starting point for naming is the Quantities, Units, Dimensions and Data Types Ontologies [QUDT].

10.3. Unit

Extension specifications must specify the unit of sensor readings.

As per the Technical Architecture Group’s (TAG) API Design Principles [API-DESIGN-PRINCIPLES], all time measurements should be in milliseconds. All other units should be specified using, in order of preference, the International System of Units (SI), SI derived units, and non-SI units accepted for use with the SI, as described in the SI Brochure [SI], with the exception of temperature, for which Celsius should be favored over Kelvin.

10.4. Exposing High-Level vs. Low-Level Sensors

So far, specifications exposing sensors to the Web platform have focused on high-level sensors APIs. [GEOLOCATION-API] [ORIENTATION-EVENT]

This was a reasonable approach for a number of reasons. Indeed, high-level sensors:

However, an increasing number of use cases such as virtual and augmented reality require low-level access to sensors, most notably for performance reasons.

Providing low-level access enables Web application developers to leverage domain-specific constraints and design more performant systems.

Following the precepts of the Extensible Web Manifesto [EXTENNNNSIBLE], extension specifications should focus primarily on exposing low-level sensor APIs, but should also expose high-level APIs when there are clear benefits in doing so.

10.5. When is Enabling Multiple Sensors of the Same Type Not the Right Choice?

TODO: provide guidance on when to:

10.6. Definition Requirements

The following definitions must be specified for each sensor type in extension specifications:

An extension specification may specify the following definitions for each sensor type:

10.7. Extending the Permissions API

Provide guidance on how to extend the Permissions API [PERMISSIONS] for each sensor type.

10.8. Example WebIDL

Here’s example WebIDL for a possible extension of this specification for proximity sensors.

[Constructor(optional ProximitySensorOptions proximitySensorOptions),
 SecureContext, Exposed=Window]
interface ProximitySensor : Sensor {
    readonly attribute unrestricted double distance;
};

dictionary ProximitySensorOptions : SensorOptions {
    double? min = -Infinity;
    double? max = Infinity;
    ProximitySensorPosition? position;
    ProximitySensorDirection? direction;
};

enum ProximitySensorPosition {
    "top-left",
    "top",
    "top-right",
    "middle-left",
    "middle",
    "middle-right",
    "bottom-left",
    "bottom",
    "bottom-right"
};

enum ProximitySensorDirection {
    "front",
    "rear",
    "left",
    "right",
    "top",
    "bottom"
};
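Usage of such an extension might look like the following sketch. The minimal ProximitySensor mock below exists only so the snippet is self-contained; in a browser, the real class would come from the (hypothetical) extension specification, and the _simulateReading hook is not part of any API.

```javascript
// Mock of the hypothetical ProximitySensor, for illustration only.
class ProximitySensor extends EventTarget {
  constructor(options = {}) {
    super();
    this.distance = null; // populated when a reading arrives
    this.frequency = options.frequency;
    this.direction = options.direction;
  }
  start() { /* a real implementation would begin sampling here */ }
  // Test-only hook: simulate a new reading arriving from the platform
  _simulateReading(distance) {
    this.distance = distance;
    this.dispatchEvent(new Event("reading"));
  }
}

// Typical consumer code: construct, listen for readings, start sampling
const sensor = new ProximitySensor({ frequency: 10, direction: "front" });
let lastDistance = null;
sensor.addEventListener("reading", () => { lastDistance = sensor.distance; });
sensor.start();
sensor._simulateReading(0.25);
```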

11. Acknowledgements

First and foremost, I would like to thank Anssi Kostiainen for his continuous and dedicated support and input throughout the development of this specification, as well as Mikhail Pozdnyakov, Alexander Shalamov, Rijubrata Bhaumik, and Kenneth Rohde Christiansen for their invaluable implementation feedback, suggestions, and research that have helped inform the specification work.

Special thanks to Rick Waldron for driving the discussion around a generic sensor API design for the Web, sketching the original API on which this is based, providing implementation feedback from his work on Johnny-Five, and continuous input during the development of this specification.

Special thanks to Boris Smus, Tim Volodine, and Rich Tibbett for their initial work on exposing sensors to the web with consistency.

Thanks to Anne van Kesteren for his tireless help both in person and through IRC.

Thanks to Domenic Denicola and Jake Archibald for their help.

Thanks also to Frederick Hirsch and Dominique Hazaël-Massieux (via the HTML5Apps project) for both their administrative help and technical input.

Thanks to Tab Atkins for making Bikeshed and taking the time to explain its subtleties.

Thanks to Lukasz Olejnik and Maryam Mehrnezhad for their contributions around privacy and security.

The following people have greatly contributed to this specification through extensive discussions on GitHub: Anssi Kostiainen, Boris Smus, chaals, Claes Nilsson, Dave Raggett, David Mark Clements, Domenic Denicola, Dominique Hazaël-Massieux (via the HTML5Apps project), Francesco Iovine, Frederick Hirsch, gmandyam, Jafar Husain, Johannes Hund, Kris Kowal, Lukasz Olejnik, Marcos Caceres, Marijn Kruisselbrink, Mark Foltz, Mats Wichmann, Matthew Podwysocki, pablochacin, Remy Sharp, Rich Tibbett, Rick Waldron, Rijubrata Bhaumik, robman, Sean T. McBeth, smaug----, Tab Atkins Jr., Virginie Galindo, zenparsing, and Zoltan Kis.

We’d also like to thank Anssi Kostiainen, Dominique Hazaël-Massieux, Erik Wilde, and Michael[tm] Smith for their editorial input.

Conformance

Document conventions

Conformance requirements are expressed with a combination of descriptive assertions and RFC 2119 terminology. The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this document are to be interpreted as described in RFC 2119. However, for readability, these words do not appear in all uppercase letters in this specification.

All of the text of this specification is normative except sections explicitly marked as non-normative, examples, and notes. [RFC2119]

Examples in this specification are introduced with the words "for example" or are set apart from the normative text with class="example", like this:

This is an example of an informative example.

Because this document doesn’t itself define APIs for specific sensor types (that is the role of extensions to this specification), all examples are inevitably (wishful) fabrications. Although all of the sensors used as examples would be great candidates for building atop the Generic Sensor API, their inclusion in this document does not imply that the relevant Working Groups are planning to do so.

Informative notes begin with the word "Note" and are set apart from the normative text with class="note", like this:

Note, this is an informative note.

Conformant Algorithms

Requirements phrased in the imperative as part of algorithms (such as "strip any leading space characters" or "return false") are to be interpreted with the meaning of the key word ("must", "should", "may", etc) used in introducing the algorithm.

Conformance requirements phrased as algorithms or specific steps can be implemented in any manner, so long as the end result is equivalent. In particular, the algorithms defined in this specification are intended to be easy to understand and are not intended to be performant. Implementers are encouraged to optimize.

Conformance Classes

A conformant user agent must implement all the requirements listed in this specification that are applicable to user agents.

Index

Terms defined by this specification

Terms defined by reference

References

Normative References

[DOM]
Anne van Kesteren. DOM Standard. Living Standard. URL: https://dom.spec.whatwg.org/
[HR-TIME-2]
Ilya Grigorik; James Simonsen; Jatinder Mann. High Resolution Time Level 2. URL: https://w3c.github.io/hr-time/
[HTML]
Anne van Kesteren; et al. HTML Standard. Living Standard. URL: https://html.spec.whatwg.org/multipage/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[PAGE-VISIBILITY]
Jatinder Mann; Arvind Jain. Page Visibility (Second Edition). 29 October 2013. REC. URL: https://www.w3.org/TR/page-visibility/
[PERMISSIONS]
Mounir Lamouri; Marcos Caceres. The Permissions API. URL: https://w3c.github.io/permissions/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://tools.ietf.org/html/rfc2119
[WebIDL]
Cameron McCormack; Boris Zbarsky; Tobie Langel. Web IDL. URL: https://heycam.github.io/webidl/

Informative References

[API-DESIGN-PRINCIPLES]
Domenic Denicola. API Design Principles. 29 December 2015. URL: https://w3ctag.github.io/design-principles/
[EXTENNNNSIBLE]
The Extensible Web Manifesto. 10 June 2013. URL: https://extensiblewebmanifesto.org/
[FEATURE-POLICY]
Feature Policy. Living Standard. URL: https://wicg.github.io/feature-policy/
[GEOLOCATION-API]
Andrei Popescu. Geolocation API Specification 2nd Edition. URL: http://dev.w3.org/geo/api/spec-source.html
[GYROSPEECHRECOGNITION]
Michalevsky, Y., Boneh, D. and Nakibly, G.. Gyrophone: Recognizing Speech from Gyroscope Signals. 2014. Informational. URL: https://www.usenix.org/system/files/conference/usenixsecurity14/sec14-paper-michalevsky.pdf
[MOBILESENSORS]
Manish J. Gajjar. Mobile Sensors and Context-Aware Computing. 2017. Informational.
[ORIENTATION-EVENT]
Rich Tibbett; et al. DeviceOrientation Event Specification. URL: https://w3c.github.io/deviceorientation/spec-source-orientation.html
[POWERFUL-FEATURES]
Mike West. Secure Contexts. URL: https://w3c.github.io/webappsec-secure-contexts/
[QUDT]
Ralph Hodgson; et al. QUDT - Quantities, Units, Dimensions and Data Types Ontologies. 18 March 2014. URL: http://www.qudt.org/
[RFC6454]
A. Barth. The Web Origin Concept. December 2011. Proposed Standard. URL: https://tools.ietf.org/html/rfc6454
[SI]
SI Brochure: The International System of Units (SI), 8th edition. 2014. URL: http://www.bipm.org/en/publications/si-brochure/
[STEALINGPINSVIASENSORS]
Maryam Mehrnezhad, Ehsan Toreini, Siamak F. Shahandashti, Feng Hao. Stealing PINs via mobile sensors: actual risk versus user perception. 2017. Informational. URL: https://rd.springer.com/article/10.1007/s10207-017-0369-x?wt_mc=Internal.Event.1.SEM.ArticleAuthorOnlineFirst

IDL Index

[SecureContext, Exposed=Window]
interface Sensor : EventTarget {
  readonly attribute boolean activated;
  readonly attribute DOMHighResTimeStamp? timestamp;
  void start();
  void stop();
  attribute EventHandler onreading;
  attribute EventHandler onactivate;
  attribute EventHandler onerror;
};

dictionary SensorOptions {
  double? frequency;
};

[Constructor(DOMString type, SensorErrorEventInit errorEventInitDict),
 SecureContext, Exposed=Window]
interface SensorErrorEvent : Event {
  readonly attribute Error error;
};

dictionary SensorErrorEventInit : EventInit {
  required Error error;
};