W3C Candidate Recommendation Snapshot
Copyright © 2024 World Wide Web Consortium. W3C® liability, trademark and permissive document license rules apply.
This specification describes Data Integrity cryptosuites for use when generating a digital signature using the Elliptic Curve Digital Signature Algorithm (ECDSA).
This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
The Working Group is actively seeking implementation feedback for this specification. In order to exit the Candidate Recommendation phase, the Working Group has set the requirement of at least two independent implementations for each mandatory feature in the specification. For details on the conformance testing process, see the test suites listed in the implementation report.
Any feature with less than two independent implementations in the ECDSA Cryptosuite Implementation Report is an "at risk" feature and might be removed before the transition to W3C Proposed Recommendation.
This document was published by the Verifiable Credentials Working Group as a Candidate Recommendation Snapshot using the Recommendation track.
Publication as a Candidate Recommendation does not imply endorsement by W3C and its Members. A Candidate Recommendation Snapshot has received wide review, is intended to gather implementation experience, and has commitments from Working Group members to royalty-free licensing for implementations.
This Candidate Recommendation is not expected to advance to Proposed Recommendation any earlier than 19 January 2025.
This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.
This document is governed by the 03 November 2023 W3C Process Document.
This specification defines a cryptographic suite for the purpose of creating and verifying proofs for ECDSA signatures in conformance with the Data Integrity [VC-DATA-INTEGRITY] specification. ECDSA signatures are specified in [FIPS-186-5] with elliptic curves P-256 and P-384 specified in [NIST-SP-800-186]. [FIPS-186-5] includes the deterministic ECDSA algorithm, which is also specified in [RFC6979].
The elliptic curves P-256 and P-384 of [NIST-SP-800-186] are respectively referred to as secp256r1 and secp384r1 in [SECG2]. This notation is sometimes also used by ECDSA software libraries.
Developers are cautioned not to confuse secp256r1 terms with secp256k1 terms; the latter refer to a third, different elliptic curve that is not used by this specification. ECDSA software libraries might not implement all of these curves, so developers need to take care when choosing an ECDSA software library for their implementation.
This specification uses either the RDF Dataset Canonicalization Algorithm [RDF-CANON] or the JSON Canonicalization Scheme [RFC8785] to transform the input document into its canonical form. It uses one of two mechanisms to digest and sign: SHA-256 [RFC6234] as the message digest algorithm and ECDSA with Curve P-256 as the signature algorithm, or SHA-384 [RFC6234] as the message digest algorithm and ECDSA with Curve P-384 as the signature algorithm.
Terminology used throughout this document is defined in the Terminology section of the Verifiable Credential Data Integrity 1.0 specification.
As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.
The key words MAY, MUST, MUST NOT, and SHOULD in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections 2. Data Model and 3. Algorithms of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.
This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (//) explaining certain portions and ellipses (...) indicating the omission of information that is irrelevant to the example. Such parts need to be removed if implementers want to treat the examples as valid JSON or JSON-LD.
The following sections outline the data model that is used by this specification to express verification methods, such as cryptographic public keys, and data integrity proofs, such as digital signatures.
These verification methods are used to verify Data Integrity Proofs [VC-DATA-INTEGRITY] produced using Elliptic Curve cryptographic key material that is compliant with [FIPS-186-5]. The encoding formats for these key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used during the processing of digital signatures.
The Multikey format, defined in Controlled Identifier Document 1.0, is used to express public keys for the cryptographic suites defined in this specification.
The publicKeyMultibase property represents a Multibase-encoded Multikey expression of a P-256 or P-384 public key.

The publicKeyMultibase value of the verification method MUST start with the base-58-btc prefix (z), as defined in the Multibase section of Controlled Identifier Document 1.0. A Multibase-encoded ECDSA 256-bit public key value or an ECDSA 384-bit public key value follows, as defined in the Multikey section of Controlled Identifier Document 1.0. Any other encoding MUST NOT be allowed.

Developers are advised to not accidentally publish a representation of a private key. Implementations of this specification will raise errors in the event of a Multicodec value other than 0x1200 or 0x1201 being used in a publicKeyMultibase value.
{ "id": "https://example.com/issuer/123#key-0", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "zDnaerx9CtbPJ1q36T5Ln5wYt3MQYeGRG5ehnPAmxcf5mDZpv" }
{ "id": "https://example.com/issuer/123#key-0", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z82LkvCwHNreneWpsgPEbV3gu1C6NFJEBg4srfJ5gdxEsMGRJ Uz2sG9FE42shbn2xkZJh54" }
{ "@context": [ "https://www.w3.org/ns/did/v1", "https://w3id.org/security/multikey/v1" ], "id": "did:example:123", "verificationMethod": [{ "id": "https://example.com/issuer/123#key-1", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "zDnaerx9CtbPJ1q36T5Ln5wYt3MQYeGRG5ehnPAmxcf5mDZpv" }, { "id": "https://example.com/issuer/123#key-2", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z82LkvCwHNreneWpsgPEbV3gu1C6NFJEBg4srfJ5gdxEsMGRJ Uz2sG9FE42shbn2xkZJh54" }], "authentication": [ "did:example:123#key-1" ], "assertionMethod": [ "did:example:123#key-2" ], "capabilityDelegation": [ "did:example:123#key-2" ], "capabilityInvocation": [ "did:example:123#key-2" ] }
The secretKeyMultibase property represents a Multibase-encoded Multikey expression of a P-256 or P-384 secret key (also sometimes referred to as a private key).

The secretKeyMultibase value of the verification method MUST start with the base-58-btc prefix (z), as defined in the Multibase section of Controlled Identifier Document 1.0. A Multibase-encoded ECDSA 256-bit secret key value or an ECDSA 384-bit secret key value follows, as defined in the Multikey section of Controlled Identifier Document 1.0. Any other encoding MUST NOT be allowed.

Developers are advised to prevent accidental publication of a representation of a secret key, and to not export the secretKeyMultibase property by default when serializing key pairs as a Multikey.
This section details the proof representation formats that are defined by this specification.
A proof contains the attributes specified in the Proofs section of [VC-DATA-INTEGRITY] with the following restrictions.
The type property MUST be DataIntegrityProof.

The cryptosuite property MUST be ecdsa-rdfc-2019, ecdsa-jcs-2019, or ecdsa-sd-2023.

The value of the proofValue property is produced according to the cryptosuite type and is specified in Section 3.2.1 Create Proof (ecdsa-rdfc-2019), Section 3.3.1 Create Proof (ecdsa-jcs-2019), Section 3.6.1 Create Base Proof (ecdsa-sd-2023), or Section 3.6.6 Add Derived Proof (ecdsa-sd-2023).
{ "@context": [ {"myWebsite": "https://vocabulary.example/myWebsite"}, "https://www.w3.org/ns/credentials/v2" ], "myWebsite": "https://hello.world.example/", "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#zDnaepBuvsQ8cpsWrVKw8 fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "z2iAR3F2Sk3mWfYyrinKzSQpSbvfxnz9kkv7roxxumB5RZDP9JUw5QAXuchUd huiwE18hyyZTjiEreKmhH3oj9Q8" } }
The following section describes multiple Data Integrity cryptographic suites that utilize the Elliptic Curve Digital Signature Algorithm (ECDSA) [FIPS-186-5]. When generating ECDSA signatures, the signature value MUST be expressed according to section 7 of [RFC4754] (sometimes referred to as the IEEE P1363 format) and encoded according to the specific cryptosuite proof generation algorithm. All ECDSA signatures SHOULD use the deterministic variant of the algorithm defined in [FIPS-186-5].
Implementations SHOULD fetch and cache verification method information as early as possible when adding or verifying proofs. Parameters passed to functions in this section use information from the verification method, such as the public key size, to determine function parameters, such as the cryptographic hashing algorithm.
When the RDF Dataset Canonicalization Algorithm (RDFC-1.0) [RDF-CANON] is used with ECDSA algorithms, the cryptographic hashing function used by RDFC-1.0 is chosen based on the size of the associated public key. For P-256 keys, the default hashing function, SHA-2 with 256 bits of output, MUST be used. For P-384 keys, SHA-2 with 384 bits of output MUST be used, specified via the RDFC-1.0 implementation-specific parameter.
When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used, implementations of that algorithm will detect dataset poisoning by default, and abort processing upon detection.
This algorithm is used to configure a cryptographic suite to be used by the Add Proof and Verify Proof functions in Verifiable Credential Data Integrity 1.0. The algorithm takes an options object (map options) as input and returns a cryptosuite instance (struct cryptosuite).
1. Initialize cryptosuite to an empty struct.
2. If options.type does not equal the string DataIntegrityProof, return cryptosuite.
3. If options.cryptosuite is ecdsa-rdfc-2019, then: set cryptosuite.createProof to the algorithm in Section 3.2.1 Create Proof (ecdsa-rdfc-2019), and set cryptosuite.verifyProof to the algorithm in Section 3.2.2 Verify Proof (ecdsa-rdfc-2019).
4. If options.cryptosuite is ecdsa-jcs-2019, then: set cryptosuite.createProof to the algorithm in Section 3.3.1 Create Proof (ecdsa-jcs-2019), and set cryptosuite.verifyProof to the algorithm in Section 3.3.2 Verify Proof (ecdsa-jcs-2019).
5. If options.cryptosuite is ecdsa-sd-2023, then: set cryptosuite.createProof to the algorithm in Section 3.6.1 Create Base Proof (ecdsa-sd-2023), and set cryptosuite.verifyProof to the algorithm in Section 3.6.7 Verify Derived Proof (ecdsa-sd-2023).
6. Return cryptosuite.
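A minimal TypeScript sketch of this instantiation logic follows; the per-suite createProof and verifyProof functions are assumed to be implemented elsewhere and are only declared here.

// Sketch only: wire up a cryptosuite instance based on the options map.
type Cryptosuite = {
  createProof?: (unsecuredDocument: object, options: object) => object;
  verifyProof?: (securedDocument: object) => object;
};

// Assumed implementations of the per-suite algorithms (declarations only).
declare function createProofRdfc(doc: object, options: object): object;
declare function verifyProofRdfc(doc: object): object;
declare function createProofJcs(doc: object, options: object): object;
declare function verifyProofJcs(doc: object): object;
declare function createBaseProofSd(doc: object, options: object): object;
declare function verifyDerivedProofSd(doc: object): object;

function createCryptosuite(options: { type?: string; cryptosuite?: string }): Cryptosuite {
  const cryptosuite: Cryptosuite = {};
  if (options.type !== "DataIntegrityProof") return cryptosuite;
  if (options.cryptosuite === "ecdsa-rdfc-2019") {
    cryptosuite.createProof = createProofRdfc;
    cryptosuite.verifyProof = verifyProofRdfc;
  } else if (options.cryptosuite === "ecdsa-jcs-2019") {
    cryptosuite.createProof = createProofJcs;
    cryptosuite.verifyProof = verifyProofJcs;
  } else if (options.cryptosuite === "ecdsa-sd-2023") {
    cryptosuite.createProof = createBaseProofSd;
    cryptosuite.verifyProof = verifyDerivedProofSd;
  }
  return cryptosuite;
}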
The ecdsa-rdfc-2019 cryptographic suite takes an input document, canonicalizes the document using the RDF Dataset Canonicalization Algorithm [RDF-CANON], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
The following algorithm specifies how to create a data integrity proof given an unsecured data document. Required inputs are an unsecured data document (map unsecuredDocument), and a set of proof options (map options). A data integrity proof (map), or an error, is produced as output.
The following algorithm specifies how to verify a data integrity proof given a secured data document. Required inputs are a secured data document (map securedDocument). This algorithm returns a verification result, which is a struct whose items are:
verified, true or false

verifiedDocument, Null if verified is false; otherwise, an unsecured data document

1. Let unsecuredDocument be a copy of securedDocument with the proof value removed.
2. Let proofOptions be the result of a copy of securedDocument.proof with proofValue removed.
3. Let proofBytes be the Multibase decoded base-58-btc value in securedDocument.proof.proofValue.
4. Let transformedData be the result of running the algorithm in Section 3.2.3 Transformation (ecdsa-rdfc-2019) with unsecuredDocument and proofOptions passed as parameters.
5. Let proofConfig be the result of running the algorithm in Section 3.2.5 Proof Configuration (ecdsa-rdfc-2019) with proofOptions passed as the parameter.
6. Let hashData be the result of running the algorithm in Section 3.2.4 Hashing (ecdsa-rdfc-2019) with transformedData and proofConfig passed as parameters.
7. Let verified be the result of running the algorithm in Section 3.2.7 Proof Verification (ecdsa-rdfc-2019) with hashData, proofBytes, and proofOptions passed as parameters.
8. Return a verification result with item verified, and with item verifiedDocument set to unsecuredDocument if verified is true; otherwise, Null.

The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.2.4 Hashing (ecdsa-rdfc-2019).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
1. If options.type is not set to the string DataIntegrityProof or options.cryptosuite is not set to the string ecdsa-rdfc-2019, an error MUST be raised and SHOULD convey an error type of PROOF_TRANSFORMATION_ERROR.
2. Let canonicalDocument be the result of converting unsecuredDocument to RDF statements, applying the RDF Dataset Canonicalization Algorithm [RDF-CANON] to the result, and then serializing the result to a serialized canonical form.
3. Return canonicalDocument as the transformed data document.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.2.6 Proof Serialization (ecdsa-rdfc-2019) or Section 3.2.7 Proof Verification (ecdsa-rdfc-2019). The hash algorithm has to match the security level of the curve used; that is, SHA-256 is used with curve P-256 and SHA-384 with curve P-384.
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A single hash data value represented as series of bytes is produced as output.
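As a non-normative illustration, the following TypeScript sketch (assuming a Node.js environment) computes the hash data as the proof configuration hash followed by the transformed document hash, the ordering used by the test vectors in this specification.

import { createHash } from "node:crypto";

// Sketch only: hashData = hash(canonicalProofConfig) || hash(transformedDocument),
// using SHA-256 for P-256 keys and SHA-384 for P-384 keys.
function hashRdfc2019(
  transformedDocument: string,
  canonicalProofConfig: string,
  curve: "P-256" | "P-384"
): Buffer {
  const algorithm = curve === "P-256" ? "sha256" : "sha384";
  const proofConfigHash = createHash(algorithm).update(canonicalProofConfig, "utf8").digest();
  const transformedDocumentHash = createHash(algorithm).update(transformedDocument, "utf8").digest();
  return Buffer.concat([proofConfigHash, transformedDocumentHash]);
}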
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
If proofConfig.type is not set to DataIntegrityProof and/or proofConfig.cryptosuite is not set to ecdsa-rdfc-2019, an error MUST be raised and SHOULD convey an error type of PROOF_GENERATION_ERROR.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes) and proof options (options). A verification result represented as a boolean value is produced as output.
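For illustration, a Node.js TypeScript sketch of these two operations for P-256 follows; the dsaEncoding value 'ieee-p1363' produces and consumes the raw r||s signature form required by this section. Note that Node's built-in signer uses a random k rather than the deterministic variant of [RFC6979].

import { sign, verify, type KeyObject } from "node:crypto";

// Sketch only (P-256): ECDSA internally hashes its input, so hashData (the
// concatenated digests from the hashing step) is passed as the message.
function serializeProofP256(hashData: Buffer, privateKey: KeyObject): Buffer {
  return sign("sha256", hashData, { key: privateKey, dsaEncoding: "ieee-p1363" });
}

function verifyProofP256(hashData: Buffer, proofBytes: Buffer, publicKey: KeyObject): boolean {
  return verify("sha256", hashData, { key: publicKey, dsaEncoding: "ieee-p1363" }, proofBytes);
}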
The ecdsa-jcs-2019 cryptographic suite takes an input document, canonicalizes the document using the JSON Canonicalization Scheme [RFC8785], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
The following algorithm specifies how to create a data integrity proof given an unsecured data document. Required inputs are an unsecured data document (map unsecuredDocument), and a set of proof options (map options). A data integrity proof (map), or an error, is produced as output.
The following algorithm specifies how to verify a data integrity proof given a secured data document. Required inputs are a secured data document (map securedDocument). This algorithm returns a verification result, which is a struct whose items are:

verified, true or false

verifiedDocument, if verified is true, an unsecured data document; otherwise Null

1. Let unsecuredDocument be a copy of securedDocument with the proof value removed.
2. Let proofOptions be the result of a copy of securedDocument.proof with proofValue removed.
3. Let proofBytes be the Multibase decoded base-58-btc value in securedDocument.proof.proofValue.
4. If proofOptions.@context exists: check that securedDocument.@context starts with all values contained in proofOptions.@context, in the same order; if it does not, set verified to false and skip to the last step. Otherwise, set unsecuredDocument.@context equal to proofOptions.@context.
5. Let transformedData be the result of running the algorithm in Section 3.3.3 Transformation (ecdsa-jcs-2019) with unsecuredDocument and proofOptions passed as parameters.
6. Let proofConfig be the result of running the algorithm in Section 3.3.5 Proof Configuration (ecdsa-jcs-2019) with proofOptions passed as the parameter.
7. Let hashData be the result of running the algorithm in Section 3.3.4 Hashing (ecdsa-jcs-2019) with transformedData and proofConfig passed as parameters.
8. Let verified be the result of running the algorithm in Section 3.3.7 Proof Verification (ecdsa-jcs-2019) with hashData, proofBytes, and proofOptions passed as parameters.
9. Return a verification result with item verified, and with item verifiedDocument set to unsecuredDocument if verified is true; otherwise, Null.

The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.3.4 Hashing (ecdsa-jcs-2019).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
1. If options.type is not set to the string DataIntegrityProof or options.cryptosuite is not set to the string ecdsa-jcs-2019, an error MUST be raised and SHOULD convey an error type of PROOF_TRANSFORMATION_ERROR.
2. Let canonicalDocument be the result of applying the JSON Canonicalization Scheme [RFC8785] to the unsecuredDocument.
3. Return canonicalDocument as the transformed data document.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.3.6 Proof Serialization (ecdsa-jcs-2019) or Section 3.3.7 Proof Verification (ecdsa-jcs-2019). The hash algorithm has to match the security level of the curve used; that is, SHA-256 is used with curve P-256 and SHA-384 with curve P-384.
The required inputs to this algorithm are a transformed data document (transformedDocument) and a canonical proof configuration (canonicalProofConfig). A single hash data value represented as series of bytes is produced as output.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are the proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
If proofConfig.type is not set to DataIntegrityProof and/or proofConfig.cryptosuite is not set to ecdsa-jcs-2019, an error MUST be raised and SHOULD convey an error type of PROOF_GENERATION_ERROR.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes), and proof options (options). A verification result represented as a boolean value is produced as output.
The Working Group is seeking implementer feedback on these generalized selective disclosure functions as well as horizontal security review on the features from parties at W3C and IETF. Those reviews might result in significant changes to these functions, migration of these functions to the core Data Integrity specification (for use by other cryptographic suites), or the removal of these functions from the specification during the Candidate Recommendation phase.
The following section contains a set of functions that are used throughout cryptographic suites that perform selective disclosure.
The following algorithm canonicalizes an array of N-Quad [N-QUADS] strings and replaces any blank node identifiers in the canonicalized result using a label map factory function, labelMapFactoryFunction. The required inputs are an array of N-Quad strings (nquads), and a label map factory function (labelMapFactoryFunction). Any custom options can also be passed. An N-Quads representation of the canonicalNQuads as an array of N-Quad strings, with the replaced blank node labels, and a map from the old blank node IDs to the new blank node IDs, labelMap, is produced as output.
The following algorithm canonicalizes a JSON-LD document and replaces any blank node identifiers in the canonicalized result using a label map factory function, labelMapFactoryFunction. The required inputs are a JSON-LD document (document) and a label map factory function (labelMapFactoryFunction). Additional custom options (such as a document loader) can also be passed. An N-Quads representation of the canonicalNQuads as an array of N-Quad strings, with the replaced blank node labels, and a map from the old blank node IDs to the new blank node IDs, labelMap, is produced as output.
The following algorithm creates a label map factory function that uses an input label map to replace canonical blank node identifiers with another value. The required input is a label map, labelMap. A function, labelMapFactoryFunction, is produced as output.
The following algorithm creates a label map factory function that uses an HMAC to replace canonical blank node identifiers with their encoded HMAC digests. The required input is an HMAC (previously initialized with a secret key), HMAC. A function, labelMapFactoryFunction, is produced as output.
A different primitive could be created that returned a label map factory function that would instead sort the resulting HMAC digests and assign labels in the produced label map using a prefix and integers based on their sorted order. This primitive might be useful for selective disclosure schemes, such as BBS, that favor unlinkability over minimizing unrevealed data leakage.
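As a non-normative illustration, a sketch of the HMAC-based label map factory follows, assuming Node.js, an HMAC using SHA-256, and a 'u'-prefixed base64url-no-pad encoding of each digest.

import { createHmac } from "node:crypto";

// Sketch only: returns a labelMapFactoryFunction that maps each canonical
// blank node label (e.g., "c14n0") to an encoded HMAC digest.
function createHmacIdLabelMapFunction(hmacKey: Buffer) {
  return (canonicalIdMap: Map<string, string>): Map<string, string> => {
    const bnodeIdMap = new Map<string, string>();
    for (const [input, c14nLabel] of canonicalIdMap) {
      const digest = createHmac("sha256", hmacKey)
        .update(c14nLabel, "utf8")
        .digest("base64url"); // Node's base64url output is unpadded
      bnodeIdMap.set(input, "u" + digest);
    }
    return bnodeIdMap;
  };
}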
The following algorithm replaces all blank node identifiers in an array of N-Quad strings with custom scheme URNs. The required inputs are an array of N-Quad strings (inputNQuads) and a URN scheme (urnScheme). An array of N-Quad strings, skolemizedNQuads, is produced as output. This operation is intended to be reversible through the use of the algorithm in Section 3.4.6 deskolemizeNQuads.
This can be implemented, for example, using a regular expression in a language such as JavaScript: s1.replace(/(_:([^\s]+))/g, '<urn:custom-scheme:$2>').
The following algorithm replaces all custom scheme URNs in an array of N-Quad statements with a blank node identifier. The required inputs are an array of N-Quad strings (inputNQuads) and a URN scheme (urnScheme). An array of N-Quad strings, deskolemizedNQuads, is produced as output. This operation is intended to reverse the use of the algorithm in Section 3.4.5 skolemizeNQuads.
This can be implemented, for example, using a regular expression in a language such as JavaScript: s1.replace(/(<urn:custom-scheme:([^>]+)>)/g, '_:$2').
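Putting the two regular expressions above together, a non-normative sketch of the skolemize/deskolemize pair over arrays of N-Quad strings follows, with the URN scheme interpolated rather than hard-coded; the scheme string is assumed to contain no regular expression metacharacters.

// Sketch only: replace blank node identifiers with custom-scheme URNs and back.
function skolemizeNQuads(inputNQuads: string[], urnScheme: string): string[] {
  return inputNQuads.map((s1) =>
    s1.replace(/(_:([^\s]+))/g, `<urn:${urnScheme}:$2>`)
  );
}

function deskolemizeNQuads(inputNQuads: string[], urnScheme: string): string[] {
  const pattern = new RegExp(`(<urn:${urnScheme}:([^>]+)>)`, "g");
  return inputNQuads.map((s1) => s1.replace(pattern, "_:$2"));
}

// deskolemizeNQuads(skolemizeNQuads(nquads, "custom-scheme"), "custom-scheme")
// round-trips back to the original nquads.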
The following algorithm replaces all blank node identifiers in an expanded JSON-LD document with custom-scheme URNs, including assigning such URNs to blank nodes that are unlabeled. The required inputs are an expanded JSON-LD document (expanded), a custom URN scheme (urnScheme), a UUID string or other comparably random string (randomString), and a reference to a shared integer (count). Any additional custom options (such as a document loader) can also be passed. It produces the expanded form of the skolemized JSON-LD document (skolemizedExpandedDocument) as output. The skolemization used in this operation is intended to be reversible through the use of the algorithm in Section 3.4.9 toDeskolemizedNQuads.
Note: A blank node identifier expression is a value that starts with _:; for example, given the value _:b0, the blank node identifier is b0.
The following algorithm replaces all blank node identifiers in a compact JSON-LD document with custom-scheme URNs. The required inputs are a compact JSON-LD document (document) and a custom URN scheme (urnScheme), which defaults to "custom-scheme:". The document is assumed to use only one @context property at the top level of the document. Any additional custom options (such as a document loader) can also be passed. It produces both an expanded form of the skolemized JSON-LD document (skolemizedExpandedDocument) and a compact form of the skolemized JSON-LD document (skolemizedCompactDocument) as output. The skolemization used in this operation is intended to be reversible through the use of the algorithm in Section 3.4.9 toDeskolemizedNQuads, which uses the same custom URN scheme (urnScheme).
The following algorithm converts a skolemized JSON-LD document, such as one created using the algorithm in Section 3.4.8 skolemizeCompactJsonLd, to an array of deskolemized N-Quads. The required input is a JSON-LD document, skolemizedDocument. Additional custom options (such as a document loader) can be passed. An array of deskolemized N-Quad strings (deskolemizedNQuads) is produced as output.
The following algorithm converts a JSON Pointer [RFC6901] to an array of paths into a JSON tree. The required input is a JSON Pointer string (pointer). An array of paths (paths) is produced as output.
1. Initialize paths to an empty array.
2. Initialize splitPath to an array by splitting pointer on the / character and skipping the first, empty, split element. In JavaScript notation, this is: pointer.split('/').slice(1).
3. For each path in splitPath: if path does not include ~, then add path to paths, converting it to an integer if it parses as one, leaving it as a string if it does not; otherwise, unescape any JSON Pointer escape sequences (~1 and ~0) in path and add the result to paths.
4. Return paths.
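A direct, non-normative TypeScript transcription of these steps follows.

// Sketch only: convert a JSON Pointer [RFC6901] into an array of paths.
function jsonPointerToPaths(pointer: string): (string | number)[] {
  const paths: (string | number)[] = [];
  const splitPath = pointer.split("/").slice(1); // skip the first, empty, element
  for (const path of splitPath) {
    if (!path.includes("~")) {
      // convert to an integer if it parses as one; leave as a string otherwise
      paths.push(/^\d+$/.test(path) ? parseInt(path, 10) : path);
    } else {
      // unescape JSON Pointer escape sequences: ~1 => "/", ~0 => "~"
      paths.push(path.replace(/~1/g, "/").replace(/~0/g, "~"));
    }
  }
  return paths;
}

// jsonPointerToPaths("/credentialSubject/0/name") => ["credentialSubject", 0, "name"]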
The following algorithm creates an initial selection (a fragment of a JSON-LD document) based on a JSON-LD object. This is a helper function used within the algorithm in Section 3.4.13 selectJsonLd. The required input is a JSON-LD object (source). A JSON-LD document fragment object (selection) is produced as output.
1. Initialize selection to an empty object.
2. If source has an id that is not a blank node identifier, set selection.id to its value. Note: All non-blank node identifiers in the path of any JSON Pointer MUST be included in the selection; this includes any root document identifier.
3. If source.type is set, set selection.type to its value. Note: The selection MUST include all types in the path of any JSON Pointer, including any root document type.
4. Return selection.
The following algorithm selects a portion of a compact JSON-LD document using paths parsed from a JSON Pointer. This is a helper function used within the algorithm in Section 3.4.13 selectJsonLd. The required inputs are an array of paths (paths) parsed from a JSON Pointer, a compact JSON-LD document (document), a selection document (selectionDocument) to be populated, and an array of arrays (arrays) for tracking selected arrays. This algorithm produces no output; instead it populates the given selectionDocument with any values selected via paths.
Where an existing selection value and a source value must be merged, the merge combines a shallow copy of the existing selection value with a deep copy of the source value; in JavaScript notation: {...selectedValue, ...deepCopy(value)}.
The following algorithm selects a portion of a compact JSON-LD document using an array of JSON Pointers. The required inputs are an array of JSON Pointers (pointers) and a compact JSON-LD document (document). The document is assumed to use a JSON-LD context that aliases @id and @type to id and type, respectively, and to use only one @context property at the top level of the document. A new JSON-LD document that represents a selection (selectionDocument) of the original JSON-LD document is produced as output.
1. If pointers is empty, return null. This indicates nothing has been selected from the original document.
2. Initialize selectionDocument to the result of the createInitialSelection algorithm, passing document as source.
3. Initialize the @context property in selectionDocument to a copy of the value of the @context property in document.
4. For each pointer in pointers, walk the document from the root to the value the pointer selects, populating selectionDocument along the way using the jsonPointerToPaths and selectPaths algorithms.
5. Return selectionDocument.
The following algorithm relabels the blank node identifiers in an array of N-Quad strings using a blank node label map. The required inputs are an array of N-Quad strings (nquads) and a blank node label map (labelMap). An array of N-Quad strings with relabeled blank node identifiers (relabeledNQuads) is produced as output.
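A non-normative sketch of this relabeling, reusing the blank node regular expression shown earlier, follows; identifiers without a map entry are left unchanged.

// Sketch only: rewrite each blank node identifier through labelMap.
function relabelBlankNodes(nquads: string[], labelMap: Map<string, string>): string[] {
  return nquads.map((nquad) =>
    nquad.replace(/(_:([^\s]+))/g, (_match, _full, label: string) =>
      `_:${labelMap.get(label) ?? label}`
    )
  );
}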
The following algorithm selects a portion of a skolemized compact JSON-LD document using an array of JSON Pointers, and outputs the resulting canonical N-Quads with any blank node labels replaced using the given label map. The required inputs are an array of JSON Pointers (pointers), a skolemized compact JSON-LD document (skolemizedCompactDocument), and a blank node label map (labelMap). Additional custom options (such as a document loader) can be passed. The document is assumed to use a JSON-LD context that aliases @id and @type to id and type, respectively, and to use only one @context property at the top level of the document. An object containing the new JSON-LD document that represents a selection of the original JSON-LD document (selectionDocument), an array of deskolemized N-Quad strings (deskolemizedNQuads), and an array of canonical N-Quads with replacement blank node labels (nquads) is produced as output.
The following algorithm is used to output canonical N-Quad strings that match custom selections of a compact JSON-LD document. It does this by canonicalizing a compact JSON-LD document (replacing any blank node identifiers using a label map) and grouping the resulting canonical N-Quad strings according to the selection associated with each group. Each group will be defined using an assigned name and array of JSON pointers. The JSON pointers will be used to select portions of the skolemized document, such that the output can be converted to canonical N-Quads to perform group matching.
The required inputs are a compact JSON-LD document (document), a label map factory function (labelMapFactoryFunction), and a map of named group definitions (groupDefinitions). Additional custom options (such as a document loader) can be passed. The document is assumed to use a JSON-LD context that aliases @id and @type to id and type, respectively, and to use only one @context property at the top level of the document. An object containing the created groups (groups), the skolemized compact JSON-LD document (skolemizedCompactDocument), the skolemized expanded JSON-LD document (skolemizedExpandedDocument), the deskolemized N-Quad strings (deskolemizedNQuads), the blank node label map (labelMap), and the canonical N-Quad strings (nquads), is produced as output.
The following algorithm cryptographically hashes an array of mandatory-to-disclose N-Quads using a provided hashing API. The required inputs are an array of mandatory-to-disclose N-Quads (mandatory) and a hashing function (hasher). A cryptographic hash (mandatoryHash) is produced as output.
1. Initialize bytes to the UTF-8 representation of the joined mandatory N-Quads.
2. Initialize mandatoryHash to the result of using hasher to hash bytes.
3. Return mandatoryHash.
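These steps are short enough to show in full. The non-normative sketch below assumes SHA-256 as the default hasher and assumes each N-Quad string ends with its own newline, so the join uses the empty string.

import { createHash } from "node:crypto";

// Sketch only: hash the UTF-8 bytes of the joined mandatory N-Quads.
function hashMandatoryNQuads(
  mandatory: string[],
  hasher: (bytes: Buffer) => Buffer = (bytes) => createHash("sha256").update(bytes).digest()
): Buffer {
  const bytes = Buffer.from(mandatory.join(""), "utf8");
  return hasher(bytes);
}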
The Working Group is seeking implementer feedback on these cryptographic suite functions as well as horizontal security review on the feature from parties at W3C and IETF. Those reviews might result in significant changes to these algorithms, or the removal of the algorithms from the specification during the Candidate Recommendation phase.
This section contains subalgorithms that are useful to the ecdsa-sd-2023 cryptographic suite.
The following algorithm serializes the data that is to be signed by the private key associated with the base proof verification method. The required inputs are the proof options hash (proofHash), the proof-scoped multikey-encoded public key (publicKey), and the mandatory hash (mandatoryHash). A single sign data value, represented as series of bytes, is produced as output.
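A non-normative sketch of this serialization follows, concatenating the three inputs in the order given above.

// Sketch only: signData = proofHash || publicKey || mandatoryHash.
function serializeSignData(
  proofHash: Uint8Array,
  publicKey: Uint8Array,
  mandatoryHash: Uint8Array
): Uint8Array {
  const signData = new Uint8Array(proofHash.length + publicKey.length + mandatoryHash.length);
  signData.set(proofHash, 0);
  signData.set(publicKey, proofHash.length);
  signData.set(mandatoryHash, proofHash.length + publicKey.length);
  return signData;
}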
The following algorithm serializes the base proof value, including the base signature, public key, HMAC key, signatures, and mandatory pointers. The required inputs are a base signature baseSignature, a public key publicKey, an HMAC key hmacKey, an array of signatures, and an array of mandatoryPointers. A single base proof string value is produced as output.
1. Initialize a byte array, proofValue, that starts with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00.
2. Initialize components to an array with five elements containing the values of baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers.
3. CBOR-encode components and append the result to proofValue.
4. Return the base proof as a string with the multibase-base64url-no-pad-encoding of proofValue; that is, a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
The following algorithm parses the components of an ecdsa-sd-2023 selective disclosure base proof value. The required input is a proof value (proofValue). A single object, parsed base proof, containing five elements, using the names baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers, is produced as output.

1. If the proofValue string does not start with u, indicating that it is a multibase-base64url-no-pad-encoded value, an error MUST be raised and SHOULD convey an error type of PROOF_VERIFICATION_ERROR.
2. Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring that follows the leading u in proofValue.
3. If the decodedProofValue does not start with the ECDSA-SD base proof header bytes 0xd9, 0x5d, and 0x00, an error MUST be raised and SHOULD convey an error type of PROOF_VERIFICATION_ERROR.
4. Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte ECDSA-SD base proof header.
5. Return an object with properties set to the five elements of components, using the names baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers, respectively.
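A non-normative sketch of this parsing logic follows; CBOR decoding is delegated to a caller-supplied function so that no particular codec library is assumed, and base64url decoding assumes Node.js Buffer support.

// Sketch only: split a base proof value into its five components.
function parseBaseProofValue(
  proofValue: string,
  cborDecode: (bytes: Uint8Array) => unknown
) {
  if (!proofValue.startsWith("u")) {
    throw new Error("PROOF_VERIFICATION_ERROR: expected multibase base64url-no-pad prefix 'u'");
  }
  const decoded = Buffer.from(proofValue.slice(1), "base64url");
  if (decoded[0] !== 0xd9 || decoded[1] !== 0x5d || decoded[2] !== 0x00) {
    throw new Error("PROOF_VERIFICATION_ERROR: invalid ECDSA-SD base proof header");
  }
  const components = cborDecode(decoded.subarray(3)) as [
    Uint8Array, Uint8Array, Uint8Array, Uint8Array[], string[]
  ];
  const [baseSignature, publicKey, hmacKey, signatures, mandatoryPointers] = components;
  return { baseSignature, publicKey, hmacKey, signatures, mandatoryPointers };
}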
The following algorithm creates data to be used to generate a derived proof. The inputs include a JSON-LD document (document), an ECDSA-SD base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options (such as a document loader). A single object, disclosure data, is produced as output, which contains the "baseSignature", "publicKey", "signatures" for "filteredSignatures", "labelMap", "mandatoryIndexes", and "revealDocument" fields.
"mandatory"
and value of mandatoryPointers, key of the string
"selective"
and value of selectivePointers, and key of the string "combined"
and value of combinedPointers.
0
.
signatures
for filteredSignatures, verifierLabelMap
for labelMap,
mandatoryIndexes, and revealDocument.
The following algorithm compresses a label map. The required input is a label map (labelMap). The output is a compressed label map.
The following algorithm decompresses a label map. The required input is a compressed label map (compressedLabelMap). The output is a decompressed label map.
The following algorithm serializes a derived proof value. The required inputs are a base signature (baseSignature), public key (publicKey), an array of signatures (signatures), a label map (labelMap), and an array of mandatory indexes (mandatoryIndexes). A single derived proof value, serialized as a byte string, is produced as output.
1. Initialize compressedLabelMap to the result of compressing labelMap using the compressLabelMap algorithm.
2. Initialize a byte array, proofValue, that starts with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01.
3. Initialize components to an array with five elements containing the values of baseSignature, publicKey, signatures, compressedLabelMap, and mandatoryIndexes.
4. CBOR-encode components and append the result to proofValue.
5. Return the derived proof as a string with the multibase-base64url-no-pad-encoding of proofValue; that is, a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
The following algorithm parses the components of the derived proof value. The required input is a derived proof value (proofValue). A single derived proof value object is produced as output, which contains a set of five elements, using the names "baseSignature", "publicKey", "signatures", "labelMap", and "mandatoryIndexes".
1. If the proofValue string does not start with u, indicating that it is a multibase-base64url-no-pad-encoded value, an error MUST be raised and SHOULD convey an error type of PROOF_VERIFICATION_ERROR.
2. Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring that follows the leading u in proofValue.
3. If the decodedProofValue does not start with the ECDSA-SD disclosure proof header bytes 0xd9, 0x5d, and 0x01, an error MUST be raised and SHOULD convey an error type of PROOF_VERIFICATION_ERROR.
4. Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte ECDSA-SD disclosure proof header.
5. Replace the fourth element of components, the compressed label map, with the result of the decompressLabelMap algorithm, passing it as compressedLabelMap.
6. Return an object with properties set to the five elements of components, using the names baseSignature, publicKey, signatures, labelMap, and mandatoryIndexes, respectively.
The following algorithm creates the data needed to perform verification of an ECDSA-SD-protected verifiable credential. The inputs include a JSON-LD document (document), an ECDSA-SD disclosure proof (proof), and any custom JSON-LD API options, such as a document loader. A single verify data object value is produced as output containing the following fields: "baseSignature", "proofHash", "publicKey", "signatures", "nonMandatory", and "mandatoryHash".
The Working Group is seeking implementer feedback on this cryptographic suite as well as horizontal security review on the feature from parties at W3C and IETF. Those reviews might result in significant changes to this algorithm, or the removal of the algorithm from the specification during the Candidate Recommendation phase.
The ecdsa-sd-2023 cryptographic suite takes an input document, canonicalizes the document using the RDF Dataset Canonicalization Algorithm [RDF-CANON], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
The following algorithm specifies how to create a data integrity proof given an unsecured data document. Required inputs are an unsecured data document (map unsecuredDocument), and a set of proof options (map options). A data integrity proof (map), or an error, is produced as output.
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.6.3 Base Proof Hashing (ecdsa-sd-2023).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type), a cryptosuite identifier (cryptosuite), and a verification method (verificationMethod). The transformation options MUST contain an array of mandatory JSON pointers (mandatoryPointers) and MAY contain additional options, such as a JSON-LD document loader. A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
Return an object with mandatoryPointers set to mandatoryPointers, mandatory set to mandatory, nonMandatory set to nonMandatory, and hmacKey set to hmacKey.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.6.5 Base Proof Serialization (ecdsa-sd-2023).
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A hash data value represented as an object is produced as output.
1. Initialize proofHash to the result of cryptographically hashing canonicalProofConfig.
2. Initialize mandatoryHash to the result of the hashMandatoryNQuads algorithm, passing transformedDocument.mandatory.
3. Initialize hashData as a deep copy of transformedDocument, and add proofHash as proofHash and mandatoryHash as mandatoryHash to that object.
4. Return hashData.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the base proof hashing algorithm.
The required inputs to this algorithm are proof options (options) and the unsecured data document (unsecuredDocument). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
If proofConfig.type is not set to DataIntegrityProof and/or proofConfig.cryptosuite is not set to ecdsa-sd-2023, an error MUST be raised and SHOULD convey an error type of PROOF_GENERATION_ERROR.
The following algorithm specifies how to create a base proof; called by an issuer of an ECDSA-SD-protected Verifiable Credential. The base proof is to be given only to the holder, who is responsible for generating a derived proof from it, exposing only selectively disclosed details in the proof to a verifier. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as series of bytes is produced as output.
Note: The proof-scoped public key is expressed in compressed form, with a one-byte header of 2 for an even y coordinate and 3 for an odd one, followed by the x coordinate of the public key.
The following algorithm creates a selective disclosure derived proof; called by a holder of an ecdsa-sd-2023-protected verifiable credential. The derived proof is to be given to the verifier. The inputs include a JSON-LD document (document), an ECDSA-SD base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options, such as a document loader. A single selectively revealed document value, represented as an object, is produced as output.
The following algorithm attempts verification of an ecdsa-sd-2023 derived proof. This algorithm is called by a verifier of an ECDSA-SD-protected verifiable credential. The inputs include a JSON-LD document (document), an ECDSA-SD disclosure proof (proof), and any custom JSON-LD API options, such as a document loader. This algorithm returns a verification result:
1. Let unsecuredDocument be a copy of document with the proof value removed.
2. Initialize baseSignature, proofHash, publicKey, signatures, nonMandatory, and mandatoryHash to the values of the corresponding properties in the object returned by the createVerifyData algorithm, passing document, proof, and any custom JSON-LD API options.
3. Initialize verified to true, and initialize toVerify to the result of the serializeSignData algorithm, passing proofHash, publicKey, and mandatoryHash.
4. If verification of baseSignature against toVerify, using the public key of the verification method, returns false, set verified to false.
5. For each signature in signatures, verify it against the UTF-8 representation of the corresponding N-Quad string in nonMandatory, using the proof-scoped public key, publicKey; if any such verification returns false, set verified to false.
6. Return a verification result with item verified, and with item verifiedDocument set to unsecuredDocument if verified is true, otherwise Null.
This section is non-normative.
Before reading this section, readers are urged to familiarize themselves with general security advice provided in the Security Considerations section of the Data Integrity specification.
The integrity and authenticity of a secured document that is protected by this cryptographic suite depends on a number of factors. In the following sections, we review these important points and direct the reader to additional information.
This section is non-normative.
The ECDSA signature scheme has the EUF-CMA (existential unforgeability under chosen message attacks) security property. This property guarantees that any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice (in an adaptive manner) cannot output a valid signature for a new message (except with negligible probability).
SUF-CMA (strong unforgeability under chosen message attacks) is a stronger notion than EUF-CMA. It guarantees that any efficient adversary who has the public key pk of the signer and has received an arbitrary number of signatures on messages of its choice cannot output a new valid signature pair for a new message, nor a new signature for an old message (except with negligible probability). The ECDSA signature scheme does not have the SUF-CMA property, while other schemes, such as EdDSA [FIPS-186-5], do.
Per [NIST-SP-800-57-Part-1], in the absence of large-scale quantum computers, a security strength level of 128 bits requires a key size of approximately 256 bits, while a security strength level of 192 bits requires a key size of 384 bits. The recommendations of [NIST-SP-800-186] include curves P-256 and P-384 at these respective security strength levels.
This section is non-normative.
The ECDSA algorithm as detailed in [FIPS-186-5] states: "A new secret random number k, 0 < k < n, shall be generated prior to the generation of each digital signature for use during the signature generation process." The failure to properly generate this k value has led to some highly publicized integrity breaches in widely deployed systems. To counter this problem, a hash-based method of determining the secret number k, called deterministic ECDSA, is given in [FIPS-186-5] and [RFC6979].
Verification of an ECDSA signature is independent of the method of generating k. Hence, it is generally recommended to use deterministic ECDSA unless other requirements dictate otherwise. For example, using different k values results in different signature values for the same document, which might be a desirable property in some privacy-enhancing situations.
This section is non-normative.
The security of the ECDSA algorithm is dependent on the quality and protection of its private signing key. Guidance in the management of cryptographic keys is a large subject and the reader is referred to [NIST-SP-800-57-Part-1] for more extensive recommendations and discussion. As strongly recommended in both [FIPS-186-5] and [NIST-SP-800-57-Part-1], an ECDSA private signing key is not to be used for any other purpose than ECDSA signatures.
ECDSA private signing keys and public verification keys are strongly advised to have limited cryptoperiods [NIST-SP-800-57-Part-1], where a cryptoperiod is "the time span during which a specific key is authorized for use by legitimate entities or the keys for a given system will remain in effect." [NIST-SP-800-57-Part-1] gives extensive guidance on cryptoperiods for different key types under different situations and generally recommends a 1-3 year cryptoperiod for a private signing key.
To deal with potential private key compromises, [NIST-SP-800-57-Part-1] gives recommendations for protective measures, harm reduction, and revocation. Although we have been emphasizing the security of the private signing key, assurance of public key validity is highly recommended on all public keys before using them, per [NIST-SP-800-57-Part-1].
Before reading this section, readers are urged to familiarize themselves with general privacy advice provided in the Privacy Considerations section of the Data Integrity specification.
The following section describes privacy considerations that developers implementing this specification should be aware of in order to avoid violating privacy assumptions.
The cryptographic suites described in this specification do not support unlinkable disclosure. If unlinkable disclosure is of interest, the Data Integrity BBS Cryptosuites v1.0 specification provides an unlinkable digital signature mechanism.
This section is non-normative.
All test vectors are produced using deterministic ECDSA. The implementation was validated against the test vectors in [RFC6979].
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representations of the public key and the private key are shown below.
{ "publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "secretKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN" }
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
<did:example:abcdefgh> <https://www.w3.org/ns/credentials/examples#alumniOf> "The School of Examples" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/ns/credentials/examples#AlumniCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/description> "A minimum viable example of an Alumni Credential." .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/name> "Alumni Credential" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:abcdefgh> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#issuer> <https://vc.example/issuers/5678> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#validFrom> "2023-01-01T00:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
517744132ae165a5349155bef0bb0cf2258fff99dfe1dbd914b938d775a36017
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ] }
_:c14n0 <http://purl.org/dc/terms/created> "2023-02-24T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> .
_:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-rdfc-2019"^^<https://w3id.org/security#cryptosuiteString> .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> .
_:c14n0 <https://w3id.org/security#verificationMethod> <did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP> .
3a8a522f689025727fb9d1f0fa99a618da023e8494ac74f51015d009d35abc2e
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
3a8a522f689025727fb9d1f0fa99a618da023e8494ac74f51015d009d35abc2e517744132ae165a5349155bef0bb0cf2258fff99dfe1dbd914b938d775a36017
1cb4290918ffb04a55ff7ae1e55e316a9990fda8eec67325eac7fcbf2ddf9dd2b06716a657e72b284c9604df3a172ecbf06a1a475b49ac807b1d9162df855636
zaHXrr7AQdydBk3ahpCDpWbxfLokDqmCToYm2dyWvpcFVyWooC2he63w1f7UNQoAMKdhaRtcnaE2KTo5o5vTCcfw
Assemble the signed credential with the following two steps:
1. Add a proofValue field with the previously computed base-58-btc value to the proof options document.
2. Set the proof field of the credential to the augmented proof options document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "zaHXrr7AQdydBk3ahpCDpWbxfLokDqmCToYm2dyWvpcFVyWooC2he63w1f7UNQoAMKdhaRtcnaE2KTo5o5vTCcfw" } }
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representations of the public key and the private key are shown below.
{ "publicKeyMultibase": "z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "secretKeyMultibase": "z2fanyY7zgwNpZGxX5fXXibvScNaUWNprHU9dKx7qpVj7mws9J8LLt4mDB5TyH2GLHWkUc" }
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
<did:example:abcdefgh> <https://www.w3.org/ns/credentials/examples#alumniOf> "The School of Examples" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/ns/credentials/examples#AlumniCredential> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/description> "A minimum viable example of an Alumni Credential." .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://schema.org/name> "Alumni Credential" .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:abcdefgh> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#issuer> <https://vc.example/issuers/5678> .
<urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33> <https://www.w3.org/2018/credentials#validFrom> "2023-01-01T00:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
8bf6e01df72c5b62f91b685231915ac4b8c58ea95f002c6b8f6bfafa1b251df476b56b8e01518e317dab099d3ecbff96
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ] }
_:c14n0 <http://purl.org/dc/terms/created> "2023-02-24T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> .
_:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-rdfc-2019"^^<https://w3id.org/security#cryptosuiteString> .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> .
_:c14n0 <https://w3id.org/security#verificationMethod> <did:key:z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ> .
e32805a26492eac777aa7a138f6d8da3c74e0c7be7b296dcaccf97420c3b92eaad7be6449ca565e165031567f5c7cbc1
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
e32805a26492eac777aa7a138f6d8da3c74e0c7be7b296dcaccf97420c3b92eaad7be6449ca565e165031567f5c7cbc18bf6e01df72c5b62f91b685231915ac4b8c58ea95f002c6b8f6bfafa1b251df476b56b8e01518e317dab099d3ecbff96
177ac088806c2506d49f0bfec16056a6a80ace62cd029888ad561aba22a59d192d77d9b1fc28df80dea5ee6c8bceb16f1b8bff6bd6ff2d8f8778bdde48bafa7b6cc1f914c0168b5c04499882f632deea9cb7d977e888bb0e1ee9fb20ff03b025
z967Mvv5bxtmLNqTzPZ8KmJjFmFXaAKeQNzq7GWnQkMcLtaGSSmuozE5WtJ8PipMe178B1tE28K1vsJur9bGVJhz6jgSJsRHFSQeqgH8hhjcg8gZDFJC1b9FsR5ggNmDBqHv
Assemble the signed credential with the following two steps:
1. Add a proofValue field with the previously computed base-58-btc value to the proof options document.
2. Set the proof field of the credential to the augmented proof options document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-rdfc-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod", "proofValue": "z967Mvv5bxtmLNqTzPZ8KmJjFmFXaAKeQNzq7GWnQkMcLtaGSSmuozE5WtJ8PipMe178B1tE28K1vsJur9bGVJhz6jgSJsRHFSQeqgH8hhjcg8gZDFJC1b9FsR5ggNmDBqHv" } }
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representations of the public key and the private key are shown below.
{ "publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "secretKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN" }
Signing begins with a credential without an attached proof, which is converted to canonical form and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
{"@context":["https://www.w3.org/ns/credentials/v2","https://www.w3.org/ns/credentials/examples/v2"],"credentialSubject":{"alumniOf":"The School of Examples","id":"did:example:abcdefgh"},"description":"A minimum viable example of an Alumni Credential.","id":"urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33","issuer":"https://vc.example/issuers/5678","name":"Alumni Credential","type":["VerifiableCredential","AlumniCredential"],"validFrom":"2023-01-01T00:00:00Z"}
59b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod" }
{"created":"2023-02-24T23:36:38Z","cryptosuite":"ecdsa-jcs-2019","proofPurpose":"assertionMethod","type":"DataIntegrityProof","verificationMethod":"did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP"}
9c4b552d65fcb106b6b27ec2436d8ae81b319afc7aeaab7964b2938cd120cec3
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
9c4b552d65fcb106b6b27ec2436d8ae81b319afc7aeaab7964b2938cd120cec359b7cb6251b8991add1ce0bc83107e3db9dbbab5bd2c28f687db1a03abc92f19
29793c6684cb7c3628e09ed5f14d06a37835cc9564b50fe1829c252c6bb9d14f52513c92533508ea4b4938ed523597b40b9584f395537b87592dc60d0f5ea9b7
zq6PrUMCtqY5obCSsrQxuFJdGffCDxvFuopdZiBPUBRTFEs1VVsBZi8YiEwVWgHYrXxoV93gBHqGDBtQLPFxpZxz
Assemble the signed credential with the following three steps:
1. Add the proofValue field, with the previously computed base-58-btc value, to the proof options document.
2. Set the @context field to the value of unsecuredDocument.@context.
3. Set the proof field of the credential to the augmented proof options document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "zq6PrUMCtqY5obCSsrQxuFJdGffCDxvFuopdZiBPUBRTFEs1VVsBZi8YiEwVWgHYrXxoV93gBHqGDBtQLPFxpZxz" } }
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key, are shown below.
{ "publicKeyMultibase": "z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "secretKeyMultibase": "z2fanyY7zgwNpZGxX5fXXibvScNaUWNprHU9dKx7qpVj7mws9J8LLt4mDB5TyH2GLHWkUc" }
Signing begins with a credential without an attached proof; this is converted to canonical form and then hashed, as shown in the following three examples.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": ["VerifiableCredential", "AlumniCredential"], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" } }
{"@context":["https://www.w3.org/ns/credentials/v2","https://www.w3.org/ns/credentials/examples/v2"],"credentialSubject":{"alumniOf":"The School of Examples","id":"did:example:abcdefgh"},"description":"A minimum viable example of an Alumni Credential.","id":"urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33","issuer":"https://vc.example/issuers/5678","name":"Alumni Credential","type":["VerifiableCredential","AlumniCredential"],"validFrom":"2023-01-01T00:00:00Z"}
3e0be671cc1881035d463158c80921973dab3534d4f8dfacf4ff2725a4115eb718e49d66de0e90e7365cd6062abf2259
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod" }
{"created":"2023-02-24T23:36:38Z","cryptosuite":"ecdsa-jcs-2019","proofPurpose":"assertionMethod","type":"DataIntegrityProof","verificationMethod":"did:key:z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ"}
bb8fd231f3f5441c089210bb991ca21910ecb2c62e82111f9d4de1ac0db3e6f2fc5dbe33cffc3a08ec1165b7b3a8bedd
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the ECDSA signature, and then base-58-btc encode the signature.
bb8fd231f3f5441c089210bb991ca21910ecb2c62e82111f9d4de1ac0db3e6f2fc5dbe33cffc3a08ec1165b7b3a8bedd3e0be671cc1881035d463158c80921973dab3534d4f8dfacf4ff2725a4115eb718e49d66de0e90e7365cd6062abf2259
45cf28964c4335b76efc2d3c9893735e3f2330f41928b5c18b507d1d6b6bd5a4c5ef3885b90d93ae6d79be92cf817b844dc444e4b645fa2f1f9f9251e97428edf49a412be93b348269f76cb39d20140fb621ee004e3500a5dbecd677530a146c
zR3krc7pt9Dpn1PY2u8HWPePggtzAG2SuH2ZGNrzw8Ku3QrFLWgDTjS5mCWy65ShixJEtpMwfviAFfh5xTgd5FGN1sSbpPJ5djqSZQLqQkY8KLBPNswuqKtq3bS8f9vEHm8w
Assemble the signed credential with the following three steps:
1. Add the proofValue field, with the previously computed base-58-btc value, to the proof options document.
2. Set the @context field to the value of unsecuredDocument.@context.
3. Set the proof field of the credential to the augmented proof options document.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", "https://www.w3.org/ns/credentials/examples/v2" ], "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33", "type": [ "VerifiableCredential", "AlumniCredential" ], "name": "Alumni Credential", "description": "A minimum viable example of an Alumni Credential.", "issuer": "https://vc.example/issuers/5678", "validFrom": "2023-01-01T00:00:00Z", "credentialSubject": { "id": "did:example:abcdefgh", "alumniOf": "The School of Examples" }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-jcs-2019", "created": "2023-02-24T23:36:38Z", "verificationMethod": "did:key:z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ#z82LkuBieyGShVBhvtE2zoiD6Kma4tJGFtkAhxR5pfkp5QPw4LutoYWhvQCnGjdVn14kujQ", "proofPurpose": "assertionMethod", "proofValue": "zR3krc7pt9Dpn1PY2u8HWPePggtzAG2SuH2ZGNrzw8Ku3QrFLWgDTjS5mCWy65ShixJEtpMwfviAFfh5xTgd5FGN1sSbpPJ5djqSZQLqQkY8KLBPNswuqKtq3bS8f9vEHm8w" } }
Demonstrating selective disclosure features, including mandatory disclosure, selective disclosure, and overlap between mandatory and selective disclosure, requires an input credential document with more content than the previous test vectors. To avoid excessively long test vectors, the starting document is based on a purely fictitious windsurfing (sailing) competition scenario. In addition, we break the test vectors into two groups: those that would be generated by the issuer (base proof) and those that would be generated by the holder (derived proof).
In order to add a selective disclosure base proof to a document, the issuer needs the following cryptographic key material: their long-term base key pair, a per-proof key pair, and an HMAC key. The key material used for generating the add base proof test vectors is shown below. Multibase representation is used for the P-256 key pairs, and the HMAC key is given as a hexadecimal string.
{ "baseKeyPair": { "publicKeyMultibase": "zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "secretKeyMultibase": "z42twTcNeSYcnqg1FLuSFs2bsGH3ZqbRHFmvS9XMsYhjxvHN" }, "proofKeyPair": { "publicKeyMultibase": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "secretKeyMultibase": "z42tqvNGyzyXRzotAYn43UhcFtzDUVdxJ7461fwrfhBPLmfY" }, "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF" }
In our scenario, a sailor is registering with a race organizer for a series of windsurfing races to be held over a number of days on Maui. The organizer will inspect the sailor's equipment to certify that what has been declared is accurate. The sailor's unsigned equipment inventory is shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "issuer": "https://vc.example/windsurf/racecommittee", "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 5.5, "sailName": "Kihei", "year": 2023 }, { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7.0, "sailName": "Lahaina", "year": 2020 }, { "size": 7.8, "sailName": "Lahaina", "year": 2023 } ], "boards": [ { "boardName": "CompFoil170", "brand": "Wailea", "year": 2022 }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] } }
In addition, to let other sailors know what kinds of equipment their competitors may be sailing on, it is mandatory that each sailor disclose the year of their most recent windsurfing board and full details on two of their sails. Note that all sailors are identified by a sail number that is printed on all their equipment. This mandatory information is specified via an array of JSON pointers, as shown below.
["/issuer", "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2"]
The result of applying the above JSON pointers to the sailor's equipment document is shown below.
[ { "pointer": "/credentialSubject/sailNumber", "value": "Earth101" }, { "pointer": "/credentialSubject/sails/1", "value": { "size": 6.1, "sailName": "Lahaina", "year": 2023 } }, { "pointer": "/credentialSubject/boards/0/year", "value": 2022 }, { "pointer": "/credentialSubject/sails/2", "value": { "size": 7, "sailName": "Lahaina", "year": 2020 } } ]
Transformation of the unsigned document begins with canonicalizing the document as shown below.
[ "_:c14n0 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n", "_:c14n0 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:c14n0 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n2 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n", "_:c14n2 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:c14n2 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#boards> _:c14n0 .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#boards> _:c14n2 .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sails> _:c14n1 .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sails> _:c14n3 .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sails> _:c14n4 .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sails> _:c14n6 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n7 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n", "_:c14n7 <https://www.w3.org/2018/credentials#credentialSubject> _:c14n5 .\n", "_:c14n7 <https://www.w3.org/2018/credentials#issuer> <https://vc.example/windsurf/racecommittee> .\n" ]
To prevent possible information leakage from the ordering of the blank node ids, these are processed through a PRF, i.e., the HMAC, to give the canonicalized HMAC document shown below. This represents an ordered list of statements that will be subject to mandatory and selective disclosure, i.e., it is from this list that statements are grouped.
[ "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n", "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://www.w3.org/2018/credentials#credentialSubject> _:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 .\n", "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://www.w3.org/2018/credentials#issuer> <https://vc.example/windsurf/racecommittee> .\n", "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n", "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#boards> _:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#boards> _:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk .\n", "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc .\n", "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n", "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n", "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#size> 
\"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ]
This canonicalized HMAC document is grouped into mandatory and non-mandatory statements. The final output of the selective disclosure transformation process is shown below. Each statement is grouped as mandatory or non-mandatory, and its index in the previous list of statements is retained.
{ "mandatoryPointers": [ "/issuer", "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ], "mandatory": { "dataType": "Map", "value": [ [ 0, "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n" ], [ 1, "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://www.w3.org/2018/credentials#credentialSubject> _:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 .\n" ], [ 2, "_:u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY <https://www.w3.org/2018/credentials#issuer> <https://vc.example/windsurf/racecommittee> .\n" ], [ 8, "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 9, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#boards> _:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 .\n" ], [ 11, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n" ], [ 14, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk .\n" ], [ 15, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc .\n" ], [ 22, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 23, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n" ], [ 24, "_:uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 25, "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 26, "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 27, "_:ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ] ] }, "nonMandatory": { "dataType": "Map", "value": [ [ 3, "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 4, "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n" ], [ 5, "_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 6, "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n" ], [ 7, "_:u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n" ], [ 10, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#boards> _:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw .\n" ], [ 12, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> 
_:u3Lv2QpFgo-YAegc1cQQKWJFW2sEjQF6FfuZ0VEoMKHg .\n" ], [ 13, "_:uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0 <https://windsurf.grotto-networking.com/selective#sails> _:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ .\n" ], [ 16, "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n" ], [ 17, "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n" ], [ 18, "_:uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 19, "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n" ], [ 20, "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n" ], [ 21, "_:ufUWJRHQ9j1jmUKHLL8k6m0CZ8g4v73gOpaM5kL3ZACQ <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ] ] }, "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF" }
The next step is to create the base proof configuration and canonicalize it. This is shown in the following two examples.
{ "type": "DataIntegrityProof", "cryptosuite": "ecdsa-sd-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ] }
_:c14n0 <http://purl.org/dc/terms/created> "2023-08-15T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> . _:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> . _:c14n0 <https://w3id.org/security#cryptosuite> "ecdsa-sd-2023"^^<https://w3id.org/security#cryptosuiteString> . _:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> . _:c14n0 <https://w3id.org/security#verificationMethod> <did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP> .
In the hashing step, we compute the SHA-256 hash of the canonicalized proof options to produce the proofHash, and we compute the SHA-256 hash of the join of all the mandatory nquads to produce the mandatoryHash. These are shown below in hexadecimal format.
{ "proofHash": "9c5c9b189f06cfa9d9f21a838ccb9b04316f07ad1a517bfd4955ee28c6a8229c", "mandatoryHash": "1994f6ce32483940032e56987a2a0cb6ea78b50944a77750c57f63801e30eca0" }
We compute the baseSignature over the concatenation of the proofHash, proofPublicKey, and mandatoryHash using the issuer's long-term privateKey. We compute the signatures array by signing each non-mandatory nquad using the per-proof proofPrivateKey. These signatures, the proofPublicKey, and the mandatoryPointers, which are fed to the final serialization step, are shown below.
{ "baseSignature": "91628c3bcce94667dc50c92c1d432d64cedc26df0f98d2ec9632936214a2f2067bc005a72b806b39992b1f7759bf1296967c461bfd319455e6539b1ae7e8fbd5", "publicKey": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "signatures": [ "29560a336e740c370d58e610a549980b567935984947089ef2f5547aff78406b2df375a1a16eff50ce8950bb0a8cd5478f1665650ca8bcde31c3533f9a19fa4c", "b2a312833d0479a3ded079a8e5237524d6660a3899f8234907849c9b22b8eacee10e8b4d66e187c4a19a0050b8dced05c5c29e502f7aabfcfd3c6785e42dd5a5", "99ea0453c235673c45c91f90331c284a4a911bd13cfc2692ae11fc4c3dadfad577d872800b884990a97ac47bb3e972f61be574726e9dfeb5a8ce386699568c8d", "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "6acdb0c7d6dc123d1287bb7cc3d0a3fb616971e1a975185a2d9c58b356441bcfa86d48dbd021cec87f12eeec03b67ee8496da8096d92a59be55feda35b5eeb9a", "7b7e1e43cfa0247c906a1634126661f26668cb4b2f9e94e391d70b9eba0b201b225647c2ccc28b396784b566545672017ae813f08d82ee69e6a47363768d9de2", "33ca240945fc178198c7dae59d20efaf9a531cf3a338e278ec9cc5743b57fd0e1b6715b02c6c2a96d6288c379ccade28c501d8cf9e594678535cb1ea6bbe01ed", "3bf1e3d2fc6cb89673a551ad82830a2b666518abe12d7dffbd40af74bf8c4e5b3356bda20b75c9a426ce73c07f5bf3a7f9cb1a2cfcd44af9520ed35e73565593", "766d87b78b2ffde737b351d1a9eba5fbec841b1e12ae9c3235045d2dae592b224382aaf887e12d54dcdcf89f95965bcaac737cc012ad52a6aba88e0da67af1ed", "39130b5d8cf7979f4ecdced20e4d9e8fb725adeafd067e9e6815101bbef702a439e9b73ea065de7a67a4c1908d1e314b38448da0dabba9eb4ef0545b8851dbe0", "5bea2d48556550f166835d7d9f74de484eeea798414b7e00a8fd93194400e690c6c8e4deb6b7fcaaaddb5a3d65a42bb567ac84649950ea7acb882a1a3558692f", "d705bf1e1b133d47d496a317e9932c2de9bc85b59a15efeb643a4fa7934dd36bcc1c89c38ced469f406b5d4c9500c9d36377c6ac38ecbb2dac1a033347e6e8d7", "bce1925c7e75791a025ad57d2e5a590f5d22c7422eb950a76ade5f47153b86a1acd00cccd3d906b26b833118e8c29e75c6188527788c6a320e3a159484fc470a" ], "mandatoryPointers": [ "/issuer", "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ] }
Finally, the values above are run through the algorithm of Section 3.5.2 serializeBaseProofValue to produce the proofValue, which is used in the signed base document shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "issuer": "https://vc.example/windsurf/racecommittee", "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 5.5, "sailName": "Kihei", "year": 2023 }, { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 }, { "size": 7.8, "sailName": "Lahaina", "year": 2023 } ], "boards": [ { "boardName": "CompFoil170", "brand": "Wailea", "year": 2022 }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-sd-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "u2V0AhVhAkWKMO8zpRmfcUMksHUMtZM7cJt8PmNLsljKTYhSi8gZ7wAWnK4BrOZkrH3dZvxKWlnxGG_0xlFXmU5sa5-j71VgjgCQCKnLOGbY_FuM-ASpSkkOxsIR2E8n7Ml2q1UQ6tEwzi5NYIAARIjNEVWZ3iJmqu8zd7v8AESIzRFVmd4iZqrvM3e7_jlhAKVYKM250DDcNWOYQpUmYC1Z5NZhJRwie8vVUev94QGst83WhoW7_UM6JULsKjNVHjxZlZQyovN4xw1M_mhn6TFhAsqMSgz0EeaPe0Hmo5SN1JNZmCjiZ-CNJB4ScmyK46s7hDotNZuGHxKGaAFC43O0FxcKeUC96q_z9PGeF5C3VpVhAmeoEU8I1ZzxFyR-QMxwoSkqRG9E8_CaSrhH8TD2t-tV32HKAC4hJkKl6xHuz6XL2G-V0cm6d_rWozjhmmVaMjVhAbQMMckpcMAEo15WC6C8Mo3bCEWFGtOTkMxND-LJMdfkCSovB7RnCR7SXzk5-0YVigtJ5Fzg71AAob5yg1WNNk1hApQHlYRGlUVkv-WX1OjJYJ19Ow7ipvVwUvm90Sn3IjNRLuy9pr5DHm3wVlVMPVpLqjS-E8_jJDeJV5pY0bfK_A1hAas2wx9bcEj0Sh7t8w9Cj-2FpceGpdRhaLZxYs1ZEG8-obUjb0CHOyH8S7uwDtn7oSW2oCW2SpZvlX-2jW17rmlhAe34eQ8-gJHyQahY0EmZh8mZoy0svnpTjkdcLnroLIBsiVkfCzMKLOWeEtWZUVnIBeugT8I2C7mnmpHNjdo2d4lhAM8okCUX8F4GYx9rlnSDvr5pTHPOjOOJ47JzFdDtX_Q4bZxWwLGwqltYojDecyt4oxQHYz55ZRnhTXLHqa74B7VhAO_Hj0vxsuJZzpVGtgoMKK2ZlGKvhLX3_vUCvdL-MTlszVr2iC3XJpCbOc8B_W_On-csaLPzUSvlSDtNec1ZVk1hAdm2Ht4sv_ec3s1HRqeul--yEGx4SrpwyNQRdLa5ZKyJDgqr4h-EtVNzc-J-VllvKrHN8wBKtUqarqI4Npnrx7VhAORMLXYz3l59Ozc7SDk2ej7clrer9Bn6eaBUQG773AqQ56bc-oGXeemekwZCNHjFLOESNoNq7qetO8FRbiFHb4FhAW-otSFVlUPFmg119n3TeSE7up5hBS34AqP2TGUQA5pDGyOTetrf8qq3bWj1lpCu1Z6yEZJlQ6nrLiCoaNVhpL1hA1wW_HhsTPUfUlqMX6ZMsLem8hbWaFe_rZDpPp5NN02vMHInDjO1Gn0BrXUyVAMnTY3fGrDjsuy2sGgMzR-bo11hAvOGSXH51eRoCWtV9LlpZD10ix0IuuVCnat5fRxU7hqGs0AzM09kGsmuDMRjowp51xhiFJ3iMajIOOhWUhPxHCoVnL2lzc3VlcngdL2NyZWRlbnRpYWxTdWJqZWN0L3NhaWxOdW1iZXJ4Gi9jcmVkZW50aWFsU3ViamVjdC9zYWlscy8xeCAvY3JlZGVudGlhbFN1YmplY3QvYm9hcmRzLzAveWVhcngaL2NyZWRlbnRpYWxTdWJqZWN0L3NhaWxzLzI" } }
In order to create a derived proof, a holder starts with a signed document containing a base proof. The base document we will use for these test vectors is the final example from Section A.5.1 Base Proof above. The first step is to run the algorithm of Section 3.5.3 parseBaseProofValue to recover baseSignature, publicKey, hmacKey, signatures, and mandatoryPointers, as shown below.
{ "baseSignature": "91628c3bcce94667dc50c92c1d432d64cedc26df0f98d2ec9632936214a2f2067bc005a72b806b39992b1f7759bf1296967c461bfd319455e6539b1ae7e8fbd5", "proofPublicKey": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "hmacKey": "00112233445566778899aabbccddeeff00112233445566778899aabbccddeeff", "signatures": [ "29560a336e740c370d58e610a549980b567935984947089ef2f5547aff78406b2df375a1a16eff50ce8950bb0a8cd5478f1665650ca8bcde31c3533f9a19fa4c", "b2a312833d0479a3ded079a8e5237524d6660a3899f8234907849c9b22b8eacee10e8b4d66e187c4a19a0050b8dced05c5c29e502f7aabfcfd3c6785e42dd5a5", "99ea0453c235673c45c91f90331c284a4a911bd13cfc2692ae11fc4c3dadfad577d872800b884990a97ac47bb3e972f61be574726e9dfeb5a8ce386699568c8d", "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "6acdb0c7d6dc123d1287bb7cc3d0a3fb616971e1a975185a2d9c58b356441bcfa86d48dbd021cec87f12eeec03b67ee8496da8096d92a59be55feda35b5eeb9a", "7b7e1e43cfa0247c906a1634126661f26668cb4b2f9e94e391d70b9eba0b201b225647c2ccc28b396784b566545672017ae813f08d82ee69e6a47363768d9de2", "33ca240945fc178198c7dae59d20efaf9a531cf3a338e278ec9cc5743b57fd0e1b6715b02c6c2a96d6288c379ccade28c501d8cf9e594678535cb1ea6bbe01ed", "3bf1e3d2fc6cb89673a551ad82830a2b666518abe12d7dffbd40af74bf8c4e5b3356bda20b75c9a426ce73c07f5bf3a7f9cb1a2cfcd44af9520ed35e73565593", "766d87b78b2ffde737b351d1a9eba5fbec841b1e12ae9c3235045d2dae592b224382aaf887e12d54dcdcf89f95965bcaac737cc012ad52a6aba88e0da67af1ed", "39130b5d8cf7979f4ecdced20e4d9e8fb725adeafd067e9e6815101bbef702a439e9b73ea065de7a67a4c1908d1e314b38448da0dabba9eb4ef0545b8851dbe0", "5bea2d48556550f166835d7d9f74de484eeea798414b7e00a8fd93194400e690c6c8e4deb6b7fcaaaddb5a3d65a42bb567ac84649950ea7acb882a1a3558692f", "d705bf1e1b133d47d496a317e9932c2de9bc85b59a15efeb643a4fa7934dd36bcc1c89c38ced469f406b5d4c9500c9d36377c6ac38ecbb2dac1a033347e6e8d7", "bce1925c7e75791a025ad57d2e5a590f5d22c7422eb950a76ade5f47153b86a1acd00cccd3d906b26b833118e8c29e75c6188527788c6a320e3a159484fc470a" ], "mandatoryPointers": [ "/issuer", "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ] }
Next, the holder needs to indicate what else, if anything, they wish to reveal to the verifiers, by specifying JSON pointers for selective disclosure. In our windsurfing competition scenario, a sailor (the holder) has just completed their first day of racing and wishes to reveal to the general public (the verifiers) all the details of the windsurfing boards they used in the competition. These are shown below. Note that this slightly overlaps with the mandatorily disclosed information, which included only the year of their most recent board.
["/credentialSubject/boards/0", "/credentialSubject/boards/1"]
To produce the revealDocument, i.e., the unsigned document that will eventually be signed and sent to the verifier, we append the selective pointers to the mandatory pointers and input these combined pointers, along with the document without proof, to the algorithm of Section 3.4.13 selectJsonLd to give the result shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "issuer": "https://vc.example/windsurf/racecommittee", "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 } ], "boards": [ { "year": 2022, "boardName": "CompFoil170", "brand": "Wailea" }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] } }
Now that we know what the revealed document looks like, we need to furnish appropriately updated information to the verifier about which statements are mandatory, the signatures for the selected non-mandatory statements, and the mapping between canonical blank node ids for the reveal document and a subset of the HMAC blank node ids. Running step 6 of the algorithm of Section 3.5.4 createDisclosureData yields an abundance of information about the various statement groups relative to the original document. Below we show a portion of the indexes for those groups.
{ "combinedIndexes":[0,1,2,6,7,8,9,10,11,14,15,16,17,18,22,23,24,25,26,27], "mandatoryIndexes":[0,1,2,8,9,11,14,15,22,23,24,25,26,27], "nonMandatoryIndexes":[3,4,5,6,7,10,12,13,16,17,18,19,20,21], "selectiveIndexes":[0,1,6,7,8,9,10,16,17,18] }
The verifier needs to be able to aggregate and hash the mandatory statements. To enable this, we furnish them with a list of indexes of the mandatory statements adjusted to their positions in the reveal document. In the previous example, the combinedIndexes show, in order, the indexes of all the original nquads (statements) that make up the reveal document. To come up with the adjusted mandatory indexes shown below, we obtain the index of each original mandatory index relative to the combinedIndexes.
{"adjMandatoryIndexes":[0,1,2,5,6,8,9,10,14,15,16,17,18,19]}
We also have to furnish the verifier with a list of signatures for those selective statements (nquads) that are not mandatory. The original list of signatures corresponds to every non-mandatory statement, and the indexes of these in the original document are given above. We now compute a list of adjusted signature indexes by computing the index of each selective index in the non-mandatory index list, ignoring any selective index not present in that list. We then use the adjusted signature indexes to obtain the filtered signature list. Both lists are shown below.
{ "adjSignatureIndexes": [3, 4, 5, 8, 9, 10], "filteredSignatures": [ "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "6acdb0c7d6dc123d1287bb7cc3d0a3fb616971e1a975185a2d9c58b356441bcfa86d48dbd021cec87f12eeec03b67ee8496da8096d92a59be55feda35b5eeb9a", "3bf1e3d2fc6cb89673a551ad82830a2b666518abe12d7dffbd40af74bf8c4e5b3356bda20b75c9a426ce73c07f5bf3a7f9cb1a2cfcd44af9520ed35e73565593", "766d87b78b2ffde737b351d1a9eba5fbec841b1e12ae9c3235045d2dae592b224382aaf887e12d54dcdcf89f95965bcaac737cc012ad52a6aba88e0da67af1ed", "39130b5d8cf7979f4ecdced20e4d9e8fb725adeafd067e9e6815101bbef702a439e9b73ea065de7a67a4c1908d1e314b38448da0dabba9eb4ef0545b8851dbe0" ] }
The last important piece of disclosure data is a mapping of canonical blank node ids to HMAC-based ids, the labelMap, computed according to Section 3.5.4 createDisclosureData, steps 12-14. This is shown below, along with the rest of the disclosure data minus the reveal document.
{ "baseSignature": "91628c3bcce94667dc50c92c1d432d64cedc26df0f98d2ec9632936214a2f2067bc005a72b806b39992b1f7759bf1296967c461bfd319455e6539b1ae7e8fbd5", "publicKey": "zDnaeTHfhmSaQKBc7CmdL3K7oYg3D6SC7yowe2eBeVd2DH32r", "signatures": [ "6d030c724a5c300128d79582e82f0ca376c2116146b4e4e4331343f8b24c75f9024a8bc1ed19c247b497ce4e7ed1856282d27917383bd400286f9ca0d5634d93", "a501e56111a551592ff965f53a3258275f4ec3b8a9bd5c14be6f744a7dc88cd44bbb2f69af90c79b7c1595530f5692ea8d2f84f3f8c90de255e696346df2bf03", "6acdb0c7d6dc123d1287bb7cc3d0a3fb616971e1a975185a2d9c58b356441bcfa86d48dbd021cec87f12eeec03b67ee8496da8096d92a59be55feda35b5eeb9a", "3bf1e3d2fc6cb89673a551ad82830a2b666518abe12d7dffbd40af74bf8c4e5b3356bda20b75c9a426ce73c07f5bf3a7f9cb1a2cfcd44af9520ed35e73565593", "766d87b78b2ffde737b351d1a9eba5fbec841b1e12ae9c3235045d2dae592b224382aaf887e12d54dcdcf89f95965bcaac737cc012ad52a6aba88e0da67af1ed", "39130b5d8cf7979f4ecdced20e4d9e8fb725adeafd067e9e6815101bbef702a439e9b73ea065de7a67a4c1908d1e314b38448da0dabba9eb4ef0545b8851dbe0" ], "labelMap": { "dataType": "Map", "value": [ [ "c14n0", "u4YIOZn1MHES1Z4Ij2hWZG3R4dEYBqg5fHTyDEvYhC38" ], [ "c14n1", "uVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKw" ], [ "c14n2", "uQ-qOZUDlozRsGk46ux9gp9fjT28Fy3g3nctmMoqi_U0" ], [ "c14n3", "ukR2991GJuy_Tkjem_x7pLVpS4C4GkZAcuGtiPhBfSSc" ], [ "c14n4", "uk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDk" ], [ "c14n5", "u2IE-HtO6PyHQsGnuqhO1mX6V7RkRREhF0d0sWZlxNOY" ] ] }, "mandatoryIndexes": [ 0, 1, 2, 5, 6, 8, 9, 10, 14, 15, 16, 17, 18, 19 ] }
Finally, using the disclosure data above with the algorithm of Section 3.5.7 serializeDerivedProofValue, we obtain the signed derived (reveal) document shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "issuer": "https://vc.example/windsurf/racecommittee", "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 } ], "boards": [ { "year": 2022, "boardName": "CompFoil170", "brand": "Wailea" }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "ecdsa-sd-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP#zDnaepBuvsQ8cpsWrVKw8fbpGpvPeNSjVPTWoq6cRqaYzBKVP", "proofPurpose": "assertionMethod", "proofValue": "u2V0BhVhAkWKMO8zpRmfcUMksHUMtZM7cJt8PmNLsljKTYhSi8gZ7wAWnK4BrOZkrH3dZvxKWlnxGG_0xlFXmU5sa5-j71VgjgCQCKnLOGbY_FuM-ASpSkkOxsIR2E8n7Ml2q1UQ6tEwzi5OGWEBtAwxySlwwASjXlYLoLwyjdsIRYUa05OQzE0P4skx1-QJKi8HtGcJHtJfOTn7RhWKC0nkXODvUAChvnKDVY02TWEClAeVhEaVRWS_5ZfU6MlgnX07DuKm9XBS-b3RKfciM1Eu7L2mvkMebfBWVUw9WkuqNL4Tz-MkN4lXmljRt8r8DWEBqzbDH1twSPRKHu3zD0KP7YWlx4al1GFotnFizVkQbz6htSNvQIc7IfxLu7AO2fuhJbagJbZKlm-Vf7aNbXuuaWEA78ePS_Gy4lnOlUa2CgworZmUYq-Etff-9QK90v4xOWzNWvaILdcmkJs5zwH9b86f5yxos_NRK-VIO015zVlWTWEB2bYe3iy_95zezUdGp66X77IQbHhKunDI1BF0trlkrIkOCqviH4S1U3Nz4n5WWW8qsc3zAEq1Spquojg2mevHtWEA5EwtdjPeXn07NztIOTZ6PtyWt6v0Gfp5oFRAbvvcCpDnptz6gZd56Z6TBkI0eMUs4RI2g2rup607wVFuIUdvgpgBYIOGCDmZ9TBxEtWeCI9oVmRt0eHRGAaoOXx08gxL2IQt_AVggVkUuBrlOaELGVQWJD4M_qW5bcKEHWGNbOrPA_qAOKKwCWCBD6o5lQOWjNGwaTjq7H2Cn1-NPbwXLeDedy2YyiqL9TQNYIJEdvfdRibsv05I3pv8e6S1aUuAuBpGQHLhrYj4QX0knBFggk0AeXgJ4e6m1XsV5-xFud0L_1mUjZ9Mffhg5aZGTyDkFWCDYgT4e07o_IdCwae6qE7WZfpXtGRFESEXR3SxZmXE05o4AAQIFBggJCg4PEBESEw" } }
This section is non-normative.
This section contains the substantive changes that have been made to this specification over time.
Changes since the First Candidate Recommendation:
- The created proof option is not required, and additional proof options are included in the generated proof.
Changes since the First Public Working Draft:
- Use of the secretKeyMultibase representation.
This section is non-normative.
Work on this specification has been supported by the Rebooting the Web of Trust community facilitated by Christopher Allen, Shannon Appelcline, Kiara Robles, Brian Weller, Betty Dhamers, Kaliya Young, Manu Sporny, Drummond Reed, Joe Andrieu, Heather Vescent, Kim Hamilton Duffy, Samantha Chase, Andrew Hughes, Erica Connell, Shigeya Suzuki, and Zaïda Rivai. The participants in the Internet Identity Workshop, facilitated by Phil Windley, Kaliya Young, Doc Searls, and Heidi Nobantu Saul, also supported the refinement of this work through numerous working sessions designed to educate about, debate on, and improve this specification.
The Working Group also thanks our Chair, Brent Zundel, our ex-Chair Kristina Yasuda, as well as our W3C Staff Contact, Ivan Herman, for their expert management and steady guidance of the group through the W3C standardization process.
Portions of the work on this specification have been funded by the United States Department of Homeland Security's Science and Technology Directorate under contracts 70RSAT20T00000029, 70RSAT21T00000016, 70RSAT23T00000005, 70RSAT20T00000010/P00001, 70RSAT20T00000029, 70RSAT21T00000016/P00001, 70RSAT23T00000005, 70RSAT23C00000030, 70RSAT23R00000006, 70RSAT24T00000011, and the National Science Foundation through NSF 22-572. The content of this specification does not necessarily reflect the position or the policy of the U.S. Government and no official endorsement should be inferred.
The Working Group would like to thank the following individuals for reviewing and providing feedback on the specification (in alphabetical order):
Will Abramson, Mahmoud Alkhraishi, Christopher Allen, Joe Andrieu, Bohdan Andriyiv, Anthony, George Aristy, Hadley Beeman, Greg Bernstein, Bob420, Sarven Capadisli, Melvin Carvalho, David Chadwick, Matt Collier, Gabe Cohen, Sebastian Crane, Kyle Den Hartog, Veikko Eeva, Eric Elliott, Raphael Flechtner, Julien Fraichot, Benjamin Goering, Kim Hamilton Duffy, Joseph Heenan, Helge, Ivan Herman, Michael Herman, Anil John, Andrew Jones, Michael B. Jones, Rieks Joosten, Gregory K, Gregg Kellogg, Filip Kolarik, David I. Lehn, Charles E. Lehner, Christine Lemmer-Webber, Eric Lim, Dave Longley, Tobias Looker, Jer Miller, nightpool, Luis Osta, Nate Otto, George J. Padayatti, Addison Phillips, Mike Prorock, Brian Richter, Anders Rundgren, Eugeniu Rusu, Markus Sabadello, silverpill, Wesley Smith, Manu Sporny, Patrick St-Louis, Orie Steele, Henry Story, Oliver Terbu, Ted Thibodeau Jr, John Toohey, Bert Van Nuffelen, Mike Varley, Snorre Lothar von Gohren Edwin, Jeffrey Yasskin, Kristina Yasuda, Benjamin Young, Dmitri Zagidulin, and Brent Zundel.