This specification describes Data Integrity cryptographic suites for use when creating or verifying a digital signature using the Ed25519 instantiation of the Edwards-Curve Digital Signature Algorithm (EdDSA).
The Working Group is actively seeking implementation feedback for this specification. In order to exit the Candidate Recommendation phase, the Working Group has set the requirement of at least two independent implementations for each mandatory feature in the specification. For details on the conformance testing process, see the test suites listed in the implementation report.
Any feature with less than two independent implementations in the EdDSA Cryptosuite Implementation Report is an "at risk" feature and might be removed before the transition to W3C Proposed Recommendation.
This specification defines a cryptographic suite for the purpose of creating and verifying proofs for Ed25519 EdDSA signatures in conformance with the Data Integrity [[VC-DATA-INTEGRITY]] specification. The approach is accepted by the U.S. National Institute of Standards and Technology in the latest [[FIPS-186-5]] publication and meets U.S. Federal Information Processing requirements when using cryptography to secure digital information.
The suites described in this specification use the RDF Dataset Canonicalization Algorithm [[RDF-CANON]] or the JSON Canonicalization Scheme [[RFC8785]] to transform an input document into its canonical form. The canonical representation is then hashed and signed with a detached signature algorithm.
Terminology used throughout this document is defined in the Terminology section of the [[[VC-DATA-INTEGRITY]]] specification.
A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections [[[#data-model]]] and [[[#algorithms]]] of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.
This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (`//`) explaining certain portions and ellipses (`...`) indicating the omission of information that is irrelevant to the example. These parts would have to be removed in order to treat the examples as valid JSON or JSON-LD.
The following sections outline the data model that is used by this specification to express verification methods, such as cryptographic public keys, and data integrity proofs, such as digital signatures.
This cryptographic suite is used to verify Data Integrity Proofs [[VC-DATA-INTEGRITY]] produced using Edwards Curve cryptographic key material. The encoding formats for those key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used for the processing of digital signatures.
The Multikey format, defined in [[[controller-document]]], is used to express public keys for the cryptographic suites defined in this specification.
The `publicKeyMultibase` value of the verification method MUST start with the base-58-btc prefix (`z`), as defined in the Multibase section of [[[controller-document]]]. A Multibase-encoded Ed25519 256-bit public key value follows, as defined in the Multikey section of [[[controller-document]]]. Any other encoding MUST NOT be allowed.
Developers are advised to take care not to accidentally publish a representation of a private key. Implementations of this specification will raise errors if they encounter a Multikey prefix value other than `0xed01` in a `publicKeyMultibase` value.
{ "id": "https://example.com/issuer/123#key-0", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }
{ "@context": [ "https://www.w3.org/ns/did/v1", "https://w3id.org/security/multikey/v1" ], "id": "did:example:123", "verificationMethod": [{ "id": "did:example:123#key-0", "type": "Multikey", "controller": "did:example:123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }], "authentication": [ "did:example:123#key-0" ], "assertionMethod": [ "did:example:123#key-0" ], "capabilityDelegation": [ "did:example:123#key-0" ], "capabilityInvocation": [ "did:example:123#key-0" ] }
The `secretKeyMultibase` value of the verification method MUST start with the base-58-btc prefix (`z`), as defined in the Multibase section of [[[controller-document]]]. A Multibase-encoded Ed25519 256-bit secret key value follows, as defined in the Multikey section of [[[controller-document]]]. Any other encoding MUST NOT be allowed.
Developers are advised to prevent accidental publication of a representation of a secret key, and to not export the `secretKeyMultibase` property by default, when serializing key pairs to Multikey.
This section details the proof representation formats that are defined by this specification.
A proof contains the attributes specified in the Proofs section of [[VC-DATA-INTEGRITY]] with the following restrictions.
The `type` property MUST be `DataIntegrityProof`.
The `cryptosuite` property of the proof MUST be `eddsa-rdfc-2022` or `eddsa-jcs-2022`.
The `proofValue` property of the proof MUST be a detached EdDSA signature produced according to [[RFC8032]], encoded using the base-58-btc header and alphabet as described in the Multibase section of [[[controller-document]]].
{ "@context": [ {"myWebsite": "https://vocabulary.example/myWebsite"}, "https://www.w3.org/ns/credentials/v2" ], "myWebsite": "https://hello.world.example/", "proof": { "type": "DataIntegrityProof", "cryptosuite": "eddsa-rdfc-2022", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#z6MkrJVnaZkeFzdQyMZu1 cgjg7k1pZZ6pvBQ7XJPt4swbTQ2", "proofPurpose": "assertionMethod", "proofValue": "z5C5b1uzYJN6pDR3aWgAqUMoSB1JY29epA74qyjaie9qh4okm9DZP6y77eTNq 5NfYyMwNu9bpQQWUHKH5zAmEtszK" } }
The following section describes multiple Data Integrity cryptographic suites that use the Edwards-Curve Digital Signature Algorithm.
This algorithm is used to configure a cryptographic suite to be used by the Add Proof and Verify Proof functions in [[[VC-DATA-INTEGRITY]]]. The algorithm takes an options object ([=map=] |options|) as input and returns a [=data integrity cryptographic suite instance|cryptosuite instance=] ([=struct=] |cryptosuite|).
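The shape of such a cryptosuite instance can be illustrated in TypeScript as follows. The exact structure of the instance is defined by [[VC-DATA-INTEGRITY]]; the interface and factory function below are only an assumed sketch.

```typescript
// Illustrative shape of a cryptosuite instance used by Add Proof and Verify Proof.
interface CryptosuiteInstance {
  createProof(unsecuredDocument: object, options: object): Promise<object>;
  verifyProof(securedDocument: object): Promise<{ verified: boolean; verifiedDocument: object | null }>;
}

// Hypothetical factory that configures an instance from an options map.
function createCryptosuite(options: { cryptosuite: string }): CryptosuiteInstance {
  if (options.cryptosuite !== 'eddsa-rdfc-2022' && options.cryptosuite !== 'eddsa-jcs-2022') {
    throw new Error(`Unsupported cryptosuite: ${options.cryptosuite}`);
  }
  return {
    createProof: async () => { throw new Error('not implemented in this sketch'); },
    verifyProof: async () => { throw new Error('not implemented in this sketch'); },
  };
}
```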
The `eddsa-rdfc-2022` cryptographic suite takes an input document, canonicalizes the document using the RDF Dataset Canonicalization algorithm [[RDF-CANON]], and then cryptographically hashes and signs the output resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
When the RDF Dataset Canonicalization Algorithm [[RDF-CANON]] is used, implementations will detect dataset poisoning by default, and abort processing upon such detection.
The following algorithm specifies how to create a [=data integrity proof=] given an unsecured data document. Required inputs are an unsecured data document ([=map=] |unsecuredDocument|), and a set of proof options ([=map=] |options|). A [=data integrity proof=] ([=map=]), or an error, is produced as output.
The following algorithm specifies how to verify a [=data integrity proof=] given a secured data document. Required inputs are a secured data document ([=map=] |securedDocument|). This algorithm returns a verification result, which is a [=struct=] whose [=struct/items=] are:
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section [[[#hashing-eddsa-rdfc-2022]]].
Required inputs to this algorithm are an unsecured data document (`unsecuredDocument`) and transformation options (`options`). The transformation options MUST contain a type identifier for the cryptographic suite (`type`) and a cryptosuite identifier (`cryptosuite`). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
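A minimal sketch of this transformation, assuming the jsonld.js (`jsonld`) library for RDF Dataset Canonicalization, is shown below. The algorithm identifier accepted by a given release of that library may be 'URDNA2015' or 'RDFC-1.0' (the two produce the same canonical output), and the error string is illustrative.

```typescript
import * as jsonld from 'jsonld';

// Transform an unsecured document into canonical N-Quads for eddsa-rdfc-2022.
async function transformRdfc2022(
  unsecuredDocument: object,
  options: { type: string; cryptosuite: string }
): Promise<string> {
  if (options.type !== 'DataIntegrityProof' || options.cryptosuite !== 'eddsa-rdfc-2022') {
    throw new Error('Transformation options do not match this cryptosuite.');
  }
  // RDF Dataset Canonicalization; the result is a canonical N-Quads string (UTF-8).
  return jsonld.canonize(unsecuredDocument, {
    algorithm: 'URDNA2015',
    format: 'application/n-quads'
  });
}
```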
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section [[[#proof-serialization-eddsa-rdfc-2022]]] or Section [[[#proof-verification-eddsa-rdfc-2022]]].
The required inputs to this algorithm are a transformed data document (`transformedDocument`) and canonical proof configuration (`canonicalProofConfig`). A single hash data value represented as a series of bytes is produced as output.
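A sketch of the hashing step, using Node.js's built-in `crypto` module, is shown below: the hash data is the SHA-256 digest of the canonical proof configuration followed by the SHA-256 digest of the transformed document.

```typescript
import { createHash } from 'node:crypto';

// hashData = SHA-256(canonical proof configuration) || SHA-256(transformed document)
function hashRdfc2022(transformedDocument: string, canonicalProofConfig: string): Uint8Array {
  const proofConfigHash = createHash('sha256').update(canonicalProofConfig, 'utf8').digest();
  const transformedDocumentHash = createHash('sha256').update(transformedDocument, 'utf8').digest();
  return Buffer.concat([proofConfigHash, transformedDocumentHash]); // 64 bytes
}
```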
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are the document (|unsecuredDocument|) and the proof options (`options`). The proof options MUST contain a type identifier for the cryptographic suite (`type`) and MUST contain a cryptosuite identifier (`cryptosuite`). A proof configuration object is produced as output.
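A sketch of proof configuration generation is shown below. Checking the `created` timestamp format and the final RDF Dataset Canonicalization of the configuration are elided; the function name and error string are illustrative.

```typescript
// Build a proof configuration from the proof options and the unsecured document.
function proofConfigurationRdfc2022(
  unsecuredDocument: { '@context'?: unknown },
  options: { type: string; cryptosuite: string; [key: string]: unknown }
): Record<string, unknown> {
  const proofConfig: Record<string, unknown> = { ...options };
  if (proofConfig.type !== 'DataIntegrityProof' || proofConfig.cryptosuite !== 'eddsa-rdfc-2022') {
    throw new Error('Proof options do not match this cryptosuite.');
  }
  // The proof configuration is interpreted with the same JSON-LD context as the document.
  if (unsecuredDocument['@context'] !== undefined) {
    proofConfig['@context'] = unsecuredDocument['@context'];
  }
  return proofConfig; // canonicalize with RDF Dataset Canonicalization before hashing
}
```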
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (`hashData`) and proof options (`options`). The proof options MUST contain a type identifier for the cryptographic suite (`type`) and MAY contain a cryptosuite identifier (`cryptosuite`). A single digital proof value represented as a series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (`hashData`), a digital signature (`proofBytes`) and proof options (`options`). A verification result represented as a boolean value is produced as output.
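The signing and verification steps can be sketched with Node.js's built-in Ed25519 support, as below; key management and Multibase encoding of the resulting signature are handled elsewhere.

```typescript
import { generateKeyPairSync, sign, verify, KeyObject } from 'node:crypto';

// Proof serialization: a detached Ed25519 signature over the hash data.
function serializeProofValue(hashData: Uint8Array, privateKey: KeyObject): Uint8Array {
  return sign(null, hashData, privateKey); // 64-byte EdDSA signature per RFC 8032
}

// Proof verification: check the signature bytes against the hash data and public key.
function verifyProofValue(hashData: Uint8Array, proofBytes: Uint8Array, publicKey: KeyObject): boolean {
  return verify(null, hashData, publicKey, proofBytes);
}

// Usage with a freshly generated key pair, for illustration only.
const { publicKey, privateKey } = generateKeyPairSync('ed25519');
const hashData = new Uint8Array(64); // placeholder 64-byte hash data
const proofBytes = serializeProofValue(hashData, privateKey);
console.log(verifyProofValue(hashData, proofBytes, publicKey)); // true
```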
The `eddsa-jcs-2022` cryptographic suite takes an input document, canonicalizes the document using the JSON Canonicalization Scheme [[RFC8785]], and then cryptographically hashes and signs the output resulting in the production of a data integrity proof.
The following algorithm specifies how to create a [=data integrity proof=] given an unsecured data document. Required inputs are an unsecured data document ([=map=] |unsecuredDocument|), and a set of proof options ([=map=] |options|). A [=data integrity proof=] ([=map=]), or an error, is produced as output.
The following algorithm specifies how to verify a [=data integrity proof=] given a secured data document. Required inputs are a secured data document ([=map=] |securedDocument|). This algorithm returns a [=verification result=], which is a [=struct=] whose [=struct/items=] are:
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section [[[#hashing-eddsa-jcs-2022]]].
Required inputs to this algorithm are an unsecured data document (`unsecuredDocument`) and transformation options (`options`). The transformation options MUST contain a type identifier for the cryptographic suite (`type`) and a cryptosuite identifier (`cryptosuite`). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
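A sketch of the JCS transformation, assuming the `canonicalize` npm package (an implementation of RFC 8785), is shown below; the function name and error strings are illustrative.

```typescript
import canonicalize from 'canonicalize';

// Transform an unsecured document into its JCS (RFC 8785) canonical form for eddsa-jcs-2022.
function transformJcs2022(
  unsecuredDocument: object,
  options: { type: string; cryptosuite: string }
): string {
  if (options.type !== 'DataIntegrityProof' || options.cryptosuite !== 'eddsa-jcs-2022') {
    throw new Error('Transformation options do not match this cryptosuite.');
  }
  const canonicalDocument = canonicalize(unsecuredDocument);
  if (canonicalDocument === undefined) {
    throw new Error('The document could not be canonicalized.');
  }
  return canonicalDocument; // encode as UTF-8 before hashing
}
```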
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section [[[#proof-serialization-eddsa-jcs-2022]]] or Section [[[#proof-verification-eddsa-jcs-2022]]].
The required inputs to this algorithm are a transformed data document (`transformedDocument`) and canonical proof configuration (`canonicalProofConfig`). A single hash data value represented as a series of bytes is produced as output.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (`options`). The proof options MUST contain a type identifier for the cryptographic suite (`type`) and MUST contain a cryptosuite identifier (`cryptosuite`). A proof configuration object is produced as output.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (`hashData`) and proof options (`options`). The proof options MUST contain a type identifier for the cryptographic suite (`type`) and MAY contain a cryptosuite identifier (`cryptosuite`). A single digital proof value represented as a series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (`hashData`), a digital signature (`proofBytes`) and proof options (`options`). A verification result represented as a boolean value is produced as output.
Before reading this section, readers are urged to familiarize themselves with general security advice provided in the Security Considerations section of the Data Integrity specification.
The following section describes security considerations that developers implementing this specification should be aware of in order to create secure software.
Ed25519 signatures (EdDSA algorithm with edwards25519 curve) have been widely adopted, due both to the compact size of the keys and signatures and to the speed at which signatures can be produced and verified. Many libraries exist that can create and verify Ed25519 signatures. Since the publication of [[RFC8032]], security properties of Ed25519 signatures have been rigorously proven (see [[Provable_Ed25519]] and [[Taming_EdDSAs]]). However, it has been observed that a significant number of libraries do not achieve these security levels, due to missing input validity checks during the signature verification process. In this section, we summarize the security levels achievable with Ed25519 signatures, and indicate how one can determine whether a library will support those levels.
Digital signatures can exhibit a number of desirable cryptographic properties [[Taming_EdDSAs]]; among these are:
EUF-CMA (existential unforgeability under chosen message attacks) is usually the minimal security property required of a signature scheme. It guarantees that any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice (in an adaptive manner), {(m_i, σ_i)}, cannot output a valid signature σ' for a new message m' not among the m_i (except with negligible probability). If the attacker outputs a valid signature on a new message, (m', σ'), it is called an existential forgery.
SUF-CMA (strong unforgeability under chosen message attacks) is a stronger notion than EUF-CMA. It guarantees that for any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice, {(m_i, σ_i)}, it cannot output a new valid signature pair (m', σ'), such that (m', σ') is not among the pairs (m_i, σ_i) (except with negligible probability). Strong unforgeability implies that an adversary not only cannot sign new messages, but also cannot find a new signature on an old message. See [[Provable_Ed25519]] for a real-world attack that would have been circumvented with SUF-CMA security over EUF-CMA security.
Binding signature (BS). We say that a signature scheme is binding if no efficient signer can output a tuple (pk, m, m', σ), where both (m, σ) and (m', σ) are valid message-signature pairs under the public key pk and m ≠ m' (except with negligible probability). A binding signature makes it impossible for the signer to claim later that it has signed a different message; the signature binds the signer to the message.
Strongly Binding signature (SBS). Certain applications may require a signature to not only be binding to the message but also be binding to the public key. We say that a signature scheme is strongly binding if no efficient signer can output a tuple (pk, pk', m, m', σ), where (m, σ) is a valid signature for the public key pk, (m', σ) is a valid signature for the public key pk', and either pk ≠ pk' or m ≠ m', or both (except with negligible probability). See [[Provable_Ed25519]] for real-world attacks that would have been circumvented with the SBS property.
Note that the BS and SBS properties are forms of non-repudiation.
As pointed out in [[Taming_EdDSAs]], flaws in Ed25519 libraries primarily occur on the signature verification side, where edge cases are sometimes not properly checked. An Ed25519 signature library that is in conformance with [[RFC8032]] or [[FIPS-186-5]], i.e., one that performs all specified validation checks, will have the SUF-CMA property in addition to EUF-CMA.
Reference [[Taming_EdDSAs]] achieves the BS and SBS properties along with SUF-CMA in their "signature verification algorithm 2" where an additional check is performed against the public key A to make sure that it is not one of eight "small order points". These additional checks incur minimal processing overhead.
Reference [[Taming_EdDSAs]] included a set of twelve test vectors to test various Ed25519 libraries available at the time of publication. They found that a significant portion missed edge cases and hence did not achieve SUF-CMA (just EUF-CMA), and only two libraries out of sixteen achieved all the security properties. Since the time of publication, more Ed25519 libraries have been created, and some of the libraries have been updated to include all verification checks. Implementers are recommended to test the Ed25519 library they are using against the test vectors of [[Taming_EdDSAs]].
Before reading this section, readers are urged to familiarize themselves with general privacy advice provided in the Privacy Considerations section of the Data Integrity specification.
The following section describes privacy considerations that developers implementing this specification should be aware of in order to avoid violating privacy assumptions.
The cryptographic suites described in this specification do not support [=selective disclosure=] or [=unlinkable disclosure=]. If [=selective disclosure=] is a desired feature, readers might find the [[[?VC-DI-ECDSA]]] specification useful. If [=unlinkable disclosure=] is of interest, the [[[?VC-DI-BBS]]] specification provides an unlinkable digital signature mechanism.
`Ed25519Signature2020` is an earlier version of a cryptographic suite for use of the EdDSA algorithm and Curve25519. While it has been used in production systems, new implementations should instead use `eddsa-rdfc-2022`. `Ed25519Signature2020` has been kept in this specification to provide a stable reference.
The key format described in this section is provided to document a legacy mechanism that has been deployed to production. The key format described in section [[[#multikey]]] supersedes the one described in this section. New applications are strongly urged to use the newer key format.
The `type` of the verification method MUST be `Ed25519VerificationKey2020`.
The `controller` of the verification method MUST be a URL.
The `publicKeyMultibase` value of the verification method MUST start with the base-58-btc prefix (`z`), as defined in the Multibase section of [[VC-DATA-INTEGRITY]]. A Multibase-encoded Multikey value follows, which MUST consist of a binary value that starts with the two-byte prefix `0xed01`, which is the Multikey header for an Ed25519 public key, followed by the 32-byte public key data, all of which is then encoded using base-58-btc. Any other encoding MUST NOT be allowed.
Developers are advised to take care not to accidentally publish a representation of a private key. Implementations of this specification will raise errors if they encounter a Multikey header value other than `0xed01` in a `publicKeyMultibase` value.
{ "id": "https://example.com/issuer/123#key-0", "type": "Ed25519VerificationKey2020", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }
{ "@context": [ "https://www.w3.org/ns/did/v1", "https://w3id.org/security/suites/ed25519-2020/v1" ], "id": "did:example:123", "verificationMethod": [{ "id": "did:example:123#key-0", "type": "Ed25519VerificationKey2020", "controller": "did:example:123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }], "authentication": [ "did:example:123#key-0" ], "assertionMethod": [ "did:example:123#key-0" ], "capabilityDelegation": [ "did:example:123#key-0" ], "capabilityInvocation": [ "did:example:123#key-0" ] }
The proof format described in this section is provided to document a legacy mechanism that has been deployed to production. The `DataIntegrityProof` formats described in section [[[#dataintegrityproof]]] supersede the one described in this section. New applications are strongly urged to use the newer proof format.
The `verificationMethod` property of the proof MUST be a URL. Dereferencing the `verificationMethod` MUST result in an object containing a `type` property with the value set to `Ed25519VerificationKey2020`.
The `type` property of the proof MUST be `Ed25519Signature2020`.
The `created` property of the proof MUST be an [[XMLSCHEMA11-2]] formatted date string.
The `proofPurpose` property of the proof MUST be a string, and MUST match the verification relationship expressed by the verification method `controller`.
The `proofValue` property of the proof MUST be a detached EdDSA signature produced according to [[RFC8032]], encoded using the base-58-btc header and alphabet as described in the Multibase section of [[VC-DATA-INTEGRITY]].
{ "@context": [ {"myWebsite": "https://vocabulary.example/myWebsite"}, "https://w3id.org/security/suites/ed25519-2020/v1" ], "myWebsite": "https://hello.world.example/", "proof": { "type": "Ed25519Signature2020", "created": "2020-11-05T19:23:24Z", "verificationMethod": "https://di.example/issuer#z6MkjLrk3gKS2nnkeWcmcxiZPGskmesDpuwRBorgHxUXfxnG", "proofPurpose": "assertionMethod", "proofValue": "z4oey5q2M3XKaxup3tmzN4DRFTLVqpLMweBrSxMY2xHX5XTYVQeVbY8nQAVHMrXFkXJpmEcqdoDwLWxaqA3Q1geV6" } }
The `Ed25519Signature2020` cryptographic suite takes an input document, canonicalizes the document using the RDF Dataset Canonicalization algorithm [[RDF-CANON]], and then cryptographically hashes and signs the output resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
To generate a proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [[VC-DATA-INTEGRITY]] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section [[[#transformation-ed25519signature2020]]], the hashing algorithm is defined in Section [[[#hashing-ed25519signature2020]]], and the proof serialization algorithm is defined in Section [[[#proof-serialization-ed25519signature2020]]].
To verify a proof, the algorithm in Section 4.2: Verify Proof in the Data Integrity [[VC-DATA-INTEGRITY]] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section [[[#transformation-ed25519signature2020]]], the hashing algorithm is defined in Section [[[#hashing-ed25519signature2020]]], and the proof verification algorithm is defined in Section [[[#proof-verification-ed25519signature2020]]].
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section [[[#hashing-ed25519signature2020]]].
Required inputs to this algorithm are an unsecured data document (`unsecuredDocument`) and transformation options (`options`). The transformation options MUST contain a type identifier for the cryptographic suite (`type`) and a cryptosuite identifier (`cryptosuite`). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section [[[#proof-serialization-ed25519signature2020]]] or Section [[[#proof-verification-ed25519signature2020]]].
The required inputs to this algorithm are a transformed data document (`transformedDocument`) and proof configuration (`proofConfig`). The proof configuration MUST contain a type identifier for the cryptographic suite (`type`) and MAY contain a cryptosuite identifier (`cryptosuite`). A single hash data value represented as a series of bytes is produced as output.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (`options`). The proof options MUST contain a type identifier for the cryptographic suite (`type`) and MAY contain a cryptosuite identifier (`cryptosuite`). A proof configuration object is produced as output.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (`hashData`) and proof options (`options`). The proof options MUST contain a type identifier for the cryptographic suite (`type`) and MAY contain a cryptosuite identifier (`cryptosuite`). A single digital proof value represented as a series of bytes is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (`hashData`), a digital signature (`proofBytes`) and proof options (`options`). A verification result represented as a boolean value is produced as output.
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key and the representation of the private key are shown below.
{
  publicKeyMultibase: "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2",
  secretKeyMultibase: "z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the Ed25519 signature, and then base58-btc encode the signature.
Assemble the signed credential with the following two steps:
1. Add the `proofValue` field with the previously computed base58-btc value to the proof options document.
2. Set the `proof` field of the credential to the augmented proof options document.
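The two assembly steps can be sketched as follows, assuming the `bs58` npm package for base-58-btc encoding; the function name is illustrative.

```typescript
import bs58 from 'bs58';

// Attach the computed signature to the proof options and the proof to the credential.
function assembleSignedCredential(
  credential: Record<string, unknown>,
  proofOptions: Record<string, unknown>,
  signatureBytes: Uint8Array
): Record<string, unknown> {
  // Step 1: add the base58-btc encoded proofValue to the proof options document.
  const proof = { ...proofOptions, proofValue: 'z' + bs58.encode(signatureBytes) };
  // Step 2: set the proof field of the credential to the augmented proof options document.
  return { ...credential, proof };
}
```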
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key are shown below.
{
  publicKeyMultibase: "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2",
  secretKeyMultibase: "z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the Ed25519 signature, and then base58-btc encode the signature.
Assemble the signed credential with the following two steps:
1. Add the `proofValue` field with the previously computed base58-btc value to the proof options document.
2. Set the `proof` field of the credential to the augmented proof options document.
The signer needs to generate a private/public key pair with the private key used for signing and the public key made available for verification. The representation of the public key, and the representation of the private key, are shown below.
{
  publicKeyMultibase: "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2",
  secretKeyMultibase: "z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the Ed25519 signature, and then base58-btc encode the signature.
Assemble the signed credential with the following two steps:
1. Add the `proofValue` field with the previously computed base58-btc value to the proof options document.
2. Set the `proof` field of the credential to the augmented proof options document.
Proof sets and chains are defined in [[VC-DATA-INTEGRITY]]. We provide test vectors showing the creation of proof sets and chains with the `eddsa-rdfc-2022` cryptosuite. Multiple signers can be involved in the generation of proof sets and chains, so multiple public/private key pairs are needed; these are shown below.
The original unsigned credential is shown below:
To demonstrate creating a proof set, we start with a document containing a single proof and add another proof to it. The starting document is shown below and contains a proof signed with `keyPair1`.
The `options` input to Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]] is shown below. Note that it does not include a `previousProof` attribute since we are constructing a proof set and not a chain. In addition, we will be using `keyPair2` for signing.
Per the algorithm of Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]], we create an array variable, `allProofs`, and add the proof from the starting document to it. Since there is no `previousProof` attribute, no modification of `unsignedDocument` is needed prior to computing the signed proof in step 6 of Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]]. The signed proof configuration is shown below.
The signed proof `options` above gets appended to the `allProofs` variable, which then gets set as the `proof` attribute of the unsigned document to produce the final signed document as shown below.
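A sketch of this proof-set assembly is shown below; `newProof` stands in for the proof created with `keyPair2`, and the function name is illustrative.

```typescript
// Append a newly created proof to the existing proof(s) of a document, forming a proof set.
function addProofToSet(
  securedDocument: { proof: object | object[]; [key: string]: unknown },
  newProof: object
): Record<string, unknown> {
  const existing = securedDocument.proof;
  const allProofs = Array.isArray(existing) ? [...existing] : [existing];
  allProofs.push(newProof);
  return { ...securedDocument, proof: allProofs };
}
```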
This collection of test vectors demonstrates the construction of a proof chain. We start with a document containing a proof set, i.e., our previous example, and then add a new proof to the credential that has a dependency on the existing proofs. This example also demonstrates the case where the `previousProof` attribute is an array. This example uses `keyPair3`, and the starting document is given below.
The `options` input to Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]] is shown below. Note that it includes a `previousProof` attribute since we are constructing a proof chain.
Per the algorithm of Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]], we create an array variable, `allProofs`, and add the proofs from the starting document to it. Since the options contains the `previousProof` attribute, we compute the `matchingProofs` variable per step 4 of Section 4.4: Add Proof Set/Chain, and we set the `unsecuredDocument.proof` equal to the `matchingProofs`. This produces the document shown below.
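The selection of matching proofs can be sketched as follows; the interface, function name, and error string are illustrative.

```typescript
interface Proof { id?: string; previousProof?: string | string[]; [key: string]: unknown; }

// Select the proofs referenced by previousProof; these become unsecuredDocument.proof
// before the new proof is computed.
function selectMatchingProofs(allProofs: Proof[], previousProof: string | string[]): Proof[] {
  const wanted = Array.isArray(previousProof) ? previousProof : [previousProof];
  const matching = allProofs.filter(p => p.id !== undefined && wanted.includes(p.id));
  if (matching.length !== wanted.length) {
    throw new Error('A referenced previous proof was not found in the proof set.');
  }
  return matching;
}
```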
In step 6, we use the previous document (unsecured document with previous proofs added to it) to compute the `proofValue` attribute. This gives the signed configuration options (proof) shown below:
The signed proof `options` above gets appended to the `allProofs` variable, which then gets set as the `proof` attribute of the unsigned document to produce the final signed document as shown below.
This collection of test vectors demonstrates construction of an extended proof chain. We start with the output of the previous section and add an additional proof that is dependent on one of the existing proofs. This example uses `keyPair4`, and the starting document is given below.
The `options` input to Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]] is shown below. Note that it includes a `previousProof` attribute since we are constructing a proof chain; however, this time it is a single value.
Per the algorithm of Section 4.4: Add Proof Set/Chain in [[VC-DATA-INTEGRITY]], we create an array variable, `allProofs`, and add the proofs from the starting document to it. Since the `options` contains the `previousProof` attribute, we compute the `matchingProofs` variable per step 4 of Section 4.4: Add Proof Set/Chain, and we set the `unsecuredDocument.proof` equal to the `matchingProofs`. This produces the document shown below.
In step 6, we use the previous document (unsecured document with previous proofs added to it) to compute the `proofValue` attribute. This gives the signed configuration options (proof) shown below:
The signed proof `options` above gets appended to the `allProofs` variable, which then gets set as the `proof` attribute of the unsigned document to produce the final signed document as shown below.
This section contains the substantive changes that have been made to this specification over time.
Changes since the First Candidate Recommendation:
Changes since the First Public Working Draft:
Work on this specification has been supported by the Rebooting the Web of Trust community facilitated by Christopher Allen, Shannon Appelcline, Kiara Robles, Brian Weller, Betty Dhamers, Kaliya Young, Manu Sporny, Drummond Reed, Joe Andrieu, Heather Vescent, Kim Hamilton Duffy, Samantha Chase, Andrew Hughes, Erica Connell, Shigeya Suzuki, and Zaïda Rivai. The participants in the Internet Identity Workshop, facilitated by Phil Windley, Kaliya Young, Doc Searls, and Heidi Nobantu Saul, also supported the refinement of this work through numerous working sessions designed to educate about, debate on, and improve this specification.
The Working Group also thanks our Chair, Brent Zundel, our ex-Chair Kristina Yasuda, as well as our W3C Staff Contact, Ivan Herman, for their expert management and steady guidance of the group through the W3C standardization process.
Portions of the work on this specification have been funded by the United States Department of Homeland Security's Science and Technology Directorate under contracts 70RSAT20T00000029, 70RSAT21T00000016, 70RSAT23T00000005, 70RSAT20T00000010/P00001, 70RSAT20T00000029, 70RSAT21T00000016/P00001, 70RSAT23T00000005, 70RSAT23C00000030, 70RSAT23R00000006, and the National Science Foundation through NSF 22-572. The content of this specification does not necessarily reflect the position or the policy of the U.S. Government and no official endorsement should be inferred.
The Working Group would like to thank the following individuals for reviewing and providing feedback on the specification (in alphabetical order):
Will Abramson, Mahmoud Alkhraishi, Christopher Allen, Joe Andrieu, Bohdan Andriyiv, Anthony, George Aristy, Hadley Beeman, Greg Bernstein, Bob420, Sarven Capadisli, Melvin Carvalho, David Chadwick, Matt Collier, Gabe Cohen, Sebastian Crane, Kyle Den Hartog, Veikko Eeva, Eric Elliott, Raphael Flechtner, Julien Fraichot, Benjamin Goering, Kim Hamilton Duffy, Joseph Heenan, Helge, Ivan Herman, Michael Herman, Anil John, Andrew Jones, Michael B. Jones, Rieks Joosten, Gregory K, Gregg Kellogg, Filip Kolarik, David I. Lehn, Charles E. Lehner, Christine Lemmer-Webber, Eric Lim, Dave Longley, Tobias Looker, Jer Miller, nightpool, Luis Osta, Nate Otto, George J. Padayatti, Addison Phillips, Mike Prorock, Brian Richter, Anders Rundgren, Eugeniu Rusu, Markus Sabadello, silverpill, Wesley Smith, Manu Sporny, Patrick St-Louis, Orie Steele, Henry Story, Oliver Terbu, Ted Thibodeau Jr, John Toohey, Bert Van Nuffelen, Mike Varley, Snorre Lothar von Gohren Edwin, Jeffrey Yasskin, Kristina Yasuda, Benjamin Young, Dmitri Zagidulin, and Brent Zundel.