This specification describes a Data Integrity cryptographic suite for use when creating or verifying a digital signature using the twisted Edwards Curve Digital Signature Algorithm (EdDSA) and Curve25519 (ed25519).
This is an experimental specification and is undergoing regular revisions. It is not fit for production deployment.
This specification defines a cryptographic suite for the purpose of creating and verifying proofs for Ed25519 EdDSA signatures in conformance with the Data Integrity [[VC-DATA-INTEGRITY]] specification. The approach is accepted by the U.S. National Institute of Standards and Technology (NIST) in the latest FIPS 186-5 publication and meets U.S. Federal Information Processing requirements when using cryptography to secure digital information.
The suites described in this specification use the RDF Dataset Normalization Algorithm [[RDF-CANON]] or the JSON Canonicalization Scheme [[RFC8785]] to transform an input document into its canonical form. The canonical representation is then hashed and signed with a detached signature algorithm.
A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in the Data Model and Algorithms sections of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.
This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (`//`) explaining certain portions and ellipses (`...`) indicating the omission of information that is irrelevant to the example. These parts would have to be removed in order to treat the examples as valid JSON or JSON-LD.
The following sections outline the data model that is used by this specification to express verification methods, such as cryptographic public keys, and data integrity proofs, such as digital signatures.
This cryptographic suite is used to verify Data Integrity Proofs [[VC-DATA-INTEGRITY]] produced using Edwards Curve cryptographic key material. The encoding formats for those key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used for the processing of digital signatures.
The Multikey format, defined in [[VC-DATA-INTEGRITY]], is used to express public keys for the cryptographic suites defined in this specification.
The `publicKeyMultibase` property represents a Multibase-encoded Multikey expression of an Ed25519 public key. The value starts with the two-byte Multicodec prefix `0xed01`, followed by the 32-byte Ed25519 public key data. The combined 34-byte value is then encoded using base58-btc and prefixed with the Multibase header `z`.
Developers are advised to take care not to accidentally publish a representation of a private key. Implementations of this specification will raise errors if they encounter a prefix value other than `0xed01` in a `publicKeyMultibase` value.
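The encoding described above can be illustrated with a short, non-normative sketch; it assumes the third-party `base58` Python package and a raw 32-byte public key supplied by the caller:

```python
import base58  # assumes the third-party 'base58' package

ED25519_PUB_PREFIX = bytes([0xed, 0x01])  # Multicodec varint for ed25519-pub

def encode_public_key_multibase(raw_public_key: bytes) -> str:
    """Encode a raw 32-byte Ed25519 public key as a publicKeyMultibase value."""
    if len(raw_public_key) != 32:
        raise ValueError("Ed25519 public keys are exactly 32 bytes")
    prefixed = ED25519_PUB_PREFIX + raw_public_key    # 34 bytes total
    return "z" + base58.b58encode(prefixed).decode()  # 'z' = base58-btc Multibase header
```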
{ "id": "https://example.com/issuer/123#key-0", "type": "Multikey", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }
{ "@context": [ "https://www.w3.org/ns/did/v1", "https://w3id.org/security/data-integrity/v1" ], "id": "did:example:123", "verificationMethod": [{ "id": "did:example:123#key-0", "type": "Multikey", "controller": "did:example:123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }], "authentication": [ "did:example:123#key-0" ], "assertionMethod": [ "did:example:123#key-0" ], "capabilityDelegation": [ "did:example:123#key-0" ], "capabilityInvocation": [ "did:example:123#key-0" ] }
The `secretKeyMultibase` property represents a Multibase-encoded Multikey expression of an Ed25519 secret key (sometimes also referred to as a private key). The value starts with the two-byte prefix `0x8026` (the Multicodec varint expression of `0x1300`), followed by the 32-byte Ed25519 secret key data. The combined 34-byte value is then encoded using base58-btc and prefixed with the Multibase header `z`.
Developers are advised to prevent accidental publication of a representation of a secret key, and to not export the `secretKeyMultibase` property by default, when serializing key pairs to Multikey.
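The corresponding secret key encoding can be sketched as follows, again assuming the third-party `base58` package; the raw 32-byte secret key is supplied by the caller and must never be published:

```python
import base58  # assumes the third-party 'base58' package

ED25519_PRIV_PREFIX = bytes([0x80, 0x26])  # Multicodec varint for ed25519-priv (0x1300)

def encode_secret_key_multibase(raw_secret_key: bytes) -> str:
    """Encode a raw 32-byte Ed25519 secret key as a secretKeyMultibase value."""
    if len(raw_secret_key) != 32:
        raise ValueError("Ed25519 secret keys are exactly 32 bytes")
    return "z" + base58.b58encode(ED25519_PRIV_PREFIX + raw_secret_key).decode()
```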
This suite relies on detached digital signatures represented using [[MULTIBASE]] and [[?MULTICODEC]].
The `verificationMethod` property of the proof MUST be a URL. Dereferencing the `verificationMethod` MUST result in an object containing a `type` property with the value set to `Multikey`.
The `type` property of the proof MUST be `DataIntegrityProof`.
The `cryptosuite` property of the proof MUST be `eddsa-rdfc-2022` or `eddsa-jcs-2022`.
The `created` property of the proof MUST be an [[XMLSCHEMA11-2]]-formatted dateTime string.
The `proofPurpose` property of the proof MUST be a string, and MUST match the verification relationship expressed by the verification method `controller`.
The `proofValue` property of the proof MUST be a detached EdDSA signature value produced according to [[RFC8032]], encoded according to [[MULTIBASE]] using the base58-btc base encoding.
{ "@context": [ {"title": "https://schema.org/title"}, "https://www.w3.org/ns/credentials/v2" ], "title": "Hello world!", "proof": { "type": "DataIntegrityProof", "cryptosuite": "eddsa-rdfc-2022", "created": "2023-02-24T23:36:38Z", "verificationMethod": "https://vc.example/issuers/5678#z6MkrJVnaZkeFzdQyMZu1 cgjg7k1pZZ6pvBQ7XJPt4swbTQ2", "proofPurpose": "assertionMethod", "proofValue": "z5C5b1uzYJN6pDR3aWgAqUMoSB1JY29epA74qyjaie9qh4okm9DZP6y77eTNq 5NfYyMwNu9bpQQWUHKH5zAmEtszK" } }
The following section describes multiple Data Integrity cryptographic suites that utilize the twisted Edwards Curve Digital Signature Algorithm.
The `eddsa-rdfc-2022` cryptographic suite takes an input document, canonicalizes the document using the Universal RDF Dataset Canonicalization Algorithm [[RDF-CANON]], and then cryptographically hashes and signs the output resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
When the RDF Dataset Canonicalization Algorithm [[RDF-CANON]] is used, implementations will detect dataset poisoning by default, and abort processing upon such detection.
To generate a proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [[VC-DATA-INTEGRITY]] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation, hashing, and proof serialization algorithms defined below are used.
To verify a proof, the algorithm in Section 4.2: Verify Proof in the Data Integrity [[VC-DATA-INTEGRITY]] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation, hashing, and proof verification algorithms defined below are used.
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm defined below.
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
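A minimal, non-normative sketch of this transformation, assuming the third-party `pyld` package's URDNA2015 implementation of [[RDF-CANON]]:

```python
from pyld import jsonld  # assumes the third-party 'pyld' package

def transform_eddsa_rdfc_2022(unsecured_document: dict, options: dict) -> str:
    """Canonicalize a JSON-LD input document to canonical N-Quads."""
    if options.get("type") != "DataIntegrityProof" or \
            options.get("cryptosuite") != "eddsa-rdfc-2022":
        raise ValueError("unsupported type or cryptosuite for this transformation")
    # URDNA2015 is the canonicalization algorithm specified by RDF-CANON.
    return jsonld.normalize(
        unsecured_document,
        {"algorithm": "URDNA2015", "format": "application/n-quads"},
    )
```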
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the proof serialization and proof verification algorithms defined below.
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A single hash data value, represented as a series of bytes, is produced as output.
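A sketch of this hashing step, assuming SHA-256 as the message digest; the ordering (hash of the canonical proof configuration followed by the hash of the transformed document) matches the walkthrough in the test vector sections:

```python
import hashlib

def hash_eddsa_rdfc_2022(transformed_document: str, canonical_proof_config: str) -> bytes:
    """Produce hashData: SHA-256(canonical proof config) || SHA-256(transformed document)."""
    proof_config_hash = hashlib.sha256(canonical_proof_config.encode("utf-8")).digest()
    document_hash = hashlib.sha256(transformed_document.encode("utf-8")).digest()
    return proof_config_hash + document_hash  # 64 bytes of hash data
```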
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
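A non-normative sketch of proof configuration generation, again assuming the `pyld` package; the validation checks mirror those named for `eddsa-jcs-2022` below, and the exact error handling is illustrative:

```python
import copy
from datetime import datetime

from pyld import jsonld  # assumes the third-party 'pyld' package

def proof_configuration_eddsa_rdfc_2022(options: dict) -> str:
    """Validate the proof options and return the canonical proof configuration."""
    proof_config = copy.deepcopy(options)
    if proof_config.get("type") != "DataIntegrityProof" or \
            proof_config.get("cryptosuite") != "eddsa-rdfc-2022":
        raise ValueError("INVALID_PROOF_CONFIGURATION")
    if "created" in proof_config:
        # Simplified check for the xsd:dateTime form used in the examples.
        datetime.strptime(proof_config["created"], "%Y-%m-%dT%H:%M:%SZ")
    # The proof options are assumed to carry a JSON-LD @context so the terms canonicalize.
    return jsonld.normalize(
        proof_config, {"algorithm": "URDNA2015", "format": "application/n-quads"}
    )
```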
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value, represented as a series of bytes, is produced as output.
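A sketch of proof serialization, assuming the `cryptography` package's Ed25519 implementation; key management is out of scope and the helper is illustrative:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def serialize_proof(hash_data: bytes, private_key: Ed25519PrivateKey) -> bytes:
    """Sign the hash data, producing the detached EdDSA signature bytes (proofBytes)."""
    return private_key.sign(hash_data)  # 64-byte Ed25519 signature per RFC 8032
```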
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes) and proof options (options). A verification result represented as a boolean value is produced as output.
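A matching verification sketch under the same assumption:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_proof(hash_data: bytes, proof_bytes: bytes, public_key: Ed25519PublicKey) -> bool:
    """Return True if proofBytes is a valid Ed25519 signature over the hash data."""
    try:
        public_key.verify(proof_bytes, hash_data)
        return True
    except InvalidSignature:
        return False
```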
The `eddsa-jcs-2022` cryptographic suite takes an input document, canonicalizes the document using the JSON Canonicalization Scheme [[RFC8785]], and then cryptographically hashes and signs the output, resulting in the production of a data integrity proof. The algorithms for this cryptographic suite are the same as those defined for `eddsa-rdfc-2022` above, except for the following modifications:
In the transformation algorithm, step 1) and step 2) are replaced by the following text:
1) Let canonicalDocument be the result of applying the JSON Canonicalization Scheme [[RFC8785]] to the unsecuredDocument.
2) Set output to the value of canonicalDocument.
In the proof configuration algorithm, step 8) is not performed, and steps 4) and 9) are replaced by the following text:
4) If proofConfig.type is not set to `DataIntegrityProof` or proofConfig.cryptosuite is not set to `eddsa-jcs-2022`, an `INVALID_PROOF_CONFIGURATION` error MUST be raised.
9) Let canonicalProofConfig be the result of applying the JSON Canonicalization Scheme [[RFC8785]] to the proofConfig.
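For documents limited to strings, objects, and arrays (as in the examples in this specification), the JSON Canonicalization Scheme can be approximated in Python as sketched below; a full [[RFC8785]] implementation additionally constrains number serialization and sorts keys by UTF-16 code units:

```python
import json

def jcs_canonicalize(document: dict) -> str:
    """Approximate RFC 8785: lexicographically sorted keys, no insignificant whitespace."""
    return json.dumps(document, sort_keys=True, separators=(",", ":"), ensure_ascii=False)
```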
Before reading this section, readers are urged to familiarize themselves with general security advice provided in the Security Considerations section of the Data Integrity specification.
The following section describes security considerations that developers implementing this specification should be aware of in order to create secure software.
This specification relies on URDNA2015; please review [[RDF-CANON]] for related security considerations.
This specification relies on [[MULTIBASE]], [[?MULTICODEC]] and [[RFC8032]].
Ed25519 signatures (EdDSA algorithm with edwards25519 curve) have been widely adopted, due both to the compact size of the keys and signatures and to the speed at which signatures can be produced and verified. Many libraries exist that can create and verify Ed25519 signatures. Since the publication of [[RFC8032]], security properties of Ed25519 signatures have been rigorously proven (see [[Provable_Ed25519]] and [[Taming_EdDSAs]]). However, it has been observed that a significant number of libraries do not achieve these security levels, due to missing input validity checks during the signature verification process. In this section, we summarize the security levels achievable with Ed25519 signatures, and indicate how one can determine whether a library will support those levels.
Digital signatures may exhibit a number of desirable cryptographic properties [[Taming_EdDSAs]]; among these are:
EUF-CMA (existential unforgeability under chosen message attacks) is usually the minimal security property required of a signature scheme. It guarantees that any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice (in an adaptive manner), {(m_i, σ_i)}, cannot output a valid signature σ' for a new message m' ∉ {m_i} (except with negligible probability). In case the attacker outputs a valid signature on a new message, (m', σ'), it is called an existential forgery.
SUF-CMA (strong unforgeability under chosen message attacks) is a stronger notion than EUF-CMA. It guarantees that for any efficient adversary who has the public key pk of the signer and received an arbitrary number of signatures on messages of its choice, {(m_i, σ_i)}, it cannot output a new valid signature pair (m', σ') such that (m', σ') ∉ {(m_i, σ_i)} (except with negligible probability). Strong unforgeability implies not only that an adversary cannot sign new messages, but also that it cannot find a new signature on an old message. See [[Provable_Ed25519]] for a real world attack that would have been circumvented with SUF-CMA security over EUF-CMA security.
Binding signature (BS): We say that a signature scheme is binding if no efficient signer can output a tuple (pk, m, m', σ), where both (m, σ) and (m', σ) are valid message-signature pairs under the public key pk and m ≠ m' (except with negligible probability). A binding signature makes it impossible for the signer to claim later that it has signed a different message; the signature binds the signer to the message.
Strongly Binding signature (SBS): Certain applications may require a signature to be binding not only to the message but also to the public key. We say that a signature scheme is strongly binding if no efficient signer can output a tuple (pk, m, pk', m', σ), where σ is a valid signature for the public key pk and message m, σ is a valid signature for the public key pk' and message m', and either pk ≠ pk' or m ≠ m', or both (except with negligible probability). See [[Provable_Ed25519]] for real world attacks that would have been circumvented with the SBS property.
Note that the BS and SBS properties are forms of non-repudiation.
As pointed out in [[Taming_EdDSAs]], flaws in Ed25519 libraries primarily occur on the signature verification side, where edge cases are sometimes not properly checked. An Ed25519 signature library that is in conformance with [[RFC8032]] or [[FIPS-186-5]], i.e., one that performs all specified validation checks, will have the SUF-CMA property in addition to EUF-CMA.
The authors of [[Taming_EdDSAs]] achieve the BS and SBS properties, along with SUF-CMA, in their "signature verification algorithm 2", where an additional check is performed against the public key A to make sure that it is not one of eight "small order points". These additional checks incur minimal processing overhead.
[[Taming_EdDSAs]] also includes a set of twelve test vectors for exercising various Ed25519 libraries available at the time of publication. The authors found that a significant portion missed edge cases and hence achieved only EUF-CMA rather than SUF-CMA, and that only two libraries out of the sixteen tested achieved all the security properties. Since the time of publication, more Ed25519 libraries have been created and some existing libraries have been updated to include all verification checks. Implementers are recommended to test the Ed25519 library they are using against the test vectors of [[Taming_EdDSAs]].
Before reading this section, readers are urged to familiarize themselves with general privacy advice provided in the Privacy Considerations section of the Data Integrity specification.
The following section describes privacy considerations that developers implementing this specification should be aware of in order to avoid violating privacy assumptions.
This cryptographic suite does not provide for selective disclosure or unlinkability. If signatures are re-used, they can serve as correlatable data.
`Ed25519Signature2020` is an earlier version of a cryptographic suite for the EdDSA algorithm with Curve25519. While it has been used in production systems, new implementations should use the `eddsa-rdfc-2022` or `eddsa-jcs-2022` cryptosuites defined in this specification instead. It has been kept in this specification to provide a stable reference.
This key format is deployed and widely used in production, but it is deprecated; `Multikey` and `JsonWebKey2020` supersede it.
The `type` of the verification method MUST be `Ed25519VerificationKey2020`.
The `controller` of the verification method MUST be a URL.
The `publicKeyMultibase` property of the verification method MUST be a public key encoded according to [[?MULTICODEC]] and formatted according to [[MULTIBASE]]. The multicodec encoding of an Ed25519 public key is the two-byte prefix `0xed01` followed by the 32-byte public key data. The combined 34-byte value is then encoded using base58-btc and prefixed with the Multibase header `z`. Any other encoding MUST NOT be allowed.
Developers are advised to take care not to accidentally publish a representation of a private key. Implementations of this specification will raise errors in the event of a [[?MULTICODEC]] value other than `0xed01` being used in a `publicKeyMultibase` value.
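A minimal decoding check for this format, assuming the third-party `base58` package, can be sketched as follows:

```python
import base58  # assumes the third-party 'base58' package

def decode_public_key_multibase(public_key_multibase: str) -> bytes:
    """Decode a publicKeyMultibase value and return the raw 32-byte Ed25519 public key."""
    if not public_key_multibase.startswith("z"):
        raise ValueError("only the base58-btc ('z') Multibase encoding is allowed")
    decoded = base58.b58decode(public_key_multibase[1:])
    if len(decoded) != 34 or decoded[:2] != bytes([0xed, 0x01]):
        raise ValueError("value is not a Multicodec-wrapped Ed25519 public key")
    return decoded[2:]
```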
{ "id": "https://example.com/issuer/123#key-0", "type": "Ed25519VerificationKey2020", "controller": "https://example.com/issuer/123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }
{ "@context": [ "https://www.w3.org/ns/did/v1", "https://w3id.org/security/suites/ed25519-2020/v1" ], "id": "did:example:123", "verificationMethod": [{ "id": "did:example:123#key-0", "type": "Ed25519VerificationKey2020", "controller": "did:example:123", "publicKeyMultibase": "z6Mkf5rGMoatrSj1f4CyvuHBeXJELe9RPdzo2PKGNCKVtZxP" }], "authentication": [ "did:example:123#key-0" ], "assertionMethod": [ "did:example:123#key-0" ], "capabilityDelegation": [ "did:example:123#key-0" ], "capabilityInvocation": [ "did:example:123#key-0" ] }
The `verificationMethod` property of the proof MUST be a URL. Dereferencing the `verificationMethod` MUST result in an object containing a `type` property with the value set to `Ed25519VerificationKey2020`.
The `type` property of the proof MUST be `Ed25519Signature2020`.
The `created` property of the proof MUST be an [[XMLSCHEMA11-2]]-formatted dateTime string.
The `proofPurpose` property of the proof MUST be a string, and MUST match the verification relationship expressed by the verification method `controller`.
The `proofValue` property of the proof MUST be a detached EdDSA signature value produced according to [[RFC8032]], encoded according to [[MULTIBASE]] using the base58-btc base encoding.
{ "@context": [ {"title": "https://schema.org/title"}, "https://w3id.org/security/data-integrity/v1" ], "title": "Hello world!", "proof": { "type": "Ed25519Signature2020", "created": "2020-11-05T19:23:24Z", "verificationMethod": "https://di.example/issuer#z6MkjLrk3gKS2nnkeWcmcxiZPGskmesDpuwRBorgHxUXfxnG", "proofPurpose": "assertionMethod", "proofValue": "z4oey5q2M3XKaxup3tmzN4DRFTLVqpLMweBrSxMY2xHX5XTYVQeVbY8nQAVHMrXFkXJpmEcqdoDwLWxaqA3Q1geV6" } }
The `Ed25519Signature2020` cryptographic suite takes an input document, canonicalizes the document using the Universal RDF Dataset Canonicalization Algorithm [[RDF-CANON]], and then cryptographically hashes and signs the output resulting in the production of a data integrity proof. The algorithms in this section also include the verification of such a data integrity proof.
To generate a proof, the algorithm in Section 4.1: Add Proof in the Data Integrity [[VC-DATA-INTEGRITY]] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation, hashing, and proof serialization algorithms defined below are used.
To verify a proof, the algorithm in Section 4.2: Verify Proof in the Data Integrity [[VC-DATA-INTEGRITY]] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation, hashing, and proof verification algorithms defined below are used.
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm defined below.
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the proof serialization and proof verification algorithms defined below.
The required inputs to this algorithm are a transformed data document (transformedDocument) and proof configuration (proofConfig). The proof configuration MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single hash data value, represented as a series of bytes, is produced as output.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
The following algorithm specifies how to serialize a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value, represented as a series of bytes, is produced as output.
The following algorithm specifies how to verify a digital signature from a set of cryptographic hash data. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [[VC-DATA-INTEGRITY]] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData), a digital signature (proofBytes) and proof options (options). A verification result represented as a boolean value is produced as output.
The signer needs to generate a private/public key pair, with the private key used for signing and the public key made available for verification. The [[MULTIBASE]]/[[?MULTICODEC]] representation for the public key, `ed25519-pub`, and the representation for the private key, `ed25519-priv`, are shown below.
{
  "publicKeyMultibase": "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2",
  "privateKeyMultibase": "z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the Ed25519 signature, and then base58-btc encode the signature.
Assemble the signed credential with the following two steps:
1. Add the `proofValue` field with the previously computed base58-btc value to the proof options document.
2. Set the `proof` field of the credential to the augmented proof options document.
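The walkthrough above can be combined into a single, illustrative signing routine; it assumes RDF canonicalization and SHA-256 for this walkthrough, relies on the third-party `pyld`, `base58`, and `cryptography` packages, and the helper names are hypothetical:

```python
import copy
import hashlib

import base58  # third-party 'base58' package
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from pyld import jsonld  # third-party 'pyld' package


def canonical_hash(document: dict) -> bytes:
    """Canonicalize with URDNA2015 and hash with SHA-256 (assumed digest)."""
    nquads = jsonld.normalize(
        document, {"algorithm": "URDNA2015", "format": "application/n-quads"}
    )
    return hashlib.sha256(nquads.encode("utf-8")).digest()


def sign_credential(credential: dict, proof_options: dict,
                    private_key: Ed25519PrivateKey) -> dict:
    """Produce a signed credential following the walkthrough above (illustrative)."""
    # Hash of the proof options followed by the hash of the credential without proof.
    combined = canonical_hash(proof_options) + canonical_hash(credential)
    signature = private_key.sign(combined)  # detached Ed25519 signature
    proof = copy.deepcopy(proof_options)
    proof["proofValue"] = "z" + base58.b58encode(signature).decode()
    signed = copy.deepcopy(credential)
    signed["proof"] = proof  # attach the augmented proof options document
    return signed
```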
The signer needs to generate a private/public key pair, with the private key used for signing and the public key made available for verification. The [[MULTIBASE]]/[[?MULTICODEC]] representation for the public key, `ed25519-pub`, and the representation for the private key, `ed25519-priv`, are shown below.
{
  "publicKeyMultibase": "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2",
  "privateKeyMultibase": "z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the Ed25519 signature, and then base58-btc encode the signature.
Assemble the signed credential with the following two steps:
1. Add the `proofValue` field with the previously computed base58-btc value to the proof options document.
2. Set the `proof` field of the credential to the augmented proof options document.
The signer needs to generate a private/public key pair, with the private key used for signing and the public key made available for verification. The [[MULTIBASE]]/[[?MULTICODEC]] representation for the public key, `ed25519-pub`, and the representation for the private key, `ed25519-priv`, are shown below.
{
  "publicKeyMultibase": "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2",
  "privateKeyMultibase": "z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq"
}
Signing begins with a credential without an attached proof, which is converted to canonical form, and then hashed, as shown in the following three examples.
The next step is to take the proof options document, convert it to canonical form, and obtain its hash, as shown in the next three examples.
Finally, we concatenate the hash of the proof options followed by the hash of the credential without proof, use the private key with the combined hash to compute the Ed25519 signature, and then base58-btc encode the signature.
Assemble the signed credential with the following two steps:
1. Add the `proofValue` field with the previously computed base58-btc value to the proof options document.
2. Set the `proof` field of the credential to the augmented proof options document.
This section contains the substantive changes that have been made to this specification over time.
Changes since the First Public Working Draft: