Cryptography usage in Web Standards

Editor’s Draft

This version:
https://w3c.github.io/security-guidelines-cryptography
Issue Tracking:
GitHub
Editor:
(Invited Expert)

Abstract

This document serves as a clear, practical, and standards-compliant guide for recognized cryptographic algorithms and their appropriate applications in various scenarios. It provides recommendations on the selection and implementation of specific algorithms, parameter configurations, and common mistakes to avoid, aiming to foster a uniform, standards-driven, and secure application of cryptography in web technologies.

The intended audience includes developers of web specifications and application creators, with the objective of encouraging interoperable, maintainable, and verifiable cryptographic practices in web standards and implementations.

Status of this document

1. Introduction

Several Web standards depend on cryptographic primitives to achieve security properties such as confidentiality, integrity, authenticity, and non-repudiation. Selecting appropriate algorithms and secure parameterizations is often non-trivial for developers: the number of available algorithms, modes of operation, key sizes, and evolving standards and deprecation schedules makes it difficult to identify secure and interoperable solutions. Incorrect choices or unsafe parameter combinations can lead to interoperability failures and serious vulnerabilities.

This document provides a concise, practical and standards-aligned reference for standardized cryptographic algorithms and their recommended usage in different contexts. It offers guidelines on when and how to use specific algorithms, parameter choices, and common pitfalls to avoid, in order to promote a consistent, standards-based, and secure use of cryptography across web technologies.

The guidance targets implementers of web specifications and application developers, with the goal of promoting interoperable, maintainable, and auditable cryptographic deployments in web standards and implementations.

1.1. Terminology

According to the United Nations Human Rights Council report [A-HRC-53-42], the term "standard" refers to an agreed norm defining a way of doing something in a repeatable manner. Technical standards constitute a form of codified technical knowledge that enables the development of products and processes. While standards cover a broad range of products, services, processes, and activities, for the purposes of that report the term refers to technical standards pertaining to new and emerging digital technologies.

"Cryptography" is the practice and study of techniques for securing communication and information by transforming it into a format that is unreadable to unauthorized users. It encompasses various methods, including encryption, decryption, hashing, and digital signatures, to ensure confidentiality, integrity, authenticity, and non-repudiation of data [NIST-SP-800-57]. "Cryptoanalysis" is the process of analyzing cryptographic systems to find weaknesses or vulnerabilities that can be exploited to break the encryption or compromise the security of the system [NIST-SP-800-57].

This document focuses on Cryptography and on its usage in Web Standards.

2. Standards

2.1. Standard Development Organizations

A standards organization, also known as a standards body, standards developing organization (SDO), or standards setting organization (SSO), is an entity dedicated to the creation and improvement of technical standards. Its main role involves developing, coordinating, and updating standards to ensure they remain relevant and practical for those who use them.

The cryptographic SDO landscape is vast and varied, with a large constellation of actors. Standardization processes exist within national, regional, and international organizations [A-HRC-53-42].

Among the largest and oldest SDOs are the international standards organizations: the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), and the International Telecommunication Union (ITU), which develop standards for a vast range of digital technologies and applications. In the field of cryptography, ISO/IEC JTC 1/SC 27 (Information security, cybersecurity and privacy protection) is the main committee responsible for developing international standards, and ITU-T (International Telecommunication Union - Telecommunication Standardization Sector) defines security and cryptography standards for global telecommunications.

National cryptographic standards bodies, such as the National Institute of Standards and Technology (NIST) in the United States, the Agence nationale de la sécurité des systèmes d’information (ANSSI) in France, and the Bundesamt für Sicherheit in der Informationstechnik (BSI) in Germany, develop standards that are specific to their respective countries. These standards often align with international standards but may also address local needs and requirements; for example, the Chinese State Cryptography Administration (SCA) develops and publishes cryptographic standards that are specific to China, ensuring that they meet the country’s security and regulatory requirements.

Regional standards organizations, such as the European Committee for Standardization (CEN) and the European Telecommunications Standards Institute (ETSI), develop standards that are applicable within specific regions. These standards help harmonize technical requirements across member countries, promoting regional integration and cooperation.

Most national and regional standards-developing organizations (SDOs) are members of the International Organization for Standardization (ISO) and participate in its working groups to develop international standards, thereby promoting coherence and interoperability between regional, national, and international standards.

Although this draft emphasizes international standard-setting, many of its insights and recommendations extend beyond that scope, offering relevance for regional and national contexts as well.

In addition, standards can also be developed by industry consortia, professional associations, and other non-governmental organizations. Many of these are industry-driven or academia-driven, have specific focus areas and governance models, and operate processes that are open to the general public. Examples of such organizations include the World Wide Web Consortium (W3C), the Internet Engineering Task Force (IETF), and the Institute of Electrical and Electronics Engineers (IEEE). These organizations often focus on specific technologies or industries and develop standards that address the unique needs and challenges of those areas. W3C focuses on application-layer standards for the World Wide Web, while IETF is the main standardization body defining protocols and standards for the internet, working closely with its sister organization, the Internet Research Task Force, which focuses on long-term research related to Internet protocols, applications, architecture, and technology. The IEEE produces standards that underpin telecommunications, information technology, consumer electronics, wireless communications, and power-generation products and services.

Among these standardization bodies, it is important to distinguish between those that standardize cryptographic algorithms and protocols and those that provide guidelines for implementing cryptographic mechanisms. Some SDOs focus on defining the algorithms and protocols themselves, while others provide recommendations and best practices for their implementation in various applications. For example, W3C provides standards and guidelines for implementing cryptographic mechanisms in web technologies but does not standardize new cryptographic algorithms. W3C’s standards, such as the Web Cryptography API, define how cryptographic operations can be performed in web applications, ensuring secure communication and data protection on the web. One of the most recognized standardization bodies for cryptographic primitives is NIST, the U.S. federal agency that develops and promotes measurement, standards, and technology. NIST has developed several widely adopted cryptographic standards, including the Advanced Encryption Standard (AES), the Secure Hash Algorithm (SHA) family, and the Digital Signature Algorithm (DSA). The principles that guide NIST’s cryptographic standards are described in NIST IR 7977. The IETF (Internet Engineering Task Force) publishes RFCs (Requests for Comments) that define protocols and standards for the internet, including cryptographic protocols such as Transport Layer Security (TLS) and Secure/Multipurpose Internet Mail Extensions (S/MIME).

Collaboration between SDOs is common and helps ensure interoperability and consistency in cryptographic standards and protocols across different domains and applications. For example, W3C standards often rely on IETF protocols, and NIST provides guidelines and recommendations for cryptographic algorithms used in IETF protocols.

2.2. Standard documentation

It is important to note that cryptographic standards are continuously evolving to address emerging threats and advancements in technology. Therefore, it is crucial to stay updated with the latest standards and guidelines from recognized organizations to ensure the use of secure and reliable cryptographic mechanisms.

3. Security services

Cryptography may be used to provide several basic security services: confidentiality, integrity, data authenticity, and non-repudiation. In general, a single cryptographic mechanism may provide more than one service (e.g., digital signatures provide both non-repudiation and authenticity) but not all of them, and a single security service might require multiple cryptographic mechanisms to be combined (e.g., achieving confidentiality requires an encryption algorithm combined with a key management system).

3.1. Confidentiality

Confidentiality ensures that information is protected from being disclosed to unauthorized parties. It is typically achieved through encryption, which transforms readable data into an unreadable form using a cryptographic key. Only authorized parties that know the correct key can decrypt and access the original information. The most widely used cryptographic algorithms for ensuring confidentiality are symmetric encryption algorithms, such as AES (Advanced Encryption Standard).

3.2. Integrity

Integrity ensures that data remains unchanged and unaltered during transmission or storage. It is typically achieved through hashing algorithms. If the data is modified, the hash value will change, indicating that the integrity of the data has been compromised. Integrity is essential for ensuring that information remains accurate, preventing unauthorized modifications. The most used cryptographic algorithms for ensuring integrity are hash functions, such as SHA-256 (Secure Hash Algorithm 256-bit).

3.3. Data Authenticity

Data authenticity ensures that the source of a message or data is legitimate. It can be achieved through Message Authentication Codes (MACs), such as HMAC, which verify that the message comes from a legitimate sender and has not been tampered with during transmission.

It is a common misconception that encryption alone provides authenticity, as the ciphertext is unintelligible to unauthorized parties. However, this is not the case; without explicit authenticity mechanisms, an attacker may intercept and modify the ciphertext without being detected, potentially influencing the decrypted result. Therefore, it is essential to use cryptographic mechanisms specifically designed to provide authenticity, such as MACs or digital signatures.

Note: It is important to distinguish between data authenticity and user/system authentication. Authentication is crucial for establishing trust in communications, ensuring that the sender is who they claim to be.

3.4. Non-repudiation

Non-repudiation ensures that a sender cannot deny data authorship. It is typically achieved through digital signatures, which provide verifiable proof that a specific sender sent a specific message or performed a specific action, as the sender cannot deny having signed the data. This is particularly important in scenarios such as electronic transactions, where parties need assurance that actions taken cannot be denied later.

Note: Non-repudiation implies data authenticity, but data authenticity does not imply non-repudiation. Indeed, digital signatures provide non-repudiation and data authenticity, while MACs provide only data authenticity.

Note: Data authenticity implies data integrity, but data integrity does not imply data authenticity. Indeed, data integrity can be achieved through hash functions, which do not provide data authenticity, while data authenticity is achieved through MACs or digital signatures, which also provide data integrity.

3.5. Authentication

Authentication verifies the identity of a user, device, or system. It ensures that the entity claiming to be a specific user or system is indeed who they claim to be. Authentication is crucial for establishing trust in online interactions, ensuring that only authorized users can access resources or perform actions.

Note: User/system authentication is different from data authenticity. User/system authentication verifies the identity of a user or system, while data authenticity ensures that the source of a message or data is legitimate. In public-key settings, user/system authentication can be seen as establishing the authenticity of a public key.

Authentication can be mutual or unilateral (e.g., when only the server authenticates to the user and not vice versa). To achieve user authentication, various methods can be employed, including something the user knows (e.g., a password), something the user has (e.g., a security token), or something the user is (e.g., biometric data). When authentication is part of a cryptographic protocol, it is often achieved through digital certificates: cryptographic mechanisms that verify the identity of an entity by means of digital signatures. Trust can be organized in different ways, such as a PKI (Public Key Infrastructure) or a Web of Trust. In the latter, trust is decentralized and keys are certified by other users; the Web of Trust model is used in protocols such as PGP and GnuPG. In a PKI, by contrast, trust is centralized and keys are certified by authority entities called Certificate Authorities (CAs). A PKI is used on the Web when the connection to a website is secured through HTTPS, which relies on the TLS (Transport Layer Security) protocol: the server presents a digital certificate to the client, which verifies the authenticity of the server and establishes a secure connection. In other words, in a PKI, authentication provides the connection between public keys and user identities. The public key is stored in a digital certificate issued by a Certificate Authority (CA). The CA itself owns a certificate that guarantees its identity, signed by another CA; this chain forms a CA hierarchy that terminates at a root CA. The relevant standard is X.509, described in [RFC5280], which covers digital certificates and revocation lists. X.509 defines the format of digital certificates and the mechanisms for their management, including issuance, revocation, and validation.

4. Cryptographic keys

Cryptographic keys are essential components of cryptographic systems, as security relies fundamentally on them. A cryptographic key is a piece of information used to encrypt and decrypt data, generate digital signatures, and perform other cryptographic operations. Unlike the algorithm, which is often standardized and publicly known, the key must remain secret. This principle is rooted in Kerckhoffs’s principle, formulated in the 19th century, which states that a cryptographic system should remain secure even if everything about it, except the key, is public knowledge. The implication is that the entire robustness of a cryptosystem depends on the secrecy and unpredictability of the key. A weak or poorly generated key undermines security, regardless of the strength of the algorithm itself. Therefore, keys must be generated using secure random number generators to guarantee sufficient entropy and avoid predictability. Furthermore, keys must be protected throughout their lifecycle: strong key generation and rigorous key management are essential components of cryptographic security, since any compromise of the key directly compromises the security of the encrypted data. In particular, the strength and security of a cryptographic system depend significantly on the length and complexity of the keys used.

4.1. Key length recommendations

Key length is a critical factor in determining the security of cryptographic algorithms. Longer keys generally provide stronger security, as they increase the complexity of brute-force attacks. However, in some cryptographic algorithms longer keys also require more computational resources for encryption and decryption processes. Therefore, it is essential to balance security and performance when selecting key lengths for cryptographic algorithms. NIST provides guidelines for key lengths in its publications, such as [NIST-SP-800-57], which outlines recommendations for key management and cryptographic algorithms. These guidelines specify minimum key lengths for various algorithms to ensure adequate security levels. For example, NIST recommends a minimum key length of 128 bits for symmetric encryption algorithms like AES and a minimum key length of 2048 bits for RSA public key encryption. It is important to note that key length recommendations may evolve over time as new cryptographic attacks are discovered and computational capabilities increase. Therefore, it is crucial to stay updated with the latest standards and guidelines from recognized organizations such as NIST, ISO, and ETSI to ensure the use of secure key lengths in cryptographic implementations.

4.2. Key management

Key management is a critical aspect of cryptography that involves the generation, distribution, storage, rotation, and revocation of cryptographic keys. Key management best practices are collected in [NIST-SP-800-57], which describes recommendations for the entire lifecycle of cryptographic keys. In the same publication, NIST introduces the concept of a cryptoperiod, which is the time span during which a specific key is authorized for use. The length of the cryptoperiod depends on several factors, including the sensitivity of the information being protected, the strength of the cryptographic algorithm, and the potential threats to the system. NIST provides guidelines for determining appropriate cryptoperiods based on these factors.

5. Crypto agility

Crypto agility refers to the ability of a cryptographic system to quickly and easily switch between different cryptographic algorithms or protocols in response to changing security requirements or threats. It allows organizations to adapt their cryptographic systems to maintain security and compliance with industry standards and regulations. This capability is essential in today’s rapidly evolving threat landscape, where new vulnerabilities and attacks are constantly emerging.

6. Post-quantum cryptography

Post-quantum cryptography refers to cryptographic algorithms that are designed to be secure against attacks from quantum computers. Quantum computers have the theoretical potential to break many of the widely used cryptographic algorithms, such as RSA and ECC, which rely on the difficulty of certain mathematical problems (discrete logarithm and integer factorization) that can be efficiently solved by quantum algorithms like Shor’s algorithm.

To address this threat, new cryptographic algorithms that resist quantum attacks need to be developed. These algorithms are based on mathematical problems that are believed to be hard for both classical and quantum computers to solve, and they define new branches of cryptography such as lattice-based cryptography, code-based cryptography, multivariate polynomial cryptography, and hash-based cryptography.

NIST is currently in the process of standardizing post-quantum cryptographic algorithms through a multi-round competition. The goal is to identify and standardize algorithms that can provide strong security against quantum attacks while also being efficient and practical for real-world applications. The selected algorithms will be used to replace or supplement existing cryptographic algorithms (in hybrid solutions) in various applications, including digital signatures and key exchange protocols.

7. Cryptography usage

This section provides an overview of the standardized cryptographic algorithms divided by primitives and their recommended usage in different scenarios.

7.1. Symmetric encryption

Symmetric encryption uses the same key for both encryption and decryption processes. This means that both the sender and the receiver must know the same key, which must be kept secret to ensure the security of the encrypted data. Symmetric encryption provides confidentiality, ensuring that only the parties that know the correct key can access the original information. It is generally faster and more efficient than asymmetric encryption, making it suitable for encrypting large amounts of data.

The symmetric encryption algorithm standard is AES (Advanced Encryption Standard), standardized in [FIPS-197]. It is a block cipher that operates on fixed-size blocks of data (128 bits) and supports key sizes of 128, 192, and 256 bits.

The mode of operation is an essential aspect of symmetric encryption algorithms, as it determines how the algorithm processes data and how it handles different scenarios, such as data of varying lengths or the need for additional security features. The mode of operation can significantly impact the security and performance of the encryption process. The mode of operation is not part of the AES algorithm itself, but rather a separate specification that defines how to use the AES algorithm to encrypt and decrypt data. Recommendations are provided in [NIST-SP-800-38A].

One of the most recommended modes of operation for AES is AES-CTR (AES in Counter mode), which is widely used for its efficiency. It is described in [NIST-SP-800-38A].

Since the length of the plaintext is not necessarily a multiple of the block length, padding techniques are used to add extra data to the plaintext so that its size aligns with the block size required by block cipher algorithms. Padding is necessary because block ciphers operate on fixed-size blocks of data, and if the plaintext does not align with these blocks, it cannot be processed correctly. However, padding can introduce vulnerabilities if not implemented correctly according to the specification of the algorithm. Two recommended padding schemes are PKCS#7, standardized in [RFC5652], and bit padding, standardized in [ISO-IEC-9797-1].
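To make the scheme concrete, the following minimal TypeScript sketch implements PKCS#7 padding as specified in [RFC5652]; the function names are illustrative, and a production implementation should use a vetted library and validate padding in constant time to avoid padding-oracle attacks:

    // PKCS#7: append N bytes, each with value N, where
    // N = blockSize - (length mod blockSize); N is always in 1..blockSize.
    function pkcs7Pad(data: Uint8Array, blockSize = 16): Uint8Array {
      const padLen = blockSize - (data.length % blockSize);
      const out = new Uint8Array(data.length + padLen);
      out.set(data);
      out.fill(padLen, data.length); // padding bytes all equal padLen
      return out;
    }

    function pkcs7Unpad(padded: Uint8Array, blockSize = 16): Uint8Array {
      const padLen = padded[padded.length - 1];
      if (padLen < 1 || padLen > blockSize || padLen > padded.length) {
        throw new Error("invalid padding");
      }
      // Check every padding byte (in constant time in production code).
      for (let i = padded.length - padLen; i < padded.length; i++) {
        if (padded[i] !== padLen) throw new Error("invalid padding");
      }
      return padded.subarray(0, padded.length - padLen);
    }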

Note: Symmetric encryption alone provides only confidentiality; it does not provide data authenticity or integrity. Indeed, an attacker may intercept and modify the ciphertext without detection, potentially influencing the decrypted result. Therefore, using symmetric encryption without any authentication mechanism is generally discouraged, and it is preferable to use cryptographic mechanisms specifically designed to also provide authenticity and integrity.

To provide confidentiality with authenticity, it is recommended either to combine symmetric encryption (providing confidentiality) with a MAC (providing authenticity and integrity), for example AES-CTR with HMAC in an Encrypt-then-MAC construction, or to use a single authenticated encryption with associated data (AEAD) scheme such as AES-GCM.
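As a sketch of the Encrypt-then-MAC construction, the following TypeScript example uses the Web Cryptography API with AES-CTR and HMAC-SHA-256. It assumes two independent CryptoKey objects (encKey and macKey) have already been generated or derived; the helper names are illustrative:

    // Encrypt-then-MAC: encrypt first, then authenticate counter || ciphertext.
    async function encryptThenMac(encKey: CryptoKey, macKey: CryptoKey, plaintext: Uint8Array) {
      const counter = crypto.getRandomValues(new Uint8Array(16)); // fresh per message
      const ciphertext = new Uint8Array(await crypto.subtle.encrypt(
        { name: "AES-CTR", counter, length: 64 }, encKey, plaintext));
      const macInput = new Uint8Array(counter.length + ciphertext.length);
      macInput.set(counter);
      macInput.set(ciphertext, counter.length);
      const tag = new Uint8Array(await crypto.subtle.sign("HMAC", macKey, macInput));
      return { counter, ciphertext, tag };
    }

    // Verify the tag BEFORE decrypting; reject on failure.
    async function verifyThenDecrypt(encKey: CryptoKey, macKey: CryptoKey,
        msg: { counter: Uint8Array; ciphertext: Uint8Array; tag: Uint8Array }) {
      const macInput = new Uint8Array(msg.counter.length + msg.ciphertext.length);
      macInput.set(msg.counter);
      macInput.set(msg.ciphertext, msg.counter.length);
      const ok = await crypto.subtle.verify("HMAC", macKey, msg.tag, macInput);
      if (!ok) throw new Error("authentication failed");
      return new Uint8Array(await crypto.subtle.decrypt(
        { name: "AES-CTR", counter: msg.counter, length: 64 }, encKey, msg.ciphertext));
    }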

7.1.1. Authenticated encryption

Authenticated encryption with associated data (AEAD) schemes provide confidentiality, integrity, and authenticity in a single operation. They combine encryption and authentication mechanisms to ensure that the data remains confidential while also verifying its integrity and authenticity.

The most recommended AEAD algorithm is AES-GCM, i.e., AES in Galois/Counter Mode, standardized by NIST in [NIST-SP-800-38D] and by IETF in [RFC5116] [RFC5288].
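For illustration, a minimal AES-GCM round trip with the Web Cryptography API might look as follows (assuming a runtime that exposes globalThis.crypto and top-level await, e.g., a modern browser or recent Node.js):

    const key = await crypto.subtle.generateKey(
      { name: "AES-GCM", length: 256 }, false, ["encrypt", "decrypt"]);
    // 96-bit IV, freshly generated; an IV must NEVER repeat under the same key.
    const iv = crypto.getRandomValues(new Uint8Array(12));
    const plaintext = new TextEncoder().encode("attack at dawn");
    const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
    // decrypt() verifies the authentication tag and throws if the data was tampered with.
    const decrypted = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);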

Another common AEAD algorithm is ChaCha20-Poly1305, described in [RFC8439]. Although it is published as an informational RFC rather than a formal standard, it is considered a de facto standard.

7.1.2. Modes for Key wrapping

Key wrapping is a technique used to securely encapsulate cryptographic keys for safe storage or transmission. It involves encrypting the key using a symmetric encryption algorithm, such as AES, to protect it from unauthorized access. The wrapped key can then be safely stored or transmitted, and only authorized parties with the correct decryption key can unwrap and access the original key.

It is commonly used in scenarios where keys need to be securely exchanged between parties or stored in a secure manner, such as in hardware security modules (HSMs) or key management systems (KMS).

The modes of operation used for key wrapping are standardized in [NIST-SP-800-38F].

The most recommended key wrapping algorithm is AES-KW (AES Key Wrap), also described in [RFC3394]. Although KW can be used in conjunction with any reversible padding scheme, a variant of KW with an internal padding scheme is also specified to promote interoperability. This variant is called AES-KWP (AES Key Wrap With Padding), also defined in [RFC5649].
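A short Web Cryptography API sketch of AES-KW follows; parameter choices are illustrative, and the data key must be generated as extractable so that it can be wrapped:

    // Key-encryption key (KEK) used only for wrapping other keys.
    const kek = await crypto.subtle.generateKey(
      { name: "AES-KW", length: 256 }, false, ["wrapKey", "unwrapKey"]);
    const dataKey = await crypto.subtle.generateKey(
      { name: "AES-GCM", length: 256 }, true, ["encrypt", "decrypt"]);
    const wrapped = await crypto.subtle.wrapKey("raw", dataKey, kek, "AES-KW");
    const restored = await crypto.subtle.unwrapKey(
      "raw", wrapped, kek, "AES-KW",
      { name: "AES-GCM", length: 256 }, false, ["encrypt", "decrypt"]);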

7.1.3. Modes for Storage Devices

When encrypting data on storage devices, it is essential to use modes of operation that provide confidentiality for the protected data. The most recommended mode for encrypting data on storage devices is XTS-AES (XEX-based tweaked-codebook mode with ciphertext stealing), standardized in [IEEE-1619-2007] and [NIST-SP-800-38E].

Note: Authentication is not provided, because XTS-AES has been designed to provide encryption without data expansion.

7.1.4. Key length recommendations for symmetric encryption

NIST provides guidelines for key lengths in its publications, such as [NIST-SP-800-57], which outlines recommendations for key management and cryptographic algorithms. These guidelines specify minimum key lengths for various algorithms to ensure adequate security levels.

In the symmetric setting, although NIST recommends a minimum key length of 128 bits for symmetric encryption algorithms like AES, there is no strong reason to use AES with a 128- or 192-bit key instead of a 256-bit key, which provides a higher level of security. Therefore, it is generally recommended to use AES with a 256-bit key for symmetric encryption.

Keys shorter than 128 bits are considered weak and are not recommended for secure applications.

Some algorithms are designed to use keys of fixed length (256 bits in the case of ChaCha20).

It is important to note that 256-bit keys are considered secure against brute-force attacks even in a post-quantum scenario.

7.2. Asymmetric encryption

In PKCS#1 v2.2 [RFC8017], two encryption schemes built on the RSA algorithm are specified: RSAES-OAEP and RSAES-PKCS1-v1_5. RSAES-OAEP is required to be supported for new applications. RSAES-PKCS1-v1_5 is still widely deployed, but its use is not recommended: it is intended for legacy applications only and is included solely for compatibility with existing applications.

Note: The usage of asymmetric encryption is generally discouraged for data encryption; it is preferable to encrypt the data with symmetric encryption and use the asymmetric algorithm only to exchange the symmetric key.
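The following sketch illustrates this hybrid pattern with the Web Cryptography API: an AES session key is wrapped under an RSA-OAEP public key, and only the holder of the private key can unwrap it (parameter choices are illustrative):

    const { publicKey, privateKey } = await crypto.subtle.generateKey(
      { name: "RSA-OAEP", modulusLength: 3072,
        publicExponent: new Uint8Array([1, 0, 1]), hash: "SHA-256" },
      false, ["wrapKey", "unwrapKey"]);
    const sessionKey = await crypto.subtle.generateKey(
      { name: "AES-GCM", length: 256 }, true, ["encrypt", "decrypt"]);
    // Sender: wrap (encrypt) the symmetric key under the recipient's public key.
    const wrappedKey = await crypto.subtle.wrapKey("raw", sessionKey, publicKey, { name: "RSA-OAEP" });
    // Recipient: unwrap with the private key, then use the AES key for the bulk data.
    const recovered = await crypto.subtle.unwrapKey(
      "raw", wrappedKey, privateKey, { name: "RSA-OAEP" },
      { name: "AES-GCM", length: 256 }, false, ["encrypt", "decrypt"]);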

7.3. Key exchange

The main objective of key exchange is to establish a shared secret between two parties over an insecure channel. The parties involved are provided with a pair of keys: a public key, which can be shared with anyone, and a private key, which must be kept secret. The public key of each party is published or exchanged, while the private key remains confidential. The shared secret is derived using the private key of one party and the public key of the other party. This process ensures that only the two parties involved can compute the shared secret, as it requires knowledge of the private key.

The currently recommended key exchange algorithms are ECDH (Elliptic Curve Diffie-Hellman) and the post-quantum key exchange schemes ML-KEM and HQC. The most used today is ECDH, described in [RFC6090]. The main aspect of ECDH is the choice of the elliptic curve, and the most widely used and recommended curves are the NIST curves P-256, P-384, and P-521 (offering roughly 128, 192, and 256 bits of security, respectively), also known as secp256r1, secp384r1, and secp521r1, standardized in [FIPS-186-5].

Other commonly used curves are the Montgomery curves Curve25519 and Curve448. ECDH with Curve25519 is named X25519, and ECDH with Curve448 is named X448. They are not NIST standards but are widely used and recommended for their security and performance; they are defined in [RFC7748]. The difference between them is the security level and performance: X25519 is faster and more efficient (128 bits of security), while X448 offers a higher security level (224 bits of security).

Note: The output of a key exchange is generally not uniformly distributed; therefore, using it directly as a cryptographic key is discouraged. Instead, a KDF is required to derive a symmetric key from the shared secret.
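A minimal ECDH sketch over P-256 with the Web Cryptography API follows. Both key pairs are generated locally here purely for illustration; in practice each party contributes one pair, and the peer’s public key must be validated and authenticated:

    const alice = await crypto.subtle.generateKey(
      { name: "ECDH", namedCurve: "P-256" }, false, ["deriveBits"]);
    const bob = await crypto.subtle.generateKey(
      { name: "ECDH", namedCurve: "P-256" }, false, ["deriveBits"]);
    // Alice combines her private key with Bob's public key (Bob does the mirror image).
    const sharedSecret = await crypto.subtle.deriveBits(
      { name: "ECDH", public: bob.publicKey }, alice.privateKey, 256);
    // Do NOT use sharedSecret directly as a key: feed it to a KDF such as HKDF (see § 7.7).
    const ikm = await crypto.subtle.importKey("raw", sharedSecret, "HKDF", false, ["deriveKey"]);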

An important aspect is that the public key of the counterparty must be validated before using it in the key exchange process, to ensure its authenticity and integrity. This validation typically involves checking the format of the public key, verifying its parameters, and ensuring that it has not been tampered with or altered. Failure to validate the public key can open the door to attacks such as small-subgroup attacks or invalid-curve attacks, which can compromise the security of the key exchange and potentially expose sensitive information to unauthorized parties.

Note: Always make sure that the implementation in use validates the public key of the counterparty before using it in the key exchange process.

7.3.1. Post-quantum key exchange algorithms

The post-quantum key exchange scheme standardized by NIST through the PQC competition process is ML-KEM (Module-Lattice-based Key Encapsulation Mechanism). ML-KEM, standardized in [FIPS-203], is based on the CRYSTALS-Kyber scheme and is a lattice-based key encapsulation mechanism that offers a good balance between security and performance. It is based on the hardness of the Module Learning With Errors (Module-LWE) problem and is designed to be efficient in both software and hardware implementations. ML-KEM is one of the most widely adopted post-quantum key exchange schemes and is recommended for use in various applications.

At the end of the fourth round of the PQC competition, NIST selected an additional algorithm for standardization: HQC. It is not yet standardized, but a FIPS publication is expected. HQC is a code-based key encapsulation mechanism that offers strong security guarantees against quantum attacks. However, HQC has larger key sizes and slower performance compared to lattice-based schemes like ML-KEM. As a result, it may not be as widely adopted for applications that require high-performance cryptographic operations.

Note: An important aspect of key exchange algorithms is the authentication of the public keys exchanged between the parties. Without proper authentication, an attacker could perform a MitM (Man-in-the-Middle) attack.

7.4. Hash functions

The main objective of a hash function is to provide data integrity. A hash function takes an input (or message) and returns a fixed-size string of bytes. The output, typically a digest, is unique to each unique input. Even a small change in the input will produce a significantly different output, making hash functions useful for verifying data integrity.

The most widely used hash function family today is SHA-2, standardized in [FIPS-180-4]. It has four versions depending on the digest length: SHA-224, SHA-256, SHA-384, and SHA-512.
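For example, computing a SHA-256 digest with the Web Cryptography API (a small illustration; the hex-conversion helper is ad hoc):

    const data = new TextEncoder().encode("The quick brown fox jumps over the lazy dog");
    const digest = await crypto.subtle.digest("SHA-256", data); // 32-byte ArrayBuffer
    const hex = Array.from(new Uint8Array(digest))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");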

Another hash function standardized by NIST is SHA-3, specified in [FIPS-202]. SHA-3 provides stronger security guarantees than SHA-2, but is generally slower in software implementations. Like its predecessor, SHA-3 is available in four versions based on the digest length: SHA3-224, SHA3-256, SHA3-384, and SHA3-512.

Another hash family is BLAKE2 (BLAKE2s, BLAKE2b), which is not a standard but is widely used and recommended for its high performance and security. BLAKE2 is faster than MD5, SHA-1, and SHA-2, while providing a similar level of security. It is available in two main variants: BLAKE2b, optimized for 64-bit platforms and suitable for applications requiring high security levels (digest sizes up to 512 bits), and BLAKE2s, optimized for 8- to 32-bit platforms and suitable for applications with constrained resources (digest sizes up to 256 bits).

In applications where a variable output length is required, extendable-output functions (XOFs) are used. The XOFs standardized by NIST are SHAKE and cSHAKE [FIPS-202] [NIST-SP-800-185].

7.4.1. Password hashing

When hash functions are used to store passwords, it is advisable to use dedicated password hashing algorithms. For example, Argon2, defined in [RFC9106], is a memory-hard password hashing algorithm designed to be resistant to brute-force attacks and GPU-based attacks. It is the winner of the Password Hashing Competition (PHC) and is widely recommended for secure password storage.
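The Web Cryptography API does not expose Argon2, so a third-party implementation is needed. The sketch below assumes the node-argon2 package (an assumption, not part of any web standard) and uses the second parameter set recommended by [RFC9106] (64 MiB of memory, 3 passes, 4 lanes):

    import argon2 from "argon2"; // assumed third-party library (npm "argon2")

    // Hash at registration time; the encoded string embeds salt and parameters.
    const encoded = await argon2.hash("correct horse battery staple", {
      type: argon2.argon2id,
      memoryCost: 64 * 1024, // in KiB, i.e. 64 MiB
      timeCost: 3,           // passes over memory
      parallelism: 4,        // lanes
    });
    // Verify at login time.
    const ok = await argon2.verify(encoded, "correct horse battery staple");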

7.5. Digital signatures

The main objective of digital signatures is to provide non-repudiation. A digital signature is a cryptographic mechanism that allows a sender to sign a message or document in such a way that the sender cannot deny having signed it, as the receiver can cryptographically prove the association between the message and the sender. The signature is created using the sender’s private key, which is kept secret, and can be verified using the sender’s public key, which is widely distributed. In addition, through digital signatures the recipient can verify the authenticity and the integrity of the message.

It is important to note that by using MACs the authenticity of the message can be verified, but the sender can always deny having sent the message, as both the sender and the receiver share the same secret key used to generate and verify the MAC. Therefore, MACs do not provide non-repudiation. Moreover, the authenticity provided by MACs is limited to the parties that share the secret key, while digital signatures provide authenticity that can be verified by anyone with access to the sender’s public key.

The currently recommended digital signatures are ECDSA, EdDSA, the post-quantum signature schemes ML-DSA, SLH-DSA, and Falcon, and, for compatibility, RSA-PSS.

One of the most used digital signatures is ECDSA (Elliptic Curve Digital Signature Algorithm), which is standardized in [FIPS-186-5] as well as in ISO/IEC 14888-3, ANSI X9.62, and IEEE P1363. One of the most important aspects of ECDSA is the choice of the elliptic curve. There are several options, but the most widely used and recommended curves are the NIST curves P-256, P-384, and P-521, also known as secp256r1, secp384r1, and secp521r1, standardized in [FIPS-186-5]. Another common curve is secp256k1, defined in SEC (Standards for Efficient Cryptography) 2, which is used in Bitcoin and other cryptocurrencies, but also in TLS and SSH. Note, however, that secp256k1 is not among the NIST-recommended curves.
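A minimal ECDSA P-256 sign/verify round trip with the Web Cryptography API (illustrative parameters; the hash function is specified at signing time):

    const { publicKey, privateKey } = await crypto.subtle.generateKey(
      { name: "ECDSA", namedCurve: "P-256" }, false, ["sign", "verify"]);
    const message = new TextEncoder().encode("signed payload");
    const signature = await crypto.subtle.sign(
      { name: "ECDSA", hash: "SHA-256" }, privateKey, message);
    const valid = await crypto.subtle.verify(
      { name: "ECDSA", hash: "SHA-256" }, publicKey, signature, message);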

The digital signature algorithm EdDSA is defined in [RFC8032] and standardized in [FIPS-186-5]. It is gaining popularity as an alternative to ECDSA due to its software performance and security properties. EdDSA has two parametrizations: Ed25519, based on the twisted Edwards form of Curve25519, and Ed448, based on the Edwards form of Curve448. Both are widely used and recommended for their security and performance, and are defined in [RFC8032] and [RFC7748]. The difference between them is the security level and performance: Ed25519 is faster and more efficient (128 bits of security), while Ed448 offers a higher security level (224 bits of security).
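In runtimes that implement Ed25519 in the Web Cryptography API (recent browsers and Node.js; support should be feature-tested), a sketch looks as follows — note that, unlike ECDSA, no separate hash parameter is supplied:

    const keyPair = (await crypto.subtle.generateKey(
      "Ed25519", false, ["sign", "verify"])) as CryptoKeyPair;
    const msg = new TextEncoder().encode("signed payload");
    const sig = await crypto.subtle.sign("Ed25519", keyPair.privateKey, msg); // 64-byte signature
    const ok = await crypto.subtle.verify("Ed25519", keyPair.publicKey, sig, msg);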

Both the X25519 key exchange and the Ed25519 digital signature are built on Curve25519, but X25519 uses the curve in its Montgomery form, while Ed25519 uses an equivalent twisted Edwards form.

In applications where a digital signature is used in addition to encryption, it is a best practice to sign the message before encrypting it, rather than the other way around.

Note: Sign before encrypting, not vice versa.

In some applications, the same elliptic curve may be used for different purposes, such as key exchange and digital signatures. Reusing the same key pair for both is bad practice.

Note: (domain separation) Use different key pairs for key exchange and digital signature, even if they are based on the same elliptic curve.

In scenarios where the use of the RSA algorithm is preferred, RSASSA-PSS, standardized in [RFC8017], is recommended for new applications and is required to be supported. The use of RSASSA-PKCS1-v1_5, also standardized in [RFC8017], is discouraged for new applications, yet it is still included in some applications and documentation, e.g., the Web Cryptography API. No attacks are known against RSASSA-PKCS1-v1_5, but insecure implementations of it exist, and in the interest of increased robustness, RSASSA-PSS is recommended for adoption in new applications. RSASSA-PKCS1-v1_5 is included for compatibility with existing applications, and a gradual transition to RSASSA-PSS is encouraged by the IETF.
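An RSASSA-PSS sketch with the Web Cryptography API (illustrative; a saltLength of 32 bytes matches the SHA-256 digest size):

    const { publicKey, privateKey } = await crypto.subtle.generateKey(
      { name: "RSA-PSS", modulusLength: 3072,
        publicExponent: new Uint8Array([1, 0, 1]), hash: "SHA-256" },
      false, ["sign", "verify"]);
    const data = new TextEncoder().encode("document to sign");
    const sig = await crypto.subtle.sign({ name: "RSA-PSS", saltLength: 32 }, privateKey, data);
    const ok = await crypto.subtle.verify({ name: "RSA-PSS", saltLength: 32 }, publicKey, sig, data);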

7.5.1. Post-quantum digital signature algorithms

The post-quantum digital signature schemes standardized by NIST through the PQC competition process are ML-DSA and SLH-DSA. ML-DSA, standardized in [FIPS-204], is based on the CRYSTALS-Dilithium scheme and is a lattice-based signature scheme that offers a good balance between security and performance. It is based on the hardness of module lattice problems and is designed to be efficient in both software and hardware implementations. ML-DSA is one of the most widely adopted post-quantum signature schemes and is recommended for use in various applications.

SLH-DSA, standardized in [FIPS-205], is based on SPHINCS+ scheme. It is a stateless hash-based signature scheme that provides strong security guarantees against quantum attacks. However, SPHINCS+ has larger signature sizes and slower performance compared to lattice-based schemes like ML-DSA. As a result, it may not be as widely adopted for applications that require high-performance cryptographic operations.

Another selected post-quantum digital signature scheme is Falcon, a lattice-based signature scheme that offers a good balance between security and performance. It is based on the hardness of problems over NTRU lattices and is designed to be efficient in both software and hardware implementations. Falcon is also among the most widely adopted post-quantum signature schemes and is recommended for use in various applications. Due to some implementation issues, its standard is still in draft status.

7.6. Message authentication codes (MACs)

The main objective of a MAC is to provide data authenticity and integrity. A MAC is a short piece of information, typically a fixed-size string of bytes, that is generated using a secret key and a message. The MAC is appended to the message and sent to the recipient, who can then use the same secret key to generate a MAC for the received message and compare it to the received MAC. If the two MACs match, it indicates that the message has not been altered and comes from a legitimate sender.

The currently recommended MACs are HMAC, KMAC256, and Keyed BLAKE2.

One of the most widely used MACs is HMAC (Hash-based Message Authentication Code), which is standardized in [FIPS-198-1] and [RFC2104]. HMAC can be used with any underlying hash function, such as SHA-256 or SHA-512. It is widely used in various cryptographic protocols and applications, including TLS (Transport Layer Security) and IPsec (Internet Protocol Security), to ensure data authenticity and integrity.
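A minimal HMAC-SHA-256 example with the Web Cryptography API; the built-in verify() should be preferred over manually comparing tag bytes, which risks timing side channels:

    const key = await crypto.subtle.generateKey(
      { name: "HMAC", hash: "SHA-256" }, false, ["sign", "verify"]);
    const msg = new TextEncoder().encode("amount=42&currency=EUR");
    const tag = await crypto.subtle.sign("HMAC", key, msg); // 32-byte tag with SHA-256
    const authentic = await crypto.subtle.verify("HMAC", key, tag, msg);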

Another MAC standardized by NIST is KMAC (Keccak Message Authentication Code), which is defined in [NIST-SP-800-185]. KMAC is based on the Keccak permutation, which is also the basis for the SHA-3 family of hash functions. KMAC supports variable-length output and provides strong security guarantees. The most used variant is KMAC256.

Other MACs exist that are not standardized, such as Keyed BLAKE2b-256 and Keyed BLAKE2b-512. They are based on the BLAKE2 hash function, which is known for its high performance and security.

The length of the MAC tag is an important factor in determining the security of the MAC. Longer tags provide stronger security, as they increase the complexity of brute-force attacks (or birthday attacks); however, longer tags also require more computational resources for generation and verification. Therefore, it is essential to balance security and performance when selecting the tag length. In general, the recommended tag length for MACs is at least 128 bits to provide adequate security against brute-force attacks, while a 256-bit tag provides a higher level of security for applications that require stronger protection. It is also recommended to use a hash function with a digest at least 256 bits long to avoid collision attacks. For HMAC, the recommended tag length is at least 160 bits when using SHA-1 as the underlying hash function, and at least 256 bits when using SHA-256 or SHA-512. For KMAC256, the recommended tag length is at least 256 bits. For Keyed BLAKE2, the recommended tag length is at least 128 bits for Keyed BLAKE2b-256 and at least 256 bits for Keyed BLAKE2b-512. In general, MACs with a 64-bit tag are considered weak and not recommended for secure applications.

7.7. Key derivation functions (KDFs)

Key derivation functions (KDFs) are cryptographic algorithms that derive one or more uniformly distributed keys from a single source key that is not uniformly distributed. The source key is often referred to as the "master key" or "input keying material" (IKM), while the derived keys are called "output keying material" (OKM). The source key may be the result of a key exchange protocol or a hardware random number generator, and it must have sufficient entropy to ensure the security of the derived keys. The derived keys should be indistinguishable from random keys and should not reveal any information about the source key.

One of the most used KDFs is HKDF (HMAC-based Key Derivation Function), which is standardized in [RFC5869]. HKDF is based on the HMAC (Hash-based Message Authentication Code) construction and can be used with any underlying hash function, such as SHA-256 or SHA-512. HKDF is widely used in various cryptographic protocols and applications, including TLS 1.3, to derive session keys from a shared secret.

Another KDF standardized by NIST is KMAC (Keccak Message Authentication Code), which is defined in [NIST-SP-800-185]. KMAC is based on the Keccak permutation, which is also the basis for the SHA-3 family of hash functions. KMAC can be used to derive keys of variable lengths and provides strong security guarantees. The most used variant is KMAC256. BLAKE2 also supports a keyed mode, which can be used as a KDF; Keyed BLAKE2b-256 and Keyed BLAKE2b-512 are the most used variants.
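As an HKDF illustration with the Web Cryptography API (the IKM here is random only for the sake of the example; in practice it would come from a key exchange as in § 7.3, and the salt and info values are application-specific choices):

    const ikmBytes = crypto.getRandomValues(new Uint8Array(32)); // stand-in for a shared secret
    const ikm = await crypto.subtle.importKey("raw", ikmBytes, "HKDF", false, ["deriveBits"]);
    const salt = crypto.getRandomValues(new Uint8Array(32));
    const info = new TextEncoder().encode("example app v1: session keys"); // domain separation
    // Derive 512 bits of OKM, e.g., to split into an encryption key and a MAC key.
    const okm = await crypto.subtle.deriveBits(
      { name: "HKDF", hash: "SHA-256", salt, info }, ikm, 512);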

7.7.1. Password-based key derivation functions (PBKDFs)

Password-based key derivation functions (PBKDFs) are cryptographic algorithms that derive one or more keys from a password or passphrase, which typically has low entropy. They are designed to be computationally intensive and resistant to brute-force attacks, making it difficult for attackers to guess the password and derive the keys.

The most recommended PBKDF is Argon2id, described in [RFC9106], which is a memory-hard function that provides strong security guarantees against various types of attacks. Argon2id is designed to be efficient in both software and hardware implementations and is widely used in password hashing and key derivation applications.

PBKDF2 is standardized in [RFC8018]. It is based on the HMAC construction and can be used with any underlying hash function, such as SHA-256 or SHA-512. Its cost parameter is an iteration count, which should be set as high as the application can tolerate.
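A PBKDF2 sketch with the Web Cryptography API, deriving an AES key from a password; the iteration count of 600,000 is a commonly cited present-day figure for PBKDF2-HMAC-SHA-256 and is an assumption of this example, not a value from [RFC8018]:

    const password = new TextEncoder().encode("correct horse battery staple");
    const baseKey = await crypto.subtle.importKey("raw", password, "PBKDF2", false, ["deriveKey"]);
    const salt = crypto.getRandomValues(new Uint8Array(16)); // unique per password, stored with the record
    const aesKey = await crypto.subtle.deriveKey(
      { name: "PBKDF2", hash: "SHA-256", salt, iterations: 600_000 },
      baseKey, { name: "AES-GCM", length: 256 }, false, ["encrypt", "decrypt"]);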

8. Acknowledgment

References

Informative References

[A-HRC-53-42]
United Nations Human Rights Council. Human Rights and Technical Standard-Setting Processes for New and Emerging Digital Technologies. 2023. URL: https://undocs.org/en/A/HRC/53/42
[FIPS-180-4]
National Institute of Standards and Technology. Secure Hash Standard (SHS). URL: https://csrc.nist.gov/publications/detail/fips/180/4/final
[FIPS-186-5]
National Institute of Standards and Technology. Digital Signature Standard (DSS). URL: https://csrc.nist.gov/pubs/fips/186-5/final
[FIPS-197]
National Institute of Standards and Technology. Advanced Encryption Standard (AES). URL: https://csrc.nist.gov/publications/detail/fips/197/final
[FIPS-198-1]
National Institute of Standards and Technology. The Keyed-Hash Message Authentication Code (HMAC). URL: https://csrc.nist.gov/publications/detail/fips/198/1/final
[FIPS-202]
National Institute of Standards and Technology. SHA-3 Standard: Permutation-Based Hash and Extendable-Output Functions. URL: https://csrc.nist.gov/publications/detail/fips/202/final
[FIPS-203]
National Institute of Standards and Technology. Module-Lattice-Based Key-Encapsulation Mechanism Standard. August 2024. URL: https://csrc.nist.gov/pubs/fips/203/final
[FIPS-204]
National Institute of Standards and Technology. Module-Lattice-Based Digital Signature Standard. August 2024. URL: https://csrc.nist.gov/pubs/fips/204/final
[FIPS-205]
National Institute of Standards and Technology. Stateless Hash-Based Digital Signature Standard. August 2024. URL: https://csrc.nist.gov/pubs/fips/205/final
[IEEE-1619-2007]
Institute of Electrical and Electronics Engineers. IEEE Standard for Cryptographic Protection of Data on Block-Oriented Storage Devices. URL: https://standards.ieee.org/standard/1619-2007.html
[ISO-IEC-9797-1]
International Organization for Standardization. Information technology — Security techniques — Message Authentication Codes (MACs) — Part 1: Mechanisms using a block cipher. URL: https://www.iso.org/standard/50375.html
[NIST-SP-800-185]
National Institute of Standards and Technology. SHA-3 Derived Functions: cSHAKE, KMAC, TupleHash, and ParallelHash. URL: https://csrc.nist.gov/publications/detail/sp/800-185/final
[NIST-SP-800-38A]
National Institute of Standards and Technology. Recommendation for Block Cipher Modes of Operation: Methods and Techniques. URL: https://csrc.nist.gov/publications/detail/sp/800-38a/final
[NIST-SP-800-38D]
National Institute of Standards and Technology. Recommendation for Block Cipher Modes of Operation: Galois/Counter Mode (GCM) and GMAC. URL: https://csrc.nist.gov/publications/detail/sp/800-38d/final
[NIST-SP-800-38E]
National Institute of Standards and Technology. Recommendation for Block Cipher Modes of Operation: The XTS-AES Mode for Confidentiality on Storage Devices. URL: https://csrc.nist.gov/publications/detail/sp/800-38e/final
[NIST-SP-800-38F]
National Institute of Standards and Technology. Recommendation for Block Cipher Modes of Operation: Methods for Key Wrapping. URL: https://csrc.nist.gov/publications/detail/sp/800-38f/final
[NIST-SP-800-57]
National Institute of Standards and Technology. Recommendation for Key Management Part 1: General. URL: https://csrc.nist.gov/pubs/sp/800/57/pt1/r5/final
[RFC2104]
Hugo Krawczyk; Mihir Bellare; Ran Canetti. HMAC: Keyed-Hashing for Message Authentication. February 1997. URL: https://datatracker.ietf.org/doc/html/rfc2104
[RFC3394]
J. Schaad; R. Housley. Advanced Encryption Standard (AES) Key Wrap Algorithm. September 2002. URL: https://datatracker.ietf.org/doc/html/rfc3394
[RFC5116]
D. McGrew. An Interface and Algorithms for Authenticated Encryption. January 2008. URL: https://datatracker.ietf.org/doc/html/rfc5116
[RFC5280]
D. Cooper; et al. Internet X.509 Public Key Infrastructure Certificate and Certificate Revocation List (CRL) Profile. May 2008. URL: https://datatracker.ietf.org/doc/html/rfc5280
[RFC5288]
J. Salowey; A. Choudhury; D. McGrew. AES Galois Counter Mode (GCM) Cipher Suites for Transport Layer Security (TLS). August 2008. URL: https://datatracker.ietf.org/doc/html/rfc5288
[RFC5649]
R. Housley; M. Dworkin. AES Key Wrap with Padding Algorithm. August 2009. URL: https://datatracker.ietf.org/doc/html/rfc5649
[RFC5652]
R. Housley. Cryptographic Message Syntax (CMS). September 2009. URL: https://www.rfc-editor.org/rfc/rfc5652
[RFC5869]
H. Krawczyk; P. Eronen. HMAC-based Extract-and-Expand Key Derivation Function (HKDF). May 2010. URL: https://datatracker.ietf.org/doc/html/rfc5869
[RFC6090]
D. McGrew; K. Igoe; M. Salter. Fundamental Elliptic Curve Cryptography Algorithms. February 2011. URL: https://datatracker.ietf.org/doc/html/rfc6090
[RFC7748]
A. Langley; M. Hamburg; S. Turner. Elliptic Curves for Security. January 2016. URL: https://datatracker.ietf.org/doc/html/rfc7748
[RFC8017]
Moriarty, K., Ed.; et al. Public-Key Cryptography Standards (PKCS) #1: RSA Cryptography Specifications Version 2.2. November 2016. URL: https://datatracker.ietf.org/doc/html/rfc8017
[RFC8018]
K. Moriarty, Ed.; B. Kaliski; A. Rusch. PKCS #5: Password-Based Cryptography Specification Version 2.1. January 2017. URL: https://datatracker.ietf.org/doc/html/rfc8018
[RFC8032]
S. Josefsson; I. Liusvaara. Edwards-Curve Digital Signature Algorithm (EdDSA). January 2017. URL: https://datatracker.ietf.org/doc/html/rfc8032
[RFC8439]
Y. Nir; A. Langley. ChaCha20 and Poly1305 for IETF Protocols. June 2018. URL: https://datatracker.ietf.org/doc/html/rfc8439
[RFC9106]
A. Biryukov; et al. Argon2 Memory-Hard Function for Password Hashing and Proof-of-Work Applications. September 2021. URL: https://datatracker.ietf.org/doc/html/rfc9106