Cryptography provides a frictionless surface for digital trust in data, devices and users. Cryptography is all about the “magical gears” that convert plaintext to ciphertext and vice versa, powered by mathematics: initialization vectors, counters, message blocks, key sizes, key algorithms, cipher block chaining, output feedback, XOR operations, group theory, the Euler phi (totient) function and one-way hash functions. As the famous mathematician and physicist Gauss wrote: "Mathematics is the queen of the sciences, and number theory is the queen of mathematics".
Brownfield devices, and the majority of greenfield devices, in IoT and operational technology are not manufactured to be cyber resilient and are therefore untrustworthy platforms in cyberspace. Where does one begin the digital transformation journey to build digital trust on top of an untrusted platform?
There are significantly more IoT devices than people on this planet today (forecast to reach a 3:1 ratio by 2030). IoT devices require standards-based key and certificate management to enable the use of cryptographic artifacts for lights-out, safe and secure field operations. Use of cryptography is at the core of application and data security by design. Yet cryptography is also the Achilles heel of cybersecurity, because it does not discriminate: sophisticated attackers wield the same mathematics (think ransomware and bricked devices in critical infrastructure). So how do you play God?
Digital certificates may be used to accomplish a variety of purposes, from authentication to data integrity and confidentiality.
Cryptographic keys come in two forms: asymmetric and symmetric.
Asymmetric public-private key pairs are used for key encapsulation in key distribution and key exchange ceremonies. A key exchange algorithm is symmetric for the initiator and the responder (i.e., both perform the same set of operations), whereas a key encapsulation algorithm requires a different set of operations of the initiator and the responder.
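The symmetry of a key exchange can be seen in a toy Diffie-Hellman sketch (Python standard library only; the group parameters here are illustrative and deliberately simple, not production choices):

```python
import secrets

# Toy Diffie-Hellman key exchange. The parameters below are for
# illustration only -- real deployments use vetted finite-field groups
# (e.g., RFC 7919) or elliptic curves such as X25519.
p = 2**127 - 1   # a Mersenne prime; fine for a demo, NOT for production
g = 3

# Initiator and responder each pick a private exponent,
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1

# exchange public values,
A = pow(g, a, p)
B = pow(g, b, p)

# and derive the same shared secret by performing the same operation --
# this is the sense in which key exchange is symmetric to both parties.
shared_initiator = pow(B, a, p)
shared_responder = pow(A, b, p)
```

A key encapsulation mechanism, by contrast, would have the initiator encapsulate a fresh secret under the responder's public key, while the responder decapsulates it with the private key: two different operations.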
Data in transit may be protected for confidentiality using ephemeral symmetric keys (e.g., AES). A symmetric key (serving as a shared secret) is also used for data integrity verification (e.g., HMAC) and for protecting data at rest (DAR). Keys may be long-lived or short-lived, trusted (issued through a third-party broker) or trustless (no brokers).
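A minimal sketch of symmetric-key integrity verification with HMAC, using only the Python standard library (the key stands in for a shared secret established out of band or via a key exchange):

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)          # 256-bit shared secret
message = b"firmware-image-v1.2.3"

# Sender computes a tag over the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

# Verifier recomputes the tag and compares in constant time.
authentic = hmac.compare_digest(
    tag, hmac.new(key, message, hashlib.sha256).digest())
tampered = hmac.compare_digest(
    tag, hmac.new(key, b"firmware-image-v9.9.9", hashlib.sha256).digest())
```

The constant-time comparison (`hmac.compare_digest`) matters: a naive `==` can leak timing information that helps an attacker forge tags byte by byte.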
The type and strength of key to use for a specific purpose requires weighing the pros and cons associated with key management and post-quantum threats on the horizon.
Devices speak silently. Device analytics therefore requires credible device intelligence and risk models. Devices designed for long lifetimes in the field require on-board instrumentation for continuous device platform monitoring, data function telemetry, remote management, maintenance and recovery (like satellites and the Voyager spacecraft in outer space). Data-driven artificial intelligence, machine learning and deep learning rely on trustworthy data for timely and cost-effective risk mitigation. Digital signing of device intelligence and telemetry is critical for scalable and trustworthy analytics in the cloud. Qualitative and quantitative metrics require policy-based “clean” data mining to avert a tsunami of big data and the economic impact of “dirty” data.
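A sketch of authenticating a telemetry record before it leaves the device. A symmetric MAC is used here only because the Python standard library has no asymmetric signing; real fleets typically use per-device asymmetric keys (e.g., ECDSA) so the cloud can verify without holding device secrets. All names below are illustrative:

```python
import hashlib
import hmac
import json
import secrets

device_key = secrets.token_bytes(32)   # hypothetical per-device key

# Canonicalize the record so device and cloud MAC identical bytes.
record = {"device_id": "sensor-042", "temp_c": 21.5, "ts": 1700000000}
payload = json.dumps(record, sort_keys=True, separators=(",", ":")).encode()
envelope = {
    "payload": record,
    "mac": hmac.new(device_key, payload, hashlib.sha256).hexdigest(),
}

def verify(envelope: dict, key: bytes) -> bool:
    """Recompute the MAC over the canonicalized payload and compare."""
    data = json.dumps(envelope["payload"], sort_keys=True,
                      separators=(",", ":")).encode()
    expected = hmac.new(key, data, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["mac"])
```

The canonical JSON encoding (sorted keys, fixed separators) is the easy-to-miss detail: without it, semantically identical records can serialize differently and fail verification.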
For users, OpenID Connect (OIDC) and SAML are used for authentication, while the OAuth protocol is (independently) used for access authorization. HTTP basic authentication is based on a username and password. The XML-based SAML enabled single sign-on (SSO) for enterprise applications and services. Modern web applications benefit from the OAuth protocol, background HTTP API calls (by single-page applications), and JSON Web Tokens (JWT).
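A JWT is just two base64url-encoded JSON segments plus a signature over them. A minimal HS256 sketch with the standard library (a toy: real services should use a maintained JWT library and validate claims such as `exp` and `aud`):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{payload}".encode(),
                               hashlib.sha256).digest())
    return hmac.compare_digest(expected, sig)

token = make_jwt({"sub": "alice", "scope": "read"}, b"demo-secret")
```

The symmetric HS256 variant shown here requires both issuer and verifier to share the secret; OIDC providers typically sign with an asymmetric algorithm (e.g., RS256) so any party can verify with the published public key.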
For (headless) devices, keys are the keys to the kingdom (no pun intended) and certificates are the statement of authoritative identity for authentication and authorization. The tools and methods (protocols) for provisioning keys and certificates for device and platform attestation at the factory by the device manufacturer and provisioning operational keys and certificates for line of business applications by the device owners may vary significantly (for a good number of reasons).
Key management is at the epicenter of cryptographic operations. The stages in the lifecycle of a key include the generation, distribution, usage and renewal (or rotation) of the key. Certificates merely associate an identity (i.e., a subject name) with a public key; authoritative ownership is established through proof of possession of the associated private key. Identity may be established through other methods (e.g., via tokens) besides requiring the use of certificates. Key management poses the real scalability and maintenance challenge at endpoints (user and user-less devices) in both consumer and enterprise environments. PKI buildout requires complex and expensive investments in certificate authorities, hardware security modules and secure elements. The economics and agility of key management must therefore be at the forefront of technology selection.
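The lifecycle stages above (generate, use, rotate, retire) can be sketched as a minimal in-memory key ring. The class and method names are illustrative, not a real KMS API; a production system would back this with an HSM or secure element:

```python
import secrets
from dataclasses import dataclass

@dataclass
class ManagedKey:
    key_id: int
    material: bytes
    state: str = "active"      # lifecycle: active -> retired

class KeyRing:
    """Toy key lifecycle manager: generation, usage lookup, rotation."""

    def __init__(self):
        self._keys: dict[int, ManagedKey] = {}
        self._next_id = 1

    def generate(self) -> int:
        kid = self._next_id
        self._keys[kid] = ManagedKey(kid, secrets.token_bytes(32))
        self._next_id += 1
        return kid

    def active_key(self, kid: int) -> bytes:
        key = self._keys[kid]
        assert key.state == "active", "retired keys must not encrypt new data"
        return key.material

    def rotate(self, kid: int) -> int:
        # Retire the old key but keep it so existing ciphertext
        # can still be decrypted during re-encryption.
        self._keys[kid].state = "retired"
        return self.generate()
```

The design choice worth noting is that rotation retires rather than deletes: data encrypted under the old key must remain readable until it is re-encrypted under the new one.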
In 1994, Peter Shor developed a quantum algorithm to find the prime factors of a semiprime (a number that is the product of two prime numbers). Using such an approach, future quantum attacks could break public key cryptography algorithms (e.g., RSA, DH, ECDH); a 2048-bit RSA key provides inadequate security against quantum attacks. A recent study by MIT showed that a 2048-bit RSA key could potentially be cracked by a powerful quantum computer in eight hours. In 1996, Lov Kumar Grover developed a quantum algorithm that finds, with high probability, the unique input to a black-box function that produces a particular output value. Because Grover's quadratic speedup effectively halves the strength of a brute-force key search, symmetric key lengths must be increased to protect against future quantum computing attacks: AES-128 provides inadequate security, while AES-192 and AES-256 are considered safe for a very long time, until quantum computers become affordable. In 2022, NIST approved lattice-based cryptography for IoT in a quantum world. However, the impact of increased key sizes, stack usage and execution cycles on resource-constrained devices requires further analysis. Vulnerability to side-channel attacks is yet another area that requires technical assessment.
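The rule-of-thumb arithmetic behind the symmetric-key guidance is simple enough to state in a few lines: Grover's algorithm searches 2^n keys in roughly 2^(n/2) quantum steps, halving effective strength.

```python
# Rule-of-thumb effective security under Grover's quadratic speedup:
# a brute-force search over 2**n keys takes about 2**(n/2) quantum
# steps, so a symmetric key's effective strength is roughly halved.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for bits in (128, 192, 256):
    print(f"AES-{bits}: ~{grover_effective_bits(bits)}-bit "
          f"post-quantum security")
```

This is why AES-128 (about 64 effective bits) is considered inadequate against a quantum adversary, while AES-256 retains roughly 128 bits, which is the classical benchmark for long-term security.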