Max Hillebrand

The Praxeology of Privacy ~ Chapter 13: Cryptographic Foundations


Cryptography provides mathematical privacy foundations: encryption, hashing, and digital signatures enable trustless verification. Implementation bugs and human error remain the weakest links.

#The Praxeology of Privacy

"We can build systems that permit anonymous communication without trusted intermediaries."

Eric Hughes^1^

Introduction

The goal of this chapter is conceptual understanding, not a comprehensive cryptography course. Readers need to understand what cryptographic tools accomplish and why they work, not how to implement them. Implementation requires specialized expertise; using implementations requires understanding their properties.

Cryptography solves coordination problems through mathematics, not institutions. Where traditional systems require trusting intermediaries, cryptographic systems require trusting only computational hardness assumptions. This shift from institutional to mathematical trust is the foundation for privacy-preserving technology.

13.1 Symmetric and Asymmetric Cryptography

Symmetric Cryptography

Symmetric cryptography uses the same key for encryption and decryption. Alice encrypts a message with a secret key; Bob decrypts with the same key. If only Alice and Bob possess the key, only they can read the message.

Symmetric encryption is fast and efficient. Modern symmetric algorithms (AES, ChaCha20) can encrypt data at gigabytes per second. For bulk data encryption, symmetric cryptography remains the practical choice.
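The same-key property can be sketched in a few lines of Python. This is a toy construction (a SHA-256-derived keystream XORed with the message), not a real cipher like AES or ChaCha20; it only illustrates that under symmetric cryptography, encryption and decryption are one operation with one shared key.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key || counter (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encryption and decryption are the same operation: XOR with the keystream."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"a shared secret known only to Alice and Bob"
ciphertext = xor_cipher(key, b"attack at dawn")
plaintext = xor_cipher(key, ciphertext)   # same key, same function, reverses it
```

Applying the cipher twice with the same key recovers the original message, which is exactly the symmetry the scheme's name refers to.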

The problem is key distribution. How do Alice and Bob establish a shared secret key without meeting in person? If they communicate the key over an insecure channel, an eavesdropper intercepts it. If they need a secure channel to exchange the key, they already have secure communication and do not need the key.

For millennia, this chicken-and-egg problem limited cryptography to parties who could physically exchange keys: diplomats with couriers, militaries with secure channels, spies with dead drops. Mass adoption of encryption required solving key distribution.

Asymmetric Cryptography

Asymmetric (public-key) cryptography, discovered in the 1970s, solved key distribution.^2^ Instead of one shared key, each party generates a mathematically related key pair: a public key they can share openly and a private key they keep secret.

The key properties distinguish asymmetric from symmetric schemes. Anyone can encrypt a message using the recipient's public key, but only the recipient's private key can decrypt it. The mathematical relationship between keys is non-reversible: computing the public key from the private key is straightforward, but computing the private key from the public key is computationally infeasible. This means knowing the public key does not reveal the private key. Most significantly, no prior relationship is required; Alice can send Bob an encrypted message having never communicated with him before, using only his publicly available public key.

This solves key distribution. Alice publishes her public key. Anyone can encrypt messages to Alice. Only Alice can decrypt them. No secure channel is needed to establish the relationship.

What Each Approach Solves

Symmetric cryptography solves the problem of efficient bulk encryption when parties already share a secret. Asymmetric cryptography solves the problem of establishing secure communication without prior shared secrets.

In practice, systems use both. Asymmetric cryptography establishes a session key; symmetric cryptography encrypts the actual data. This hybrid approach combines asymmetric's key distribution solution with symmetric's efficiency.

The Algorithms

Several foundational algorithms enable asymmetric cryptography. Diffie-Hellman key exchange, published in 1976, allows two parties to establish a shared secret over a public channel.^3^ Neither party reveals their private key, but both derive the same shared secret through mathematical operations on public values. Diffie-Hellman solves key exchange but not encryption directly; the shared secret it produces typically becomes the key for symmetric encryption.
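The exchange can be sketched with Python's built-in modular exponentiation. The prime and generator below are toy parameters chosen for readability, not the vetted groups real deployments use; the final line shows the hybrid step described above, hashing the shared secret into a symmetric session key.

```python
import hashlib
import secrets

# Toy Diffie-Hellman: both parties derive the same secret from public values.
# Parameters are illustrative only; real systems use vetted 2048-bit-plus
# groups or the elliptic curve variant (ECDH).
p = 2**127 - 1   # a Mersenne prime, far too small for real security
g = 5

a = secrets.randbelow(p - 2) + 2    # Alice's private value, never transmitted
b = secrets.randbelow(p - 2) + 2    # Bob's private value, never transmitted
A = pow(g, a, p)                    # Alice's public value, sent in the clear
B = pow(g, b, p)                    # Bob's public value, sent in the clear

alice_secret = pow(B, a, p)         # (g^b)^a mod p
bob_secret = pow(A, b, p)           # (g^a)^b mod p
assert alice_secret == bob_secret   # identical shared secret, derived independently

# Hybrid step: hash the shared secret into a symmetric session key.
session_key = hashlib.sha256(alice_secret.to_bytes(16, "big")).digest()
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which is what the scheme's security rests on.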

RSA, published in 1977, provides both encryption and digital signatures using the difficulty of factoring large prime numbers.^4^ Security depends on factorization remaining computationally infeasible for sufficiently large numbers. RSA can encrypt messages directly (up to a size limit) and create signatures. Its disadvantage is key size: secure RSA requires keys of 2048 bits or more, making it slower and more resource-intensive than alternatives.
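Textbook RSA fits in a few lines. The primes below are classic classroom values and are hopelessly small; real keys use primes of over a thousand bits each, and real RSA encryption also requires a padding scheme (such as OAEP), omitted here.

```python
# Toy RSA key generation and encryption with textbook-sized numbers.
p, q = 61, 53
n = p * q                      # 3233, the public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e

message = 65                   # messages must be smaller than n
ciphertext = pow(message, e, n)        # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)      # decrypt with the private key (d, n)
assert recovered == message
```

Security here rests on the fact that an attacker who sees only n and e must factor n into p and q to compute d, which is infeasible at real key sizes.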

Elliptic Curve Cryptography (ECC), proposed independently in the mid-1980s, achieves equivalent security with smaller keys using different mathematical structures.^5^ A 256-bit elliptic curve key provides security comparable to a 3072-bit RSA key. The smaller keys make ECC faster and more suitable for constrained devices. Bitcoin uses the secp256k1 elliptic curve for its signatures. Most modern systems prefer ECC over RSA for new implementations.

In practice, these algorithms serve complementary roles. Diffie-Hellman (or its elliptic curve variant ECDH) establishes shared secrets; RSA or elliptic curve signatures authenticate parties; and the resulting shared secrets key symmetric ciphers like AES for bulk encryption.

The Role of Randomness

Cryptographic security depends on unpredictability. Keys must be randomly generated; if an attacker can guess or predict a key, the strongest algorithm provides no protection.

Entropy measures unpredictability. A 256-bit key has 256 bits of entropy only if each bit is equally likely to be 0 or 1, independent of all other bits. If the key generation process has bias or patterns, the effective entropy is lower than the bit length suggests, and the key is weaker than it appears.

Randomness in cryptography must be cryptographically secure: not merely "random looking" but unpredictable to any adversary. A pseudorandom number generator (PRNG) is an algorithm that uses a small initial value called a seed to produce a long sequence of numbers that appear random but are actually deterministic. PRNGs that produce statistically random output may still be predictable if an attacker knows the internal state or seed value. Cryptographically secure pseudorandom number generators (CSPRNGs) are designed so that even observing their output does not reveal future values.

Sources of entropy include hardware random number generators that sample physical phenomena (thermal noise, radioactive decay, electronic noise) and system events (keystroke timing, mouse movements, network packet arrival times). Since any single source might be compromised or insufficient, secure systems concatenate multiple independent entropy sources. An attacker who can predict one source still cannot predict the combined output if other sources remain unpredictable. Operating systems maintain entropy pools that accumulate randomness from all available sources and feed CSPRNGs that applications use for key generation.
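Python's standard library makes the distinction concrete: the `secrets` module draws from the operating system's entropy pool, while the general-purpose `random` module is a deterministic PRNG whose entire output is fixed by its seed.

```python
import random
import secrets

# A CSPRNG request: the OS entropy pool backs this, suitable for key material.
key = secrets.token_bytes(32)          # 256 bits of cryptographic randomness
assert len(key) == 32

# A plain PRNG is deterministic: the same seed replays the same "random"
# stream, so an attacker who learns the seed can reproduce every value it
# ever generated. Never use this for keys.
rng1 = random.Random(1234)
rng2 = random.Random(1234)
assert rng1.getrandbits(256) == rng2.getrandbits(256)
```

The two identical 256-bit outputs from identically seeded generators are exactly the failure mode described above: statistically random-looking, yet fully predictable.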

When randomness fails, cryptography fails completely. The algorithms may be sound, but predictable keys are guessable keys.

13.2 Hash Functions and Digital Signatures

One-Way Functions

A cryptographic hash function takes input of any size and produces a fixed-size output (the "hash" or "digest"). SHA-256, widely used in Bitcoin and elsewhere, produces a 256-bit output regardless of input size.^6^

Hash functions exhibit several key properties. They are deterministic: the same input always produces the same output. They are one-way: given the output, finding any input that produces it is computationally infeasible. They are collision-resistant: finding two different inputs that produce the same output is computationally infeasible. They exhibit the avalanche effect: small changes in input produce dramatically different outputs.

Hash functions enable efficient integrity verification. Instead of comparing entire files, compare their hashes. If hashes match, files match (with overwhelming probability). If hashes differ, files differ.
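These properties are easy to observe with SHA-256 from Python's standard library:

```python
import hashlib

digest = hashlib.sha256(b"attack at dawn").hexdigest()

# Deterministic: the same input always yields the same digest.
assert hashlib.sha256(b"attack at dawn").hexdigest() == digest

# Avalanche effect: changing one word produces an unrelated digest.
digest2 = hashlib.sha256(b"attack at dusk").hexdigest()
assert digest2 != digest

# Fixed-size output: even a megabyte of input hashes to 256 bits (32 bytes).
assert len(hashlib.sha256(b"x" * 1_000_000).digest()) == 32
```

Comparing the two 64-character digests side by side shows how thoroughly a small input change scrambles the output, which is what makes hash comparison a reliable integrity check.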

Digital Signatures

Digital signatures use asymmetric cryptography to provide authentication and integrity.^7^ Unlike encryption (where anyone with the public key encrypts and only the private key holder decrypts), signatures work in the opposite direction: only the private key holder can create a signature, but anyone with the public key can verify it.

The signing process begins by computing the hash of the document, creating a fixed-size digest of the content. The signer then applies the signature algorithm using their private key and the hash, producing a signature value that accompanies the document.

Verification reverses this process. The verifier independently hashes the document, then applies the verification algorithm using the public key, the signature, and the recomputed hash. The algorithm outputs valid or invalid.

The mathematics varies by scheme. RSA signatures involve modular exponentiation. ECDSA (used in Bitcoin) involves elliptic curve point multiplication and modular arithmetic. Schnorr signatures use a different construction with useful algebraic properties. What they share is the core asymmetry: creating a valid signature requires the private key; verifying requires only the public key.

Signatures prove three things. Authentication: only someone with the private key could have created the signature, so if you trust the public key belongs to Alice, the signature proves Alice signed. Integrity: any modification to the document after signing invalidates the signature because the recomputed hash will not match. Non-repudiation: Alice cannot credibly deny having signed if the signature validates against her public key.
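A toy sign-and-verify cycle, using textbook RSA with classroom-sized primes (and the hash reduced modulo the tiny modulus), shows the core asymmetry; nothing about these parameters is secure, and real signature schemes add padding and much larger keys.

```python
import hashlib

# Toy RSA signature: signing applies the private key to the message hash,
# verification applies the public key and compares. Illustrative only.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)

def sign(document: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(document).digest(), "big") % n
    return pow(h, d, n)                      # requires the private key d

def verify(document: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(document).digest(), "big") % n
    return pow(signature, e, n) == h         # anyone with (e, n) can check

doc = b"I, Alice, agree to the terms."
sig = sign(doc)
assert verify(doc, sig)                                     # authentic and intact
assert not verify(b"I, Alice, agree to worse terms.", sig)  # tampering detected
```

The failed verification on the altered document is the integrity property in action: the recomputed hash no longer matches what was signed.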

Trustless Verification

Digital signatures enable verification without trusting the verifier. Anyone with the public key can independently verify. No authority needs to confirm. No intermediary can falsely claim verification.

This is trustless in a specific sense: you need not trust the verification process because you can do it yourself. You still must trust that the public key belongs to whom it claims to belong, but that is a different problem (addressed below).

13.3 Trust: Mathematical vs. Institutional

Traditional Trust Models

Before cryptography, trust required institutions. Reputation allowed parties to build track records over time, though new entrants faced high barriers. Legal enforcement punished breach of agreements, but effectiveness depended on jurisdiction and resources. Trusted third parties served as intermediaries who vouched for unknown parties, concentrating trust in those intermediaries. Physical security through vaults, guards, and sealed documents provided tangible protection.

Each model has failure modes. Reputation can be manufactured. Enforcement requires access to legal systems. Intermediaries can be corrupted or coerced. Physical security can be breached.

Mathematical Trust

Cryptographic trust rests on computational hardness assumptions. The factorization assumption holds that factoring the product of two large primes is computationally infeasible. The discrete logarithm assumption holds that computing discrete logarithms in certain groups is computationally infeasible. Hash function assumptions hold that finding collisions or preimages for properly designed hash functions is computationally infeasible. These assumptions have been studied for decades by mathematicians and cryptographers worldwide. Unlike institutional trust, they do not vary with personnel changes, political pressures, or economic incentives. Mathematics does not accept bribes.

Why Mathematics Is More Reliable

Mathematical trust has properties institutional trust lacks. It offers consistency: the same proof verifies the same way everywhere, and a valid signature in one country is valid in all countries. It offers transparency: the algorithms are public, anyone can verify the mathematics, and security does not depend on secrecy of method. It offers independence: verification requires no third party, and Alice can verify Bob's signature without asking anyone's permission or trusting any intermediary. It offers scalability: computational verification scales with hardware, while human verification does not.

Limits of Mathematical Trust

Mathematical trust is not unlimited. Mathematics cannot tell you whether a public key belongs to whom it claims; that requires some external verification such as meeting in person, a web of trust, or certificate authorities. The mathematics may be sound while the implementation is flawed, and software bugs can undermine theoretically perfect cryptography. Computational assumptions themselves could fail if P=NP or if quantum computers mature sufficiently. Users can be tricked into revealing keys, using compromised software, or trusting wrong public keys.

Mathematical trust replaces some trust requirements but not all. It shifts trust from institutions to assumptions, from humans to mathematics, from reputation to verification. The shift is valuable but not absolute.

13.4 Limitations and Vulnerabilities

Implementation Bugs vs. Cryptographic Breaks

Cryptographic algorithms are rarely broken mathematically. What fails is implementation.

Buffer overflows allow attackers to overwrite memory, potentially extracting keys. Timing attacks measure how long operations take to reveal information about keys. Random number failures compromise security because cryptography requires unpredictable randomness. Protocol errors occur when individual algorithms are secure but their combination is not. Most real-world cryptographic failures are implementation failures. The mathematics holds; the code does not.
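The timing-attack category is concrete enough to sketch: a naive equality check leaks information by returning early at the first mismatched byte, so response time reveals how long a correct prefix an attacker has guessed. This is why standard libraries ship constant-time comparison, such as Python's `hmac.compare_digest`.

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    """Leaky comparison: exits at the first differing byte, so its running
    time depends on how much of the secret matches."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False   # early exit is the timing side channel
    return True

# The standard-library fix examines every byte regardless of mismatches.
secret_mac = b"\x13\x37" * 16
assert hmac.compare_digest(secret_mac, b"\x13\x37" * 16)
assert not hmac.compare_digest(secret_mac, b"\x00" * 32)
```

Both functions return the same answers; the difference is entirely in how long they take to return them, which is precisely the point.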

Side-Channel Attacks

Side-channel attacks extract information from physical implementation, not mathematical weakness. Power analysis measures power consumption during cryptographic operations to reveal key bits. Electromagnetic emanations from computing equipment can leak information via radio signals. Cache timing attacks observe cache behavior to reveal memory access patterns correlated with keys. Acoustic attacks analyze sound produced by computers to leak cryptographic information. These attacks require physical proximity or sophisticated equipment but demonstrate that cryptographic security depends on more than algorithm strength.

The Human Element

Humans are the weakest link. Social engineering that convinces people to reveal keys or install malware bypasses cryptography entirely. Encryption protected by weak passwords provides weak protection. Key management presents persistent challenges: lost keys mean lost data, and compromised keys mean compromised data. Systems that are hard to use correctly are used incorrectly, and users disable security features that interfere with tasks.

Physical coercion, the "$5 wrench attack" examined in Chapter 5, remains outside cryptography's domain.^8^ Cryptography protects data, not people.

Quantum Computing Threats

Quantum computers threaten current public-key cryptography. Shor's algorithm, running on a sufficiently powerful quantum computer, could break RSA and elliptic curve cryptography by efficiently solving the mathematical problems they rely on.^9^

The current status varies by cryptographic type. Asymmetric cryptography (RSA and ECC) is vulnerable to quantum attack; current quantum computers are not powerful enough, but "harvest now, decrypt later" is a rational strategy for patient adversaries. Symmetric cryptography is less affected; Grover's algorithm provides only quadratic speedup, so doubling key lengths (e.g., AES-256 instead of AES-128) maintains security. Hash functions are similarly less affected; quantum computers provide modest speedup but do not fundamentally break them.
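The arithmetic behind the symmetric-key advice is simple: Grover's quadratic speedup searches an n-bit keyspace in roughly 2^(n/2) steps instead of 2^n, so effective security is the key length halved, and doubling the key length restores it.

```python
# Grover's algorithm halves a symmetric key's effective security in bits.
def quantum_effective_bits(key_bits: int) -> int:
    return key_bits // 2

assert quantum_effective_bits(128) == 64    # AES-128: marginal against quantum search
assert quantum_effective_bits(256) == 128   # AES-256: doubling restores 128-bit security
```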

Post-quantum cryptography offers a path forward: new algorithms based on different mathematical problems (lattices, hash-based signatures, codes, multivariate equations) are under development and standardization.^10^ Hash-based signatures (such as XMSS and LMS) are particularly notable because their security relies only on the properties of hash functions, which are well understood and less affected by quantum computing. However, hash-based signatures are typically stateful: they require careful tracking of which one-time keys have been used, and reusing a key is catastrophic. This state management requirement introduces operational complexity that may limit their applicability.

The transition to post-quantum cryptography is a major infrastructure project but is technically feasible.

What Cryptography Cannot Solve

Cryptography cannot solve endpoint security; if the device is compromised, cryptography on that device is meaningless. It cannot hide metadata; encryption hides content but not the fact of communication, and who talks to whom, when, and how often remains visible without additional protection (see Chapter 14). It cannot prevent coercion, as physical force can compel key disclosure. It cannot solve social problems or make people trustworthy, only make certain betrayals detectable. It cannot establish key authenticity, as mathematics cannot tell you if the public key belongs to its purported owner.

Cryptography is a tool. It solves specific problems. Expecting it to solve problems beyond its scope leads to false confidence.

Chapter Summary

Cryptography provides mathematical foundations for privacy technology. Symmetric cryptography enables efficient encryption when keys are shared; asymmetric cryptography solves key distribution by allowing secure communication without prior shared secrets. In practice, hybrid systems use both.

Hash functions create fixed-size fingerprints of data, enabling integrity verification. Digital signatures combine hashing with asymmetric cryptography to provide authentication, integrity, and non-repudiation. Signatures enable trustless verification: anyone can verify independently without relying on intermediaries.

Cryptographic trust differs from institutional trust. Mathematical properties are consistent, transparent, independent, and scalable where institutional trust varies with personnel, politics, and incentives. But mathematical trust has limits: key authenticity must be established through other means, implementations can be flawed, computational assumptions could fail, and humans remain the weakest link.

Vulnerabilities include implementation bugs (more common than cryptographic breaks), side-channel attacks exploiting physical implementation, human error and social engineering, and quantum computing threats to current asymmetric cryptography. The transition to post-quantum algorithms is underway.

Cryptography solves specific problems: confidentiality of content, authentication of origin, integrity of data. It cannot solve endpoint compromise, metadata exposure, physical coercion, or key authenticity. Understanding both capabilities and limitations is essential for effective privacy protection.


Footnotes

^1^ Eric Hughes, "A Cypherpunk's Manifesto" (1993), available at https://www.activism.net/cypherpunk/manifesto.html.

^2^ Whitfield Diffie and Martin E. Hellman, "New Directions in Cryptography," IEEE Transactions on Information Theory 22, no. 6 (1976): 644-654.

^3^ Diffie and Hellman, "New Directions in Cryptography."

^4^ Ronald L. Rivest, Adi Shamir, and Leonard M. Adleman, "A Method for Obtaining Digital Signatures and Public-Key Cryptosystems," Communications of the ACM 21, no. 2 (1978): 120-126.

^5^ Neal Koblitz, "Elliptic Curve Cryptosystems," Mathematics of Computation 48, no. 177 (1987): 203-209; Victor S. Miller, "Use of Elliptic Curves in Cryptography," Advances in Cryptology - CRYPTO '85 (1985): 417-426.

^6^ National Institute of Standards and Technology, Secure Hash Standard (SHS), FIPS PUB 180-4 (Gaithersburg, MD: NIST, 2015).

^7^ For digital signature foundations, see Shafi Goldwasser, Silvio Micali, and Ronald L. Rivest, "A Digital Signature Scheme Secure Against Adaptive Chosen-Message Attacks," SIAM Journal on Computing 17, no. 2 (1988): 281-308.

^8^ The "$5 wrench attack" refers to the observation that physical coercion is often easier than cryptanalysis. See XKCD comic 538, "Security."

^9^ Peter W. Shor, "Algorithms for Quantum Computation: Discrete Logarithms and Factoring," Proceedings 35th Annual Symposium on Foundations of Computer Science (1994): 124-134.

^10^ National Institute of Standards and Technology, "Post-Quantum Cryptography Standardization," ongoing project, https://csrc.nist.gov/projects/post-quantum-cryptography.


Previous chapter: nostr:naddr1qqgxycfjvyck2vrpxvunycfcxsuk2q3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65welgguq

Next Chapter: nostr:naddr1qqgrydphv9nrgcfcxg6ryvrzx56kzq3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65wsc2nyl