Max Hillebrand

The Praxeology of Privacy ~ Chapter 12: The Crypto Wars

The Crypto Wars pit states against privacy technology. Mathematics ignores legislation. Developers face prosecution. The fundamental conflict is permanent and intensifying.

"If privacy is outlawed, only outlaws will have privacy."

Phil Zimmermann^1^

Introduction

The Crypto Wars are the ongoing conflict between states seeking surveillance capability and individuals developing privacy technology. The conflict began when strong cryptography moved from classified military research to civilian availability. States that had monopolized unbreakable encryption suddenly faced citizens with the same capability. The response was predictable: attempts to control, restrict, and backdoor cryptographic technology.

These attempts largely failed, for reasons economic analysis explains. But failure was not total, and the conflict continues.

12.1 History: Export Controls to Clipper Chip

Cryptography as Munitions

Until the late 1990s, the United States classified strong cryptographic software as munitions under the International Traffic in Arms Regulations (ITAR).^2^ Sharing encryption algorithms with foreign nationals required the same export licenses as shipping missiles. Academic researchers who published cryptographic papers faced potential prosecution for arms trafficking.

The classification reflected Cold War assumptions: cryptography was military technology, and maintaining cryptographic superiority over adversaries justified restricting civilian access. That civilians might have legitimate privacy needs, independent of military considerations, did not factor into the regulatory framework.

The absurdity became apparent as computing proliferated. Mathematical formulas available in university libraries required munitions licenses for email distribution. The same algorithm was legal to discuss in a conference talk but illegal to send as a text file. Researchers could legally publish papers that anyone could implement but could not legally distribute working implementations.

Phil Zimmermann and PGP

Phil Zimmermann's Pretty Good Privacy (PGP) crystallized the conflict.^3^ In 1991, Zimmermann released PGP as free software, providing strong public-key cryptography to ordinary users. PGP spread rapidly via the early internet, soon reaching users outside the United States.

The federal government opened a criminal investigation. For three years, Zimmermann faced potential prosecution for arms trafficking. The case became a cause célèbre in the nascent internet community. Zimmermann's response was characteristically cypherpunk: he published the PGP source code as a printed book, which enjoyed First Amendment protection that software files did not.^4^

The case was eventually dropped without charges, but it established the template for Crypto Wars conflicts: the government asserts control authority; technologists route around restrictions; the restrictions prove unenforceable; eventually formal policy catches up with technical reality.

The Clipper Chip

In 1993, the Clinton administration announced the Clipper Chip: an NSA-designed cryptographic chipset with government-mandated key escrow.^5^ Users would have encryption, but government agencies would hold duplicate keys enabling decryption when legally authorized.

The proposal combined encryption with surveillance. Proponents argued this balanced privacy against law enforcement needs. Opponents identified fundamental problems.

Escrowed keys create a single point of failure; if the escrow database were compromised, all protected communications would be exposed simultaneously. The scheme also required trusting government agencies to access keys only when legally authorized, and given revelations about warrantless surveillance, this trust was not warranted. Researcher Matt Blaze discovered a flaw in Clipper's protocol allowing users to disable escrow functionality while maintaining encryption.^6^ The system designed to ensure government access could be trivially circumvented. Finally, technology companies recognized that products with government backdoors would lose international markets; customers seeking actual privacy would choose products without mandated vulnerabilities.
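The weakness Blaze found illustrates why a small checksum cannot anchor a security guarantee. The sketch below is a loose simulation, not Clipper's actual LEAF construction (the checksum function here is a stand-in): with only 16 bits of validation, a forger needs on average about 65,536 random attempts to produce a bogus escrow field that still passes the check.

```python
# Simulation of brute-forcing a 16-bit checksum, the class of weakness
# Blaze found in Clipper's LEAF. Not the real protocol -- the checksum
# function below is a stand-in, hashing down to 16 bits.
import hashlib
import os

def checksum16(data: bytes) -> int:
    """Stand-in 16-bit check value over an escrow field."""
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

# The check value a compliant chip would expect to see.
target = checksum16(b"legitimate escrow field")

tries = 0
while True:
    tries += 1
    bogus = os.urandom(16)          # random garbage instead of an escrowed key
    if checksum16(bogus) == target:
        break

print(tries)  # on average ~65,536 attempts -- minutes of work, even in 1994
```

Because the search space is only 2^16, the expected cost of forgery is trivial on any hardware, which is why the escrow mechanism could be disabled while encryption kept working.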

Clipper was never formally abandoned but quietly faded as market rejection made it commercially unviable. The episode demonstrated that mandated backdoors face both technical and economic obstacles.

Resolution of the 1990s Battles

By the late 1990s, the first Crypto Wars were winding down. The combination of legal challenges, commercial pressure, and practical unenforceability led to policy liberalization. In 1996, Executive Order 13026 began transferring encryption controls from the State Department to the Commerce Department. The 1999 Bernstein decision held that source code was protected speech, undermining the regulatory framework.^7^

Export controls were substantially relaxed, though not eliminated. Strong cryptography became legal to distribute, implement, and use. The infrastructure for encrypted communication that we now take for granted became possible.

12.2 The Economic Logic of Cryptographic Control

Why States Seek Control

States seek cryptographic control because encryption threatens surveillance capability. The reasons connect to core state functions.

States extract resources through taxation, and comprehensive financial surveillance enables tax enforcement; encrypted financial transactions, invisible to authorities, undermine enforcement capability. As Chapter 10 examined, monetary systems enable state control, and encrypted payment systems circumvent that control by enabling transactions outside monitored channels. States monitor populations for various purposes: identifying dissent, tracking movements, understanding social networks. Encryption creates spaces invisible to such monitoring. Investigating crimes often requires accessing communications and records, and encryption can prevent access even with legal authority.

From the state's perspective, encryption is a capability problem. Citizens with strong encryption can act without state visibility. This constrains state action regardless of whether that action is legitimate law enforcement or illegitimate repression.

Information Asymmetry and State Power

States benefit from information asymmetry: knowing more about citizens than citizens know about states. This asymmetry enables selective enforcement, chilling effects, and preemptive intervention. When authorities can see all violations but must choose which to prosecute, enforcement becomes discretionary; everyone is guilty of something, and prosecution depends on official favor. Knowledge of surveillance changes behavior, as citizens who know they are watched modify actions to avoid attention, even when those actions are legal. Early detection of organizing, dissent, or resistance enables intervention before movements gain strength.

Encryption reduces information asymmetry. States see less; citizens can coordinate without visibility. The power that asymmetry provides is diminished.

Economic Stakes

The economic stakes are substantial on both sides. For states, surveillance infrastructure represents massive investment; intelligence agencies, law enforcement, and tax authorities have built capabilities premised on access, and encryption threatens the return on that investment. For citizens, privacy enables economic activity that surveillance would prevent; underground economies, regulatory arbitrage, protection of competitive information, and simple preference for non-observed life all have value to those who want them. For businesses, the tension runs in both directions: governments demand access while customers demand privacy, and the commercial value of serving privacy-conscious customers conflicts with regulatory compliance.

12.3 Why Control Fails (and Where It Doesn't)

Mathematics Is Indifferent to Law

Cryptographic security rests on mathematical properties that legal prohibition cannot change. If a problem is computationally hard, it remains hard regardless of what legislators decree. No law can make factoring the product of two large primes easy.
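The asymmetry can be made concrete. In this sketch (toy-sized primes, nothing like cryptographic scale), multiplying two primes is a single machine operation, while recovering them by trial division costs on the order of the square root of n divisions; for a real 2048-bit modulus the gap becomes astronomically larger.

```python
# Multiplying two primes is trivial; undoing the multiplication is not.
p, q = 104729, 104723          # two modest primes (nowhere near cryptographic size)
n = p * q                      # one machine multiplication

def trial_factor(n):
    """Recover a factor by brute force, counting the divisions needed."""
    tries = 0
    d = 2
    while d * d <= n:
        tries += 1
        if n % d == 0:
            return d, tries
        d += 1
    return n, tries

factor, work = trial_factor(n)
print(factor, work)            # recovering one prime costs ~100,000 divisions
```

Scale the primes up to hundreds of digits and the multiplication stays instant while the brute-force search outlives the universe; no statute alters that ratio.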

This creates fundamental enforcement problems. Cryptographic knowledge can be independently discovered; suppressing knowledge in one jurisdiction does not prevent discovery elsewhere, for the mathematical relationships exist whether anyone knows them or not. Once published, mathematical knowledge cannot be unpublished; academic papers, textbooks, and internet archives preserve cryptographic techniques permanently and globally. Given published algorithms, implementation is straightforward for competent programmers, and suppressing implementations requires suppressing programming itself.
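To see how low the implementation barrier is, here is textbook RSA built straight from the published description. This is a hedged illustration with toy primes; a real implementation needs large random primes, constant-time arithmetic, and padding such as OAEP.

```python
# Textbook RSA from the published algorithm: keygen, encrypt, decrypt.
# Toy parameters only -- real use needs large random primes and padding.

def make_keys(p, q, e=65537):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)        # modular inverse (Python 3.8+)
    return (n, e), (n, d)      # public key, private key

def encrypt(pub, m):
    n, e = pub
    return pow(m, e, n)

def decrypt(priv, c):
    n, d = priv
    return pow(c, d, n)

pub, priv = make_keys(1000003, 1000033)   # small primes, for illustration only
ciphertext = encrypt(pub, 42)
print(decrypt(priv, ciphertext))          # round trip recovers 42
```

A competent programmer can produce this from any textbook in an afternoon, which is why suppressing implementations amounts to suppressing programming itself.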

Near-Zero Marginal Cost

Information replication costs nearly nothing. A cryptographic algorithm, once discovered and published, can be copied infinitely at negligible cost. This makes control efforts scale poorly. If one copy escapes control, it can become unlimited copies; the mathematics of encryption can spread faster than authorities can track. The internet enables global distribution faster than national enforcement can respond, and software published in one jurisdiction is available worldwide within minutes.

Global Coordination Problem

Effective cryptographic control would require global coordination among states with divergent interests. Not all states want to restrict encryption; some benefit from serving as havens for privacy technology development. Each state controls only its own territory, so a law requiring backdoors in one country does not affect software developed elsewhere. Even coordinated international law faces enforcement gaps, and motivated actors can find jurisdictions that do not participate or do not enforce.

Where Control Succeeds

Despite these obstacles, cryptographic control is not entirely ineffective. Cryptography is hard to implement correctly, and most users cannot evaluate whether implementations are secure; this creates opportunities for compromised implementations to spread. Secure systems often sacrifice usability, so when encryption is hard to use, people use it less or use it incorrectly, undermining security. Most users accept defaults, and systems with weak default encryption or no encryption by default leave most users unprotected regardless of what strong options exist. Companies operating within jurisdictions must comply with local law or face sanction, and major platforms often implement surveillance capabilities because regulatory compliance requires it. Finally, encryption protects data, not people; physical coercion can compel key disclosure regardless of cryptographic strength, and the "$5 wrench attack" remains effective.

Control fails against sophisticated, motivated actors. It often succeeds against ordinary users who lack expertise, motivation, or awareness to implement strong encryption.

12.4 Jurisdictional Competition and Arbitrage

Different Jurisdictions, Different Rules

Jurisdictions compete for economic activity, including technology development. Privacy-friendly jurisdictions can attract developers, companies, and users alike. Programmers prefer working where their work is legal, and encryption development has clustered in jurisdictions with favorable legal treatment. Businesses serving privacy-conscious customers locate where they can legally do so; Switzerland, Estonia, and other jurisdictions have attracted privacy-focused technology companies. High-value users seeking privacy can choose service providers based on jurisdiction, and demand for offshore services reflects regulatory arbitrage.

Voting with Their Feet

Economist Charles Tiebout analyzed how competition among jurisdictions for residents creates pressure toward policies residents prefer.^8^ Applied to cryptography, Tiebout's analysis illuminates several dynamics. Developers and companies can relocate, and the threat of exit constrains jurisdictional policy. Jurisdictions known for privacy protection attract privacy-seeking activity, and this reputation becomes an asset worth maintaining. When some jurisdictions offer favorable treatment, others face pressure to match or lose economic activity.

Race Dynamics

Jurisdictional competition can race toward privacy protection or toward surveillance, depending on which pressures governments respond to. Governments responding to law enforcement and intelligence pressures may compete to offer more surveillance capability, racing toward the bottom. Governments responding to economic development pressures may compete to offer more privacy protection, racing toward the top. The outcome depends on which pressures dominate. Currently, evidence suggests mixed dynamics: some jurisdictions competing on privacy while others expand surveillance.

12.5 The Ongoing War

The Escalating Attack on Privacy Developers

The Crypto Wars did not end with 1990s liberalization. They have intensified. What distinguishes the current phase is the direct prosecution of developers, entrepreneurs, and privacy advocates. Building privacy tools has become personally dangerous.

The pattern is unmistakable. Ross Ulbricht received two life sentences plus forty years for operating Silk Road, a sentence exceeding those for violent crimes.^12^ The message was clear: enabling private commerce carries extreme penalties.

Tornado Cash developers faced prosecution for writing open-source code. Alexey Pertsev was arrested in the Netherlands in 2022 and convicted of money laundering in 2024, sentenced to over five years in prison for developing a privacy-preserving mixing protocol.^13^ Roman Storm was arrested in the United States on similar charges. The prosecution theory held that writing privacy-preserving code constitutes money laundering, regardless of whether the developer participated in any underlying transaction. Roman Sterlingov received a twelve-year sentence for allegedly operating Bitcoin Fog, a bitcoin mixing service.

These are not isolated cases. They represent systematic targeting of privacy infrastructure developers. The legal theories expand with each prosecution: writing code becomes money laundering; offering privacy features becomes operating an unlicensed financial service; enabling transactions the state cannot see becomes conspiracy.

The chilling effect is intentional. When developers face decades in prison for building privacy tools, fewer developers build privacy tools. When entrepreneurs are arrested for enabling private transactions, fewer entrepreneurs enter the space. The prosecutions target not just the individuals but the entire ecosystem of privacy development.

Current Regulatory Threats

Beyond prosecution, regulatory pressure continues through familiar channels. Proposals for "responsible encryption" with "exceptional access" continue to emerge; the arguments are similar to Clipper, and the technical problems remain. Holding platforms liable for user content creates incentives to surveil users and undermines end-to-end encryption that would prevent such surveillance. Regulation targeting cryptocurrency, including KYC requirements, exchange registration, and travel rules, extends financial surveillance to new domains and criminalizes non-compliant services. Proposals for international frameworks to govern encryption seek to close jurisdictional arbitrage opportunities.

The "Going Dark" Debate

Law enforcement agencies argue they are "going dark": losing access to communications that encryption protects. The FBI, in particular, has campaigned for mandatory access capabilities.^9^

Critics respond that access has expanded rather than contracted; despite encryption, law enforcement has more access to more data than ever before, with metadata, location tracking, and platform cooperation providing vast information streams. Security experts who have examined the question consistently conclude that mandated access introduces vulnerabilities.^10^ Furthermore, access capabilities created for law enforcement tend to expand to intelligence agencies, foreign governments, and eventually to hackers who compromise the access mechanisms.

The debate continues without resolution. Law enforcement wants access; technologists explain why secure access is technically impossible; legislators periodically attempt mandates that would undermine security.

Post-Quantum Concerns

Quantum computing threatens current public-key cryptography. A sufficiently powerful quantum computer could break RSA and elliptic curve cryptography that secure most current internet traffic.^11^

This creates both threat and opportunity. Encrypted data captured today could be decrypted later when quantum computers mature, making "harvest now, decrypt later" a viable strategy for patient adversaries. At the same time, post-quantum cryptography is under active development; the transition to quantum-resistant algorithms is a major infrastructure project, but technically feasible. States may use the quantum transition as an opportunity to mandate backdoors in new cryptographic standards.

The Conflict Continues

The Crypto Wars are not over. They are a permanent feature of the relationship between states and citizens with access to strong cryptography.

The fundamental dynamic remains: states want surveillance capability; citizens want privacy; cryptography can provide privacy that resists state surveillance; states therefore seek to constrain cryptography.

Neither side can permanently win. Cryptography cannot be uninvented; state power cannot be abolished. The conflict continues because both sides have durable interests and neither can eliminate the other.

Chapter Summary

The Crypto Wars are the ongoing conflict between states seeking surveillance capability and individuals developing privacy technology. The conflict began when strong cryptography moved from military exclusivity to civilian availability, threatening state surveillance capabilities.

The first Crypto Wars (1990s) saw cryptography classified as munitions, criminal investigation of PGP's Phil Zimmermann, and the failed Clipper Chip proposal for mandatory key escrow. These control efforts largely failed due to constitutional challenges, commercial pressure, and technical unenforceability. By the late 1990s, export controls were substantially relaxed.

States seek cryptographic control because encryption threatens surveillance capability essential for tax enforcement, monetary control, population monitoring, and law enforcement. Encryption reduces the information asymmetry that state power depends upon.

Control efforts face fundamental obstacles. Mathematics is indifferent to legal prohibition; computational hardness does not respond to legislation. Information replication costs nearly nothing; one escaped copy becomes unlimited copies. Global coordination would be required but is practically impossible. However, control succeeds where it targets implementation difficulty, usability barriers, institutional compliance, and physical coercion. Sophisticated actors can defeat control; ordinary users often cannot.

Jurisdictional competition creates arbitrage opportunities. Developers and companies relocate to favorable jurisdictions; privacy-friendly policies attract economic activity; competitive pressure constrains aggressive surveillance policies in jurisdictions responsive to economic development concerns.

The Crypto Wars continue and have intensified. The current phase is marked by direct prosecution of privacy developers: Tornado Cash developers imprisoned for writing mixing code, Ross Ulbricht serving life sentences for operating a private marketplace. The legal theories expand with each case, treating code as money laundering and privacy features as criminal conspiracy. Beyond prosecution, current threats include renewed backdoor mandates, platform liability that incentivizes surveillance, expanding cryptocurrency regulation, and international coordination efforts. The "going dark" debate continues without resolution. Quantum computing threatens current cryptography while creating opportunity for new regulatory interventions. The fundamental conflict between state surveillance interests and citizen privacy interests is permanent, and the stakes for those who build privacy tools have never been higher.


Footnotes

^1^ Phil Zimmermann, "Why I Wrote PGP," originally published in the PGP User's Guide (1991, updated 1999). Available at https://www.philzimmermann.com/EN/essays/WhyIWrotePGP.html.

^2^ For comprehensive history of U.S. cryptographic export controls, see Whitfield Diffie and Susan Landau, Privacy on the Line: The Politics of Wiretapping and Encryption (Cambridge, MA: MIT Press, 1998).

^3^ Zimmermann, "Why I Wrote PGP" (see note 1).

^4^ The printed source code strategy exploited the distinction between software (potentially regulable) and books (protected speech). MIT Press published the source code in 1995.

^5^ On the Clipper Chip proposal and controversy, see A. Michael Froomkin, "The Metaphor Is the Key: Cryptography, the Clipper Chip, and the Constitution," University of Pennsylvania Law Review 143, no. 3 (1995): 709-897.

^6^ Matt Blaze, "Protocol Failure in the Escrowed Encryption Standard," Proceedings of the 2nd ACM Conference on Computer and Communications Security (1994): 59-67.

^7^ Bernstein v. United States Department of Justice, 176 F.3d 1132 (9th Cir. 1999), holding that software source code is protected speech under the First Amendment.

^8^ Charles M. Tiebout, "A Pure Theory of Local Expenditures," Journal of Political Economy 64, no. 5 (1956): 416-424.

^9^ James Comey, "Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?" speech at the Brookings Institution, October 16, 2014.

^10^ See, e.g., Harold Abelson et al., "Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications," Journal of Cybersecurity 1, no. 1 (2015): 69-79.

^11^ For overview of post-quantum cryptography, see Daniel J. Bernstein and Tanja Lange, "Post-Quantum Cryptography," Nature 549 (2017): 188-194.

^12^ United States v. Ross William Ulbricht, 14-cr-68 (S.D.N.Y. 2015). Ulbricht was sentenced to two life terms plus forty years without possibility of parole.

^13^ Pertsev was convicted by a Dutch court in May 2024. The case established that developers can be held criminally liable for how others use open-source privacy tools.


Previous Chapter: nostr:naddr1qqgrydekxumxvvm9xp3n2vtr89jrxq3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65wh5z6qg

Next Chapter: nostr:naddr1qqgxzdt9xcexvef5v3jrsefexsunzq3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65wund9cr