Max Hillebrand

The Praxeology of Privacy ~ Chapter 1: The Nature of Privacy


Privacy is selective disclosure, not hiding. Breaking the adversary's OODA loop at the Observe stage is strategic defense. Cheap privacy defeats expensive surveillance.


"Privacy is the power to selectively reveal oneself to the world."

Eric Hughes^1^

Introduction

"If you have nothing to hide, you have nothing to fear."

This argument appears whenever privacy is discussed. It seems intuitive, even obvious. Innocent people, the reasoning goes, have no reason to object to surveillance. Only those with something to conceal, presumably something wrong, would resist transparency. The argument appeals to common sense and shifts the burden of proof onto privacy advocates: why do you need privacy unless you are doing something you should not be doing?

The argument deserves a serious response. This book provides one.

The "nothing to hide" argument fails, but not for the reasons typically offered. The standard responses, that everyone has embarrassing secrets or that surveillance creates a chilling effect, concede too much ground. They accept the framing that privacy is about hiding, differing only on whether what is hidden is legitimate.

This book takes a different approach. It examines privacy through two independent intellectual traditions: Austrian praxeology and cypherpunk cryptography. Both traditions, developed separately and for different purposes, arrive at compatible conclusions about privacy's importance. Austrian economists established the logical case through deductive analysis of human action. Cypherpunks demonstrated technical achievability through working code. Their convergence is not coincidental.

The full answer to "nothing to hide" requires the complete argument developed across this book. Chapter 21 returns to this question explicitly, tracing the response through three foundational axioms, economic analysis, and practical implementation. For now, it suffices to note that the argument rests on a fundamental confusion: it conflates privacy with secrecy, treating selective disclosure as if it were concealment of wrongdoing.

This chapter establishes what privacy means, distinguishes it from related concepts, and previews the argument to come.

1.1 Defining Privacy: Control Over Disclosure

Rigorous analysis requires precise definitions. "Privacy" suffers from definitional ambiguity that undermines serious discussion. The term is used to mean different things in different contexts, leading to arguments at cross purposes.

Eric Hughes, in his 1993 Cypherpunk's Manifesto, provided the definition this book adopts:

"Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know. Privacy is the power to selectively reveal oneself to the world."

This definition has several important features.

First, privacy is about control, not concealment. The private individual does not hide from everyone but chooses what to reveal, to whom, and under what circumstances. A person discussing medical conditions with their doctor, salary with their spouse, or political views with trusted friends exercises privacy. They are not hiding; they are selecting their audience.

Second, privacy is active, not passive. It requires the capacity to make disclosure decisions and the ability to implement those decisions. Privacy exists when the individual can effectively control information flow. When that control is removed, whether by surveillance, compelled disclosure, or technical vulnerability, privacy is lost regardless of whether any information is actually accessed.

Third, privacy concerns information about oneself. The definition addresses personal information: health, finances, relationships, beliefs, plans, preferences. It does not extend to all information generally. This distinction becomes important in later chapters when examining property rights and information economics.

Why does this definition matter? Because imprecise definitions enable the "nothing to hide" conflation. When privacy is vaguely understood as "keeping things secret," the argument that only wrongdoers need secrets seems plausible. When privacy is precisely understood as selective disclosure, the argument dissolves. Everyone engages in selective disclosure constantly. Sharing different information with different people is not evidence of wrongdoing; it is ordinary human behavior.

The stakes extend beyond semantic clarity. Legal scholar Daniel Solove has documented how the "nothing to hide" argument, when examined carefully, "shifts the debate to its terms, then draws power from its unfair advantage."^2^ The argument defines privacy narrowly as concealment, excludes other concerns from consideration, and then declares victory within its artificially constrained frame. Precise definition exposes this maneuver.

1.2 Privacy vs. Secrecy vs. Anonymity

Three concepts require careful distinction: privacy, secrecy, and anonymity. They overlap but are not identical, and conflating them produces confusion.

Privacy is selective disclosure. The private individual controls what information about themselves is shared and with whom. Privacy is compatible with extensive sharing; it requires only that the sharing be voluntary and controlled. A public figure who carefully manages their media presence exercises privacy even while being widely known.

Secrecy is non-disclosure. The secret is kept from everyone, or nearly everyone. Secrecy is a subset of privacy in that maintaining secrets requires control over information, but it goes further: secrecy aims at total concealment rather than selective revelation. Trade secrets, classified information, and surprise parties involve secrecy. They are not just private but actively hidden.

Anonymity is acting without attribution. The anonymous actor performs actions that cannot be linked to their identity. Anonymity concerns the connection between action and actor rather than information about the actor per se. A person may act anonymously while being publicly known in other contexts; the anonymous donor, the pseudonymous author, and the masked voter exercise anonymity regarding specific actions while potentially being public figures otherwise.

These concepts relate but do not reduce to each other.

Privacy without anonymity is common. Most privacy occurs within identified relationships: the patient identified to their doctor, the employee identified to their employer, the client identified to their bank. Privacy in these contexts means controlling what information flows through the relationship, not concealing one's identity.

Anonymity without privacy is possible but unstable. The anonymous actor who leaves identifiable traces may be deanonymized through correlation. Pure anonymity requires not just unlinking action from identity but preventing information leakage that enables later linking. In practice, anonymity requires privacy to be durable.

Secrecy requires both privacy and often anonymity. Maintaining secrets requires controlling information (privacy) and often concealing the very existence of the secret or one's connection to it (anonymity). The secret agent needs both: privacy about their activities and anonymity regarding their role.

The "nothing to hide" argument conflates these categories destructively. It treats privacy as if it were secrecy, implying that anyone wanting privacy must be concealing something. It ignores that selective disclosure, the ordinary management of personal information, is neither secretive nor suspicious. The argument further ignores anonymity's role in enabling speech, commerce, and political participation that might otherwise be chilled by attribution.

Common examples illustrate the conflation's costs. The person who uses curtains is exercising privacy, not plotting crimes. The journalist who protects sources exercises anonymity to enable truthful reporting. The patient who expects medical confidentiality exercises privacy about health information that is not wrongful to have. None of these involve wrongdoing; all involve legitimate control over personal information.

1.3 Privacy as Strategic Defense: The OODA Loop

Privacy operates as a strategic defense against adversarial action.

Military strategist John Boyd, a United States Air Force colonel, developed a model of adversarial decision-making known as the OODA loop: Observe, Orient, Decide, Act.^3^ Boyd's insight, derived from analyzing why American pilots dominated Korean War aerial combat despite facing comparable aircraft, was that all adversarial action follows this cycle. The adversary must first observe the target, gathering information about position, capabilities, and vulnerabilities. They must then orient, analyzing the information to understand the situation and identify opportunities. They must decide on a course of action. Finally, they must act, executing the chosen response. The cycle then repeats as the adversary observes the results and adjusts.

Boyd recognized that disrupting any stage of this loop degrades the adversary's effectiveness. But the stages are not equal. Breaking the loop at the Observe stage is uniquely powerful because it prevents all subsequent stages from occurring. An adversary who cannot observe cannot orient, cannot decide, cannot act. The entire attack cycle collapses before resources are committed. The later the disruption occurs, the more resources the adversary has already invested and the more options remain available to them.

Privacy breaks the OODA loop at its earliest and most vulnerable point. If an adversary cannot observe your finances, they cannot analyze your spending patterns, cannot decide to investigate, cannot act to seize or control. If they cannot observe your communications, they cannot orient on your relationships and plans, cannot decide whom to target, cannot act on intelligence they do not possess. If they cannot observe your location, movements, and associations, the entire apparatus of surveillance and control operates blindly.
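The short-circuit structure of the loop can be sketched in code. The following Python fragment is purely illustrative, not anything from Boyd or from this book's technical chapters; the `Target` and `Adversary` names and their methods are hypothetical. Its only point is that when the Observe stage yields nothing, every later stage degenerates automatically.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Target:
    observable: bool                                  # does the target leak information?
    data: str = "finances, communications, movements"

class Adversary:
    def observe(self, target: Target) -> Optional[str]:
        # Stage 1: gather information; privacy aims to make this yield nothing
        return target.data if target.observable else None

    def orient(self, observation: Optional[str]) -> Optional[str]:
        # Stage 2: analysis is only possible if observation succeeded
        return f"analysis of {observation}" if observation else None

    def decide(self, orientation: Optional[str]) -> Optional[str]:
        # Stage 3: a plan requires a completed analysis
        return f"plan based on {orientation}" if orientation else None

    def act(self, decision: Optional[str]) -> bool:
        # Stage 4: execution requires a plan
        return decision is not None

    def run_loop(self, target: Target) -> bool:
        # The whole cycle collapses if Observe returns nothing
        return self.act(self.decide(self.orient(self.observe(target))))

adversary = Adversary()
print(adversary.run_loop(Target(observable=True)))   # True: the attack proceeds
print(adversary.run_loop(Target(observable=False)))  # False: loop broken at Observe
```

Blocking any later stage would still leave the adversary with a completed observation and analysis to reuse; blocking Observe leaves them with nothing.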

This explains why privacy is strategic, not merely personal. The cost asymmetry favors the defender. Comprehensive surveillance is expensive: it requires infrastructure, personnel, storage, and analysis capabilities. Privacy tools, properly implemented, can be cheap: a cryptographic key costs nothing to generate but may require nation-state resources to break. The defender who breaks observation imposes costs on the attacker while bearing minimal costs themselves. This asymmetry is why states work so aggressively to prevent privacy: it negates their observational advantage before they can bring other resources to bear.
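The asymmetry can be made concrete with rough arithmetic. The sketch below is a back-of-the-envelope illustration, assuming a 256-bit symmetric key and a deliberately generous brute-force adversary; the guess rate is an assumption for illustration, not a figure from the book.

```python
import secrets

key = secrets.token_bytes(32)        # defender: a 256-bit key, generated in microseconds
keyspace = 2 ** 256                  # attacker: number of possible keys to search
guesses_per_second = 10 ** 18        # assume a quintillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365

expected_years = (keyspace / 2) / guesses_per_second / seconds_per_year
print(f"Expected brute-force time: ~{expected_years:.1e} years")
# roughly 1.8e51 years, against a defender cost of one function call
```

Even granting the attacker absurdly optimistic hardware, the expected search time dwarfs the age of the universe, while the defender's cost remains a single function call.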

Chapter 10 develops this framework in detail, showing how state surveillance follows the OODA pattern and how each intervention type attempts to restore observational capability. Chapter 19 applies it to practical operational security: the first priority is always to prevent observation, because everything else follows from that failure.

1.4 Overview of the Book's Argument

This book argues that privacy is defensible on multiple independent grounds, that it can be technically implemented, and that doing so enables forms of human coordination otherwise impossible. The argument proceeds through several stages.

The Three Axioms

Part II establishes three foundational axioms, each with a different logical status. The Action Axiom (Chapter 3) is self-evident and descriptive, establishing privacy as inherent to purposeful behavior. The Argumentation Axiom (Chapter 4) provides normative foundations through Hoppe's argumentation ethics. The Axiom of Resistance (Chapter 5) is a well-grounded assumption about technical possibility. These distinctions matter; the chapters ahead develop each axiom's precise status and implications.

Economic Foundations

Part III applies Austrian economic analysis to privacy. Chapter 6 establishes that information content cannot be property because it is non-scarce; privacy is protected not through information-as-property claims but through self-ownership, physical property rights, and contract, applying Stephan Kinsella's framework on intellectual property. Chapter 7 examines exchange theory, showing how privacy enhances exchange by protecting deliberation, negotiation, and confidential terms, and how exchange can occur under surveillance but is distorted by it. Chapter 8 analyzes privacy infrastructure as capital goods, applying Austrian capital theory and entrepreneurship. Chapter 9 develops monetary theory, establishing requirements for sound money and bridging to Bitcoin.

The Adversary

Part IV examines threats to privacy. Chapter 10 analyzes state surveillance using Murray Rothbard's intervention typology. Chapter 11 examines corporate surveillance and data extraction. Chapter 12 traces the Crypto Wars, the ongoing conflict over cryptographic freedom.

Technical Implementation

Part V demonstrates that privacy is technically achievable. Chapter 13 covers cryptographic foundations. Chapter 14 examines anonymous communication networks including Tor. Chapter 15 analyzes Bitcoin as resistant money. Chapter 16 introduces zero-knowledge proofs. Chapter 17 examines decentralized social infrastructure.

Praxis

Part VI addresses practical implementation. Chapter 18 draws lessons from historical projects. Chapter 19 covers operational security. Chapter 20 provides individual implementation strategy. Chapter 21 synthesizes the argument, presents the cryptoanarchist vision, and answers "nothing to hide" fully.

What This Book Claims and Does Not Claim

Intellectual honesty requires stating limitations.

This book claims that privacy is a structural feature of human action. This is descriptive and, given the Action Axiom's self-evident nature, established.

This book argues that privacy cannot be coherently denied in rational discourse. This depends on Hoppe's argumentation ethics. Chapter 4 presents the argument, develops its implications for privacy, and addresses major objections.

This book assumes that technical resistance is possible. This assumption is well-grounded but not proven. Cryptographic security rests on mathematical conjectures (like P not equaling NP) that remain unproven. Implementations can fail. Humans can be coerced. The Axiom of Resistance enables analysis but does not guarantee outcomes.

This book does not claim that privacy solves all problems, that technology substitutes for political action, or that parallel economies will inevitably replace states. It examines what is possible, what is defensible, and how to implement it. Whether these possibilities become reality depends on choices made by individuals.

Chapter Summary

Privacy is selective disclosure: the power to control what information about oneself is revealed and to whom. This definition, drawn from Eric Hughes, distinguishes privacy from both secrecy (non-disclosure to anyone) and anonymity (acting without attribution). Privacy is about control, not concealment. The private individual chooses what to reveal, to whom, and under what circumstances.

The "nothing to hide" argument conflates these categories, treating selective disclosure as if it were concealment of wrongdoing. Its full refutation requires the complete analysis developed across this book and is provided in Chapter 21.

Privacy operates as strategic defense through Boyd's OODA loop framework. Breaking the adversary's decision cycle at the Observe stage is uniquely powerful because it prevents all subsequent stages. An adversary who cannot observe cannot orient, decide, or act. The cost asymmetry favors the defender: comprehensive surveillance is expensive, while privacy tools can be cheap. This explains why states work aggressively to prevent privacy: it negates their observational advantage before they can bring other resources to bear.

This book develops its argument through three axioms with different logical statuses: the Action Axiom (self-evident, establishing privacy as a structural feature of action), the Argumentation Axiom (normative foundation for privacy rights), and the Axiom of Resistance (well-grounded assumption about technical possibility). Praxeological analysis demonstrates how privacy enhances exchange, enables economic calculation, and connects to sound money. Technical chapters demonstrate that privacy is achievable through cryptography, anonymous networks, Bitcoin, zero-knowledge proofs, and decentralized protocols. The synthesis shows how these components create the possibility of economic coordination outside surveillance infrastructure.


Footnotes

^1^ Eric Hughes, "A Cypherpunk's Manifesto" (1993). Hughes was a mathematician and programmer who co-founded the Cypherpunks mailing list and articulated the movement's philosophical foundations.

^2^ Daniel J. Solove, "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy," 44 San Diego Law Review 745 (2007). Solove's analysis examines how the "nothing to hide" argument frames the privacy debate on terms that exclude consideration of legitimate privacy concerns. See also Solove, Nothing to Hide: The False Tradeoff Between Privacy and Security (Yale University Press, 2011).

^3^ John Boyd developed the OODA loop concept through a series of briefings, especially "Patterns of Conflict" (1986) and "The Strategic Game of ? and ?" (1987). Boyd never published his ideas in traditional academic form; they circulated through military briefings and were later compiled by associates. For an accessible introduction, see Frans Osinga, Science, Strategy and War: The Strategic Theory of John Boyd (Routledge, 2007). Boyd's work influenced U.S. military doctrine, including the Army's AirLand Battle doctrine and concepts of maneuver warfare.


Previous chapter: nostr:naddr1qqgrvvpjx33kgvtrxgen2vfcvcurjq3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65wfepgy5

Next Chapter: nostr:naddr1qqgxzctpxvenverrx93rwv35v3snzq3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65wkhswjd