The Praxeology of Privacy ~ Chapter 11: Corporate Surveillance and Data Extraction
Corporate surveillance extracts behavioral data for prediction products. State and corporate surveillance are deeply entangled. Markets are responding to growing privacy demand.
"If you are not paying for it, you're not the customer; you're the product being sold."
Andrew Lewis^1^
Introduction
States are not the only surveillance threat. Corporate data extraction has created comprehensive monitoring infrastructure that rivals and often exceeds state capabilities.
This chapter applies praxeological analysis to corporate surveillance: the business model of data extraction, the entanglement between corporate and state surveillance, whether this constitutes market failure, and how markets are beginning to respond to privacy demand.
11.1 The Business Model of Data Extraction
Users as Product
Shoshana Zuboff coined the term "surveillance capitalism" to describe a business model where human experience is claimed as free raw material for translation into behavioral data.^2^ This data feeds prediction products traded in behavioral futures markets.
The economic logic is straightforward: services appear free because users pay with data, not money. The advertiser is the customer; the user is the product. More precisely, predictions about user behavior are the product; users are the source of raw material for manufacturing those predictions.
This inverts the traditional market relationship. In ordinary exchange, businesses compete to serve customers. In data extraction, businesses compete to capture users. The difference matters: serving customers requires satisfying their preferences; capturing users requires keeping them engaged regardless of whether engagement serves their interests.
Attention Harvesting
Data extraction businesses compete for attention. User engagement generates data; more engagement generates more data; maximizing engagement maximizes the raw material for prediction products.
This creates incentives for manipulation. If engagement serves user interests, no problem arises. But when engagement conflicts with user interests (addictive design, outrage amplification, rabbit holes of increasingly extreme content), the business model rewards manipulation over service.
Praxeology emphasizes demonstrated preference: what people actually choose reveals their preferences. But demonstrated preference assumes unmanipulated choice. When choice architecture is designed to exploit psychological vulnerabilities, "choice" becomes less revealing. The user who spends hours scrolling may not be revealing preference for scrolling; they may be revealing susceptibility to variable reward schedules.
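How much persistence a reward schedule trains can be made concrete with a toy model. The sketch below is a deliberate simplification with invented parameters (a one-in-five reward rate, a quit rule based on the longest drought the user has ever experienced); it compares how long a user keeps scrolling after rewards stop under a predictable schedule versus a variable one paying out at the same average rate:

```python
import itertools
import random

def posts_before_quitting(schedule, training_posts=500, seed=1):
    """Toy model of extinction resistance under a reward schedule.

    While rewards flow, the user learns the longest gap between rewards
    they have ever experienced. When rewards stop, they keep scrolling
    until the drought exceeds that learned expectation, then quit."""
    rng = random.Random(seed)
    longest_gap = gap = 0
    for _ in range(training_posts):
        if schedule(rng):
            gap = 0
        else:
            gap += 1
            longest_gap = max(longest_gap, gap)
    return longest_gap + 1  # posts scrolled in vain before giving up

def fixed_ratio():
    """Reward exactly every 5th post: predictable."""
    counter = itertools.count(1)
    return lambda rng: next(counter) % 5 == 0

def variable_ratio():
    """Reward 1 post in 5 on average, at unpredictable intervals."""
    return lambda rng: rng.random() < 0.2

print(posts_before_quitting(fixed_ratio()))     # 5: easy to walk away
print(posts_before_quitting(variable_ratio()))  # typically 20-30: unpredictability sustains scrolling
```

At identical average reward, the unpredictable schedule trains several times the persistence. That property, not any preference for scrolling, is what engagement-maximizing feed design exploits.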
Behavioral Surplus
Zuboff distinguishes between data necessary for service improvement and "behavioral surplus" extracted for prediction products. A map application needs location data to provide directions; that same location data, accumulated over time and correlated with other data, becomes raw material for predicting future behavior and selling those predictions to advertisers.
The surplus concept highlights that users receive services worth only a fraction of the value of the data they provide. The remainder, the surplus, is extracted without compensation. Users cannot easily assess how much surplus is extracted because they cannot observe how their data is used, combined, or sold.
This information asymmetry compounds the problem. In ordinary markets, competition drives prices toward marginal cost. In data extraction markets, users cannot comparison shop based on data extraction because they cannot observe extraction practices. Competition therefore occurs on other dimensions (features, network effects), not privacy protection.
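A minimal sketch, with invented names and data, makes the surplus concrete: the service itself needs only the current ping, while the retained history quietly becomes a prediction asset the user never sees priced:

```python
from collections import Counter, defaultdict

def directions(current_location, destination):
    """All the service needs: the current position, usable once and discardable."""
    return f"route from {current_location} to {destination}"

class SurplusPredictor:
    """The behavioral surplus: the same pings, retained and accumulated,
    become a predictor of where the user will be at a given hour."""

    def __init__(self):
        self.visits = defaultdict(Counter)  # hour -> Counter of places seen at that hour

    def log(self, hour, place):
        self.visits[hour][place] += 1

    def predict(self, hour):
        seen = self.visits[hour]
        return seen.most_common(1)[0][0] if seen else None

model = SurplusPredictor()
for day in range(30):  # a month of routine location pings
    model.log(8, "gym")
    model.log(12, "office")
    model.log(19, "home")

print(directions("5th & Main", "airport"))  # the service the user asked for
print(model.predict(12))  # "office": a sellable prediction the user never asked for
```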
Prediction Products
Data extraction businesses sell predictions. Advertisers pay for likely-to-click users; political campaigns pay for likely-to-persuade voters; insurers pay for likely-to-claim customers. The value lies in prediction accuracy; accuracy improves with more data; more data requires more extensive surveillance.
This creates an extraction ratchet. Each improvement in prediction accuracy makes data more valuable, justifying more intensive extraction, enabling more accurate predictions, making data even more valuable. The endpoint, approached asymptotically, is comprehensive behavioral monitoring to support comprehensive behavioral prediction.
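The statistical core of the ratchet can be illustrated with a toy Monte Carlo simulation (parameters invented throughout): estimating a single user's click-through rate from n observed impressions, prediction error shrinks roughly as 1/√n, so each additional increment of surveillance buys measurable accuracy:

```python
import random
import statistics

def prediction_error(n_observations, p_click=0.3, trials=500, seed=7):
    """Mean absolute error in estimating one user's click-through rate
    from n observed impressions (Monte Carlo over many simulated users)."""
    rng = random.Random(seed)
    errors = []
    for _ in range(trials):
        clicks = sum(rng.random() < p_click for _ in range(n_observations))
        errors.append(abs(clicks / n_observations - p_click))
    return statistics.mean(errors)

for n in (10, 100, 1000, 10000):
    print(f"{n:>6} observations -> error {prediction_error(n):.4f}")
# Error falls roughly as 1/sqrt(n): every tenfold increase in data buys
# about 3x more precision, so more extraction is always privately rational.
```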
11.2 Corporate-State Entanglement
Legal Requirements
Chapter 10 examined triangular intervention: state mandates imposed on private transactions. Data extraction businesses face such mandates. Data retention requirements force companies to keep data they might otherwise delete. Lawful interception requirements force communication providers to build surveillance backdoors. Reporting requirements force platforms to monitor for specified content.
These requirements shape corporate data practices. A company might prefer to minimize data collection for security and liability reasons; legal requirements may mandate collection anyway. The state uses corporate infrastructure as a force multiplier, achieving surveillance scope impossible through direct government operation.
Voluntary Cooperation
Beyond legal requirements, many corporations cooperate voluntarily with government requests. The PRISM program, revealed in 2013, involved major technology companies providing intelligence agencies with direct access to user data.^3^ National Security Letters compel disclosure while prohibiting recipients from acknowledging the request.
Voluntary cooperation creates business opportunities. Government contracts reward companies with surveillance capabilities. Intelligence agencies represent well-funded customers for prediction products. The line between serving advertisers and serving intelligence agencies becomes blurred when both want the same behavioral predictions.
The Public-Private Partnership
State and corporate surveillance have become symbiotic. States benefit from corporate data collection that would face legal barriers if conducted directly. Corporations benefit from legal frameworks that entrench their business models while burdening competitors.
Consider how this operates. Corporations collect data at scale that governments could not legally mandate; once collected, governments access that data through legal process, national security letters, or informal cooperation. Corporations develop analytical tools such as machine learning and pattern recognition for commercial purposes, and these same tools serve government surveillance. Corporations build the networks, devices, and platforms through which communication flows, enabling governments to monitor at the infrastructure level, not the endpoint level. Large corporations can afford compliance with complex privacy regulations while smaller competitors cannot; regulations intended to protect privacy instead create moats protecting incumbents.
Why the Distinction Matters Less Than It Appears
The state-corporate distinction matters for legal purposes. Constitutional constraints apply to government action, not private action. But for privacy analysis, the distinction matters less.
If your communications are monitored, the practical effect is the same whether monitored by NSA or Google. If your behavior is predicted and manipulated, it matters little whether the manipulator is a government propaganda agency or a social media algorithm. The loss of privacy is the loss; the identity of the surveilling party is secondary.
This suggests that privacy protection must address both state and corporate surveillance. Technical measures effective against one may be effective against both. But legal measures effective against government surveillance (constitutional constraints, warrant requirements) do not reach corporate surveillance directly.
11.3 Is This a Market Failure?
The Market Failure Claim
Critics argue that surveillance capitalism represents market failure. Users do not want comprehensive surveillance but get it anyway. Companies extract negative externalities (privacy costs) without bearing them. Markets fail to produce privacy-respecting alternatives.
If true, this would justify intervention to correct the failure. Privacy regulations, data ownership rights, or platform breakups might be warranted to restore functioning markets.
The Austrian Response
Austrian economics is skeptical of market failure claims for several reasons.^5^ First, there is the knowledge problem: diagnosing market failure requires knowing the optimal outcome markets should produce, but optimal outcomes emerge through the market process and cannot be known in advance. What appears to be failure may be markets discovering solutions to problems regulators have not even identified. Second, the current market structure reflects decades of intervention; claiming markets have failed ignores that markets have not been tried. The question is whether observed outcomes result from markets or from interventions distorting markets. Third, static analysis may identify apparent inefficiencies that dynamic analysis reveals as temporary. Markets may be in the process of correcting the problem; intervention may arrest that correction.
The Role of State Intervention
The current structure of surveillance capitalism reflects substantial state intervention. Copyright and patent laws enable the platform monopolies that dominate data extraction; without IP protection, code and algorithms would face competition that limits monopoly power, and Facebook's network effects would matter less if competitors could implement compatible features. GDPR and similar regulations impose compliance costs that large incumbents can absorb but smaller competitors cannot, potentially reducing some data extraction practices while entrenching the extractors themselves. Major technology companies derive substantial revenue from government contracts, creating an incentive to develop surveillance capabilities governments want to purchase, which are then deployed against users generally. Section 230 and similar provisions shield platforms from liability for user content while enabling them to curate that content, combining the privileges of both publisher and distributor while bearing the responsibilities of neither.^6^ Banking regulations requiring know-your-customer and anti-money-laundering compliance push economic activity toward tracked digital channels and away from private cash, creating the data streams that surveillance capitalism extracts.
Would a Free Market Produce This?
The counterfactual is difficult to assess, but several considerations suggest current outcomes are not inevitable market results. Without IP protection creating artificial scarcity in software, competition would be more intense; platforms could not maintain network effects by threatening compatible alternatives with patent lawsuits. If users could easily switch between compatible platforms, data extraction would face competitive pressure, and users who value privacy could migrate to privacy-respecting alternatives without losing network connections. Without monetary system surveillance pushing transactions toward tracked channels, alternative payment methods would reduce the data streams that make behavioral prediction valuable.
This does not prove that free markets would produce perfect privacy. Network effects, coordination problems, and real consumer preferences for free services would still exist. But it suggests current surveillance intensity reflects intervention as much as market outcome.
Network Effects and Lock-In
Even granting that intervention plays a role, real market dynamics contribute to surveillance concentration. Communication platforms are more valuable with more users, creating winner-take-all dynamics where a few platforms dominate and making exit costly for users.^4^ Users have invested in learning platforms, building connections, and creating content; switching means abandoning those investments. Even if users prefer privacy-respecting alternatives, coordination failure may prevent migration, as each user waits for others to switch and no one wants to be first to an empty platform.
These are real market phenomena, not intervention effects. They suggest that even absent intervention, privacy competition might face obstacles. But they do not justify further intervention; they suggest the problem requires technical and entrepreneurial solutions, not regulatory ones.
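A simple threshold model, a deliberately crude sketch with invented parameters, shows how coordination failure can strand users on a surveillance platform even when a privacy-respecting alternative exists:

```python
import random

def migration(n_users=1000, n_contacts=10, threshold=0.4, seeds=50,
              rounds=20, rng_seed=3):
    """Threshold model of platform migration (all parameters invented).

    Each user switches to the privacy-respecting alternative only once
    `threshold` of their contacts have switched. A few privacy-motivated
    `seeds` switch unconditionally; everyone else waits."""
    rng = random.Random(rng_seed)
    contacts = {u: rng.sample(range(n_users), n_contacts) for u in range(n_users)}
    switched = set(rng.sample(range(n_users), seeds))
    for _ in range(rounds):
        switched |= {u for u in range(n_users)
                     if sum(c in switched for c in contacts[u]) / n_contacts >= threshold}
    return len(switched)

print(migration(seeds=50))   # stalls near 50: early adopters stranded on an empty platform
print(migration(seeds=400))  # cascades toward 1000 once a critical mass coordinates
```

Below a critical mass of early movers, migration stalls no matter how many users would prefer the alternative; above it, migration cascades. This is why usability and compatibility, which lower the switching threshold, matter as much as the privacy features themselves.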
11.4 Market Responses and Privacy Competition
Privacy as Competitive Differentiator
Despite obstacles, markets have begun responding to privacy demand. The response takes several forms: established companies differentiating on privacy, new entrants building privacy-first business models, and infrastructure changes that constrain data extraction regardless of individual company choices.
Apple's privacy differentiation provides the clearest large-scale demonstration. In April 2021, Apple introduced App Tracking Transparency (ATT), requiring apps to request permission before tracking users across other companies' apps and websites. The result was dramatic: approximately 80% of iOS users opted out of tracking when given a clear choice.^7^ Meta reported that ATT would reduce its 2022 revenue by approximately $10 billion; industry estimates placed the total cost to Meta closer to $13 billion annually.^8^
This single policy change revealed the fragility of surveillance-dependent business models. When users were given a simple choice, the vast majority chose not to be tracked. The preference was always there; it required only a mechanism for expression. Apple profited from revealing it; companies dependent on surveillance suffered. This is market discovery operating at scale.
Search and browser alternatives demonstrate similar dynamics. DuckDuckGo has grown from a niche search engine to processing billions of queries annually, despite competing against the most sophisticated search infrastructure in history. Users accept less sophisticated results in exchange for privacy; the trade-off reveals how much privacy matters. Brave browser has reached tens of millions of users by combining privacy protection with attention-based advertising that compensates users rather than extracting from them.
The Rise of Encrypted Messaging
End-to-end encrypted messaging has achieved mainstream adoption, perhaps more completely than any other privacy technology.
Signal's growth trajectory illustrates the market discovery process. In January 2021, following WhatsApp's announcement that it would share more data with Facebook, Signal's servers crashed under the load of new users. From approximately 40 million active users in early 2022, Signal grew to 70 million by 2024. Revenue grew from $8 million in 2021 to over $25 million in 2024, derived entirely from donations rather than data extraction or advertising.^9^ That a nonprofit can compete against the most well-resourced technology companies demonstrates that privacy can sustain alternative business models.
WhatsApp itself, despite Meta ownership, uses the Signal protocol for end-to-end encryption. The decision was defensive: without encryption, WhatsApp would lose users to encrypted alternatives. Even surveillance-dependent companies must provide some privacy features to remain competitive. This is market pressure operating through competition, not through regulation.
The encryption adoption pattern reveals something about how markets discover privacy demand. Encryption was once expert-only technology requiring manual key exchange and careful configuration. Signal made encryption default and invisible; users benefit without needing to understand the technology. The lesson: privacy tools must be as convenient as surveillance alternatives to achieve adoption. Usability, not just security, determines market success.
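What "default and invisible" means at the primitive level can be sketched in a few lines. The example below uses PyNaCl (the NaCl/Curve25519 primitive family that Signal also builds on; the actual Signal protocol adds key ratcheting, forward secrecy machinery, and much else): keys are generated locally, and the relaying server only ever handles ciphertext.

```python
# A toy end-to-end exchange using PyNaCl (pip install pynacl).
from nacl.public import Box, PrivateKey

# Each party generates a keypair locally; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob with her private key and his public key.
sealed = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The relaying server sees only `sealed`; only Bob can open it.
plain = Box(bob_key, alice_key.public_key).decrypt(sealed)
assert plain == b"meet at noon"
```

The point is not the cryptography but the ergonomics: everything above happens without user involvement, which is what finally made encryption mass-market.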
Paid vs. Ad-Supported Models
The "free" services model depends on data extraction for revenue. Paid models offer an alternative: users pay with money, not data. This realignment is structural. When users are customers rather than products, business incentives align with user interests rather than against them.
Subscription services have grown across categories. Streaming video offers ad-free tiers; users revealed preference for paying to avoid surveillance-enabling ads. News paywalls remove the advertising incentive to maximize engagement regardless of content quality. Productivity software subscriptions have displaced advertising-supported tools, changing incentive structures across the software industry.
Not all subscription services respect privacy; paid products can still extract data. But the paid model removes the structural incentive that makes data extraction the core business rather than an incidental practice. A company whose revenue comes from subscriptions has no structural reason to maximize data collection; a company whose revenue comes from prediction products has every reason.
The premium tier pattern, appearing across products and services, suggests growing willingness to pay for privacy and reduced surveillance. Users who once accepted "free" services now pay for alternatives that better align with their interests. This revealed preference guides entrepreneurial discovery of further privacy-respecting products.
Privacy Infrastructure
Beyond individual products, infrastructure changes are beginning to constrain data extraction structurally. DNS-over-HTTPS prevents ISPs from observing and monetizing browsing data. Default encryption in transit, now standard across the web, prevents casual interception. Hardware security modules in consumer devices make certain types of data extraction technically impossible.
These infrastructure changes differ from product competition in important ways. Product choice requires active user decisions; infrastructure changes protect users who make no choice at all. Default privacy is more powerful than opt-in privacy because it protects the vast majority who never adjust settings.
The shift toward privacy-protective defaults reflects market discovery at the infrastructure level. Companies that control infrastructure (browser makers, operating system vendors, device manufacturers) have discovered that privacy features provide competitive advantage. Google implementing privacy features in Chrome, despite Google's advertising business, illustrates the competitive pressure: if Chrome does not provide privacy features, users migrate to browsers that do.
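As a concrete illustration of the first change above, a DNS-over-HTTPS lookup replaces a plaintext query on UDP port 53 with an ordinary HTTPS request. The sketch below uses Cloudflare's public JSON endpoint; other DoH resolvers offer equivalent interfaces:

```python
# A DNS-over-HTTPS lookup via Cloudflare's JSON API (pip install requests).
import requests

resp = requests.get(
    "https://cloudflare-dns.com/dns-query",
    params={"name": "example.com", "type": "A"},
    headers={"accept": "application/dns-json"},
    timeout=5,
)
for answer in resp.json().get("Answer", []):
    # The ISP sees only a TLS connection to the resolver, not this query.
    print(answer["name"], answer["data"])
```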
The Limits of Market Response
Market responses are real but face structural obstacles. Network effects favor established platforms; users cannot easily switch when their contacts remain on surveillance platforms. The discovery process is slow; many users remain unaware of alternatives. Privacy products often remain harder to use than surveillance alternatives, limiting adoption to those who specifically prioritize privacy.
Moreover, market response addresses only some dimensions of the surveillance problem. Companies can compete on privacy for functions where privacy-respecting alternatives exist. But market competition cannot address government surveillance requirements, infrastructure-level monitoring, or the accumulation of data by entities that face no market pressure to delete it.
The Austrian perspective does not claim that markets solve all problems instantly. It claims that markets discover preferences through entrepreneurial experimentation and that competition tends toward serving those preferences over time. Privacy market development is early-stage. The trajectory points toward greater privacy competition, but the process is incomplete.
Market Discovery
Markets discover preferences through entrepreneurial experimentation. Privacy preferences were latent until entrepreneurs created products that revealed them. Apple did not know that 80% of users would reject tracking until ATT gave them the choice. Signal did not know that millions would adopt encrypted messaging until improvements in usability made adoption feasible.
This discovery process has no predetermined endpoint. Entrepreneurs continue experimenting. Some experiments fail; others reveal preferences no one anticipated. The market for privacy is being discovered through the same process by which markets discover all preferences: trial, error, and competition.
Current privacy tools are early-stage. They are harder to use, less feature-rich, less networked than surveillance alternatives. This is typical of nascent competition. Early automobiles were worse than horses on many dimensions; entrepreneurs improved them until they dominated. Early mobile phones were worse than landlines on voice quality and reliability; improvements made them indispensable.
Privacy technology follows a similar trajectory. Each generation of tools is easier, more capable, more competitive with surveillance alternatives. The process is incomplete but ongoing. Markets have not solved the privacy problem; they are in the process of discovering how to solve it.
Chapter Summary
Corporate surveillance operates through data extraction: users provide raw material (behavioral data) that is processed into prediction products sold to advertisers and others. This inverts the traditional market relationship where businesses serve customers; in data extraction, businesses capture users.
Corporate and state surveillance have become entangled. Legal requirements force companies to collect and retain data. Voluntary cooperation provides government access to corporate data. The public-private partnership achieves surveillance scope neither party could accomplish alone. For privacy purposes, the state-corporate distinction matters less than it appears.
Whether surveillance capitalism represents market failure is contested. Austrian analysis emphasizes that current outcomes reflect substantial intervention: intellectual property creating platform monopolies, regulations creating compliance moats, government contracts incentivizing surveillance development. Whether free markets would produce similar outcomes is unclear, but intervention has shaped current structure.
Markets are responding to privacy demand. Apple differentiates on privacy. Encrypted messaging has achieved mainstream adoption. Paid services offer alternatives to ad-supported data extraction. This market discovery process is incomplete but demonstrates that privacy preferences exist and can be served.
The analysis neither condemns markets nor exonerates them. Markets respond to incentives; current incentives are shaped by intervention as much as consumer preference. Technical and entrepreneurial solutions may succeed where regulatory solutions would entrench existing surveillance infrastructure.
Footnotes
^1^ Andrew Lewis (user "blue_beetle"), comment on MetaFilter, August 26, 2010. This observation, often misattributed, captures the economic logic of advertising-supported services.
^2^ Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs, 2019), 8-11. Zuboff provides comprehensive documentation of surveillance capitalism's development and practices, though her analysis differs from the Austrian perspective applied here.
^3^ The PRISM program was revealed through documents leaked by Edward Snowden in 2013. See Glenn Greenwald, No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State (New York: Metropolitan Books, 2014).
^4^ On network effects and platform economics, see David S. Evans and Richard Schmalensee, Matchmakers: The New Economics of Multisided Platforms (Boston: Harvard Business Review Press, 2016).
^5^ On the Austrian critique of market failure theory, see Israel M. Kirzner, "The Perils of Regulation: A Market-Process Approach," in Discovery and the Capitalist Process (Chicago: University of Chicago Press, 1985), and Murray N. Rothbard, Power and Market, 4th ed. (Auburn, AL: Ludwig von Mises Institute, 2006), chapter 3.
^6^ On Section 230 and platform immunity, see Jeff Kosseff, The Twenty-Six Words That Created the Internet (Ithaca, NY: Cornell University Press, 2019).
^7^ During Q3 2021, approximately 80% of iOS users opted out of tracking on major social media platforms following ATT implementation. See Statista, "App Tracking Transparency: Opt-In Rate of iOS Users 2022," and Flurry Analytics, "App Tracking Transparency Opt-In Rate: Monthly Updates."
^8^ Meta CFO Dave Wehner stated the estimated $10 billion revenue impact during the Q4 2021 earnings call, February 2022. See "Facebook Says Apple iOS Privacy Change Will Result in $10 Billion Revenue Hit This Year," CNBC, February 2, 2022. Lotame estimated the total impact at $12.8 billion.
^9^ Signal user and revenue statistics from Signal Technology Foundation tax filings and Business of Apps, "Signal Revenue & Usage Statistics (2025)." The Signal Foundation operates as a 501(c)(3) nonprofit funded primarily by donations.
Previous Chapter: nostr:naddr1qqgrsdmpvv6n2wfhv5ervdmpxa3kxq3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65wecsu7m
Next Chapter: nostr:naddr1qqgxycfjvyck2vrpxvunycfcxsuk2q3qklkk3vrzme455yh9rl2jshq7rc8dpegj3ndf82c3ks2sk40dxt7qxpqqqp65welgguq