By James C. Cooper
Associate Professor of Law and Director, Program on Economics & Privacy
Antonin Scalia Law School, George Mason University
We all know that today’s economy lives on data—the information economy accounts for an estimated 6 percent of GDP. This means that privacy regulation, which is essentially about restricting these data flows, leaves a large footprint. One need only look to the hullabaloo generated by the recent Federal Communications Commission (FCC) broadband privacy rule and calls to block the proposed AT&T-Time Warner deal on privacy grounds. As such, it’s crucial that privacy regulators—chiefly the Federal Trade Commission (FTC)—get it right.
Lessons from Antitrust
Consider Facebook’s 2014 acquisition of WhatsApp. On the antitrust side, the FTC’s Bureau of Competition (BC) applied the cost-benefit framework laid out in the Horizontal Merger Guidelines (HMG) and determined that the combination of social media platforms didn’t pose a threat to competition. Fortunately for consumers, the FTC refused to heed calls from some to incorporate privacy concerns into antitrust and block the merger on the ground that it would give Facebook “too much” personal data. Doing so would have been a mistake on a number of grounds: it would have ignored the benefits that flow from new uses of data in the form of richer and more personalized content, the heterogeneity of consumer preferences, and the First Amendment implications of undue restrictions on data flows. Modern antitrust wisely eschews non-competition concerns.
Juxtapose the BC’s antitrust framework with the response of the FTC’s Bureau of Consumer Protection (BCP) to the deal. Accompanying the FTC’s closing of its antitrust investigation into the acquisition was a letter from the BCP, cautioning that if WhatsApp failed to “obtain consumers’ affirmative consent” before using their data “in a manner that is materially inconsistent” with the promises it made at the time of collection, it may be in violation of the FTC Act. Although it is not novel that a material change in data use without notice potentially could violate the FTC Act, the opt-in requirement sprang to life without any analysis. Yet it’s probably not an exaggeration to say that this requirement—which later was made clear to apply to all merging firms—was only slightly less punitive than blocking the merger itself. What is more, this new opt-in requirement will create ripples far beyond Facebook and WhatsApp. It surely will deter some companies from making beneficial changes to their current data collection practices and cause others to think twice before pursuing acquisitions.
The Benefit of a Benefit-Cost Framework
That the FTC developed a requirement with huge potential impact on the way data are used in such an ad hoc manner—without any analysis of its likely impact on consumers or competition—is troubling. As privacy regulation’s influence on the economy grows, so must the sophistication of the analysis on which it relies. On this front, regulators should take a cue from antitrust’s evolution and adopt a benefit-cost framework—one that uses economic analysis to identify practices that are likely to harm consumers.
A good start would be for the FTC to begin to grapple with the available empirical evidence on consumers’ willingness to reveal data. Consumer harm is the legal trigger for FTC action. The FTC Act bans “unfair or deceptive acts or practices.” For an act or practice to be unfair, it must fail a cost-benefit test: it must cause “substantial injury to consumers” that consumers cannot reasonably avoid and that isn’t outweighed by “countervailing benefits to consumers or competition.” For a practice to be deceptive under the FTC Act, there must be a “material” misrepresentation. In this manner, materiality is a proxy for harm, because a material misrepresentation distorts consumers’ decisions. The FTC, however, needs to confront the lack of empirical evidence that the practices at the heart of its privacy regime are giving rise to the type of injury that Congress empowered it to address.
The FTC justifies its privacy program on the need to foster “trust” in the online environment, relying on various surveys to support the notion that consumers place a large value on privacy or feel that they have lost control of their data. But surveys—what economists call “stated preference” evidence—tell us only that privacy, like most other things, has value. They cannot answer the real question for policymakers: how willing are consumers to swap personal data for other things they value? These tradeoffs are what matter.
What Consumers Are Saying
Once the focus shifts to what economists call “revealed preference,” or how consumers actually make tradeoffs, the story becomes quite different. Far from suggesting that consumers are reluctant to engage with the online ecosystem, real-world behavior shows consumers who are largely comfortable with the tradeoffs they make in their digital lives. There are 1.3 billion daily Facebook users and 150 million daily Snapchat users, health-tracking apps and wearables are proliferating, and nearly half of U.S. households have chosen to purchase an Amazon Prime membership.
Of course, revealed preference is useful only if consumers are informed about the tradeoffs they are making. Some argue that consumers simply don’t understand the costs associated with data sharing, and that if they did, their revealed preferences would look quite different. But the empirical literature suggests otherwise: economic studies that attempt to measure the value of personal data nearly universally find that even when consumers are fully aware of the trades they are making, they will provide personal information for small amounts of compensation, or alternatively will pay only very little to avoid personal data collection. For example, one study finds that consumers are willing to make only a one-time $4 payment to avoid real-time geolocation tracking. Moreover, a recent study presented various versions of Google’s Gmail privacy policies to a random sample of representative Gmail users. Although the subjects generally believed that Gmail’s automated content analysis was intrusive, two-thirds were unwilling to pay anything to avoid the practice—they perceived some privacy cost, but not one as large as the value of free email. For the remaining third, the median annual payment they were willing to make to avoid email scanning was only $15.
Data in the Balance
The takeaway from all of this is not that privacy is valueless, or that certain types of data collection and use do not give rise to privacy concerns, but rather that most consumers are comfortable with the typical bargain of sharing information with faceless servers in return for free content and services, such as email and social networking platforms. These considerations raise serious questions for mainstays of the FTC’s privacy program such as data minimization and privacy by design. These slogans obscure the fact that tradeoffs exist between privacy and other characteristics that consumers value. Why not “speed” or “functionality” by design, or “cost minimization” or “performance maximization”? Until it confronts the empirical evidence, the FTC has not made the case that it, rather than the market, is better at mediating how consumers trade among competing values. Indeed, the FTC’s posture appears to be based on the preferred mix of privacy and functionality for the most privacy-sensitive consumers. This posture could be welfare-enhancing only if consumers are incapable of making informed choices because they systematically underestimate privacy harms. If that is the case, the FTC should state its position clearly and engage in research to demonstrate what seems to be a necessary predicate for its regulatory agenda.
1 See Stephen Siwek, Measuring the U.S. Internet Sector, Internet Association (Dec. 10, 2015), available at http://internetassociation.org/wp-content/uploads/2015/12/Internet-Association-Measuring-the-US-Internet-Sector-12-10-15.pdf.
2 See, e.g., Lloyd Grove, The Perils of an AT&T-Time Warner Merger, The Daily Beast (Oct. 26, 2016), at http://www.thedailybeast.com/articles/2016/10/26/the-perils-of-an-at-t-time-warner-merger.html.
3 FTC, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues (Jan. 2016) (“Big Data Rep.”), at https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf; FTC, Internet of Things: Privacy & Security in a Connected World (Jan. 2015) (“IOT Rep.”), at https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf; FTC, Data Brokers: A Call for Transparency and Accountability (May 2014) (“Data Brokers Rep.”), at https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf; FTC, Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers (March 2012) (“2012 Privacy Rep.”), at https://www.ftc.gov/reports/protecting-consumer-privacy-era-rapid-change-recommendations-businesses-policymakers.
4 See Douglas H. Ginsburg, Originalism and Economic Analysis: Two Case Studies of Consistency and Coherence in Supreme Court Decision Making, 33 Harv. J.L. & Pub. Pol’y 217 (2015).
6 See Chelsey Dulaney, Facebook Completes Acquisition of WhatsApp, Wall St. J. (Oct. 4, 2014), at http://www.wsj.com/articles/facebook-completes-acquisition-of-whatsapp-1412603898.
7 See James C. Cooper, Privacy and Antitrust: Underpants Gnomes, The First Amendment, and Subjectivity, 20 Geo. Mason L. Rev. 1129 (2013).
8 Letter from Jessica L. Rich to Erin Egan & Anne Hoge (Apr. 10, 2014), at https://www.ftc.gov/public-statements/2014/04/letter-jessica-l-rich-director-federal-trade-commission-bureau-consumer. The letter called for “affirmative express consent” for changes, and a blog posting from a BCP staffer later clarified that merging parties needed to obtain “express opt-in consent” for material changes to data practices. See Mergers and Privacy Policies (Mar. 25, 2015), at https://www.ftc.gov/news-events/blogs/business-blog/2015/03/mergers-privacy-promises.
10 See, e.g., Jin-Hyuk Kim & Liad Wagman, Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis, 46 RAND J. Econ. 1 (2015) (finding evidence that screening was less accurate when an opt-in requirement for sharing was in place, because it limited revenue from selling to third parties); Avi Goldfarb & Catherine E. Tucker, Privacy Regulation and Online Advertising, 57 Mgm’t Sci. 57 (2011) (finding that EU privacy regulations governing web tracking, some of which require opt-in consent, reduce advertising effectiveness); id. at 60 (executives from large European companies report that it costs 15 Euros for each opt-in consent); Howard Beales, The Value of Behavioral Targeting (2010), available at http://www.networkadvertising.org/pdfs/Beales_NAI_Study.pdf.
11 The FTC has recently taken commendable steps in this direction with PrivacyCon and its disclosure workshop, both of which presented relevant empirical work.
12 15 U.S.C. § 45.
13 15 U.S.C. § 45(n).
14 FTC Policy Statement on Deception, appended to Cliffdale Associates, Inc., 103 F.T.C. 110, 174 (1984), available at https://www.ftc.gov/public-statements/1983/10/ftc-policy-statement-deception.
16 See, e.g., IOT Rep. at 44 (“if a company decides that a particular data use is beneficial and consumers disagree with that decision, this may erode consumer trust”); 2012 Privacy Rep. at 8-9 (“although it recognizes that imposing new privacy protections will not be costless, the Commission believes doing so not only will help consumers but also will build trust in the marketplace”).
17 Company Info, Facebook, http://newsroom.fb.com/company-info/ (last visited Oct. 10, 2016).
18 Sarah Frier, Snapchat Passes Twitter in Daily Usage, Bloomberg News, June 2, 2016, available at https://www.bloomberg.com/news/articles/2016-06-02/snapchat-passes-twitter-in-daily-usage.
19 By 2015, an estimated 500 million people worldwide were expected to use a mobile health app. Stephen McInerney, Can You Diagnose Me Now? A Proposal to Modify the FDA’s Regulation of Smartphone Mobile Health Applications with a Pre-Market Notification and Application Database Program, 48 U. Mich. J.L. Reform 1073 (2015) (citing Kevin Pho, Health App Users Beware, USA Today (Apr. 2, 2014), http://www.usatoday.com/story/opinion/2014/04/02/medical-app-fitness-health-fda-technology-column/7224837/). Andrew Meola, Wearables and Mobile Health App Usage Has Surged by 50% Since 2014, Business Insider (Mar. 7, 2016) (health tracker use increased from 16% in 2014 to 33% in 2015), at http://www.businessinsider.com/fitbit-mobile-health-app-adoption-doubles-in-two-years-2016-3. See also Susannah Fox, The Self-Tracking Data Explosion, Pew Research Center (June 4, 2013), http://www.pewinternet.org/2013/06/04/the-self-tracking-data-explosion/.
20 See Krystina Gustafson, Half of America Could Have an Amazon Prime Account by the End of the Year, CNBC, at http://www.cnbc.com/2016/09/26/amazon-prime-signing-up-members-at-a-faster-clip.html.
21 For a full review of this literature see Alessandro Acquisti et al., The Economics of Privacy, J. Econ. Lit. at 41 (forthcoming 2017), at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2580411.
22 Scott Savage & Donald M. Waldman, The Value of Online Privacy (2013), at
24 See, e.g., DesignerWare, LLC (Apr. 15, 2012), at https://www.ftc.gov/enforcement/cases-proceedings/112-3151/designerware-llc-matter. Indeed, the anonymity provided by the Web is likely to be a privacy benefit rather than a cost. See Benjamin Wittes & Jodie Liu, The Privacy Paradox: The Privacy Benefits of Privacy Threats, Center for Technology Innovation at Brookings (May 2015), http://www.brookings.edu/~/media/research/files/papers/2015/05/21-privacy-paradox-wittes-liu/wittes-and-liu_privacy-paradox_v10.pdf. See also Stephanie Mathson & Jeffry Hancks, Privacy Please? A Comparison Between Self-Checkout and Book Checkout Desk for LGBT and Other Books, 4 J. Access Servs. 27, 28 (2007).
25 Amalia R. Miller & Catherine E. Tucker, Can Health Care Information Technology Save Babies?, 119 J. Pol. Econ. 289 (2011); Amalia R. Miller & Catherine E. Tucker, Privacy Protection and Technology Diffusion: The Case of Electronic Medical Records, 55 Mgm’t Sci. 1077 (2009).
26 Jin-Hyuk Kim & Liad Wagman, Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis, 46 RAND J. Econ. 1 (2015).
27 Avi Goldfarb & Catherine E. Tucker, Privacy Regulation and Online Advertising, 57 Mgm’t Sci. 57 (2011).
28 For a review of this literature see James C. Cooper, Separation Anxiety, __ Va. J.L. Tech. __ (forthcoming 2017).