Privacy, Transparency and Data

April 29, 2014

How should we think about privacy and transparency in the era of Big Data? Having attended all three of the White House's Big Data workshops, I found a few clues.

For one thing, the public and private sectors have different responsibilities. On transparency, Fred Cate of Indiana University argued that the government should tend toward openness and accountability in collecting data. As he put it, "it's about the relationship of citizens to their government." Transparency serves as a necessary "restraint on power," and Americans end up better protected against government overreach.

The private sector's uses of data, on the other hand, are too varied for transparency to serve as a relevant standard. Commercial data practices center more on giving users freedom of choice and tools to prevent or limit the collection and use of data. Moreover, companies are obligated to guard trade secrets and operational details from their competitors, which makes transparency an especially difficult standard to follow. Responsibility is also unclear, since most private-sector data will be created or inferred, not collected.

Cate believes that the private sector's use of Big Data requires a broader and more settled paradigm, such as harm. Harm is what we are ultimately concerned about, so our response can be to define what constitutes a harmful use of data and place that practice off-limits. Risk, not data, then becomes the central variable. Companies should address harm at the point where people may actually experience risk, and those affected should be able to seek redress, rather than anyone trying to judge data in advance or piecemeal.

Customers' real interest may lie not in privacy per se, but in what Ben Wittes called "databuse," or "the unjustified deployment of user data in a fashion adverse to the user's interests." Transparency would not be a panacea under this norm. Instead, a harm standard would look at the "proposed collection, handling, and deployment of data and what obligation the third party collector should incur to prevent those harms."

Erika Rottenberg, formerly of LinkedIn, agreed that harm is the right paradigm, supported by clarity about, and protection of, user data. LinkedIn has no interest in losing the trust of its members, she said; it does the company no good to suffer data breaches or surprises. That is why regulators and companies, despite their different roles and responsibilities, are looking at data from the same standpoint: each is asking how best to protect consumers' interests and what their reasonable expectations are.

In the end, one point was made clear at each of the workshops: any fear of the public sector's use of Big Data should not dictate how we regulate the private sector.