How Privacy Makes You Feel

June 9, 2014

Privacy is a challenge because it’s so personal. That’s columnist Esther Dyson’s explanation for why data’s value and our feelings about its use so often seem out of whack. We want to be known and get something in return, but not feel violated along the way. This is an incredibly hard line to walk in practice, but that’s just what companies today have to do.

How we think about privacy comes down to two parts: knowledge and feeling. In other words, what do other people know about me and how do I feel about it? On the knowledge side, most of us understand the concrete interests at play with our data and its use. “For many ordinary users,” Dyson says, “such [data] recognition can be valuable and appreciated, even if it is not accompanied by special gifts or benefits.” The feeling side gets a little hazier. Some feel that ignorance is bliss. Others are more share-happy. And some are basically shouting “get off my lawn.”

That range of feelings about privacy comes from the relational nature of data use. As Dyson finds, “By using a service, a person effectively enters into a relationship with its provider. Service providers should understand that their users will expect the transparency, respect, and recognition that are fundamental to any relationship.”

It’s surprisingly hard for companies to sort out who values what facts or feelings the most. They could just ask, of course. Dyson believes it’s plausible to pose the question, “How may we use the information you share with us?”, but she sees companies as being rather gun-shy.

In fact, I think there’s a deeper issue: data-driven companies (practically every company these days) discover secondary and tertiary uses for the information they assemble. These serendipitous, valuable uses often stretch beyond what the company, let alone the customer, could have anticipated at the beginning of the “data relationship.” This means that simply posing a question about data’s initial use won’t quite do. Rather, the personal relationship that data creates must be ongoing.

Getting these relationships with consumers right means understanding how consumers feel about privacy. And it seems to me that what has consumers concerned isn’t so much the collection of data by the private sector as the fear of real harms befalling them. Commercial data practices (and public policy responses) should then focus on defining the sources and reach of consumer harm, and on placing those harms off-limits.

The long-term success of the data-driven economy is bound up in how we tackle the privacy challenge. Dyson is right in saying that the unsettled nature of the privacy debate seems to paint a bull’s-eye on companies’ bottom lines. That’s why her conclusion is worth pondering:

“The challenge is thus to build systems that feel appropriately personal, without making users feel violated or uncomfortable. At the same time, they should not be presented as being more human than they are. Otherwise, users’ expectations will become disconnected from what the company can deliver.”

We know what’s at stake. Data are fueling growth across our economy, improving outcomes and boosting productivity, while helping us address problems that have defied solutions for years. Consumers are ultimately benefiting from these gains. Better decisions with better data mean a better future.
See also: The Private Sector's Privacy Puzzle