Ethics and Privacy in the Data-Driven World

September 22, 2015
Jules Polonetsky
Executive Director, Future of Privacy Forum

Joseph Jerome
Policy Counsel, Future of Privacy Forum

Takeaways

A formal policy on data ethics will allow companies to innovate while protecting individual privacy and security.


This essay is part of a series of articles relating to the Internet of Everything project. Read more at uschamberfoundation.org/ioe

By Jules Polonetsky and Joseph Jerome

Each year, billions of new devices come online, generating exabytes of data. As one former European consumer protection commissioner put it, personal information has become not just “the new oil of the Internet” but also “the new currency of the digital world.” However, this explosion of personal information raises new privacy issues that responsible companies need to confront head-on.

Longstanding Fair Information Practice Principles, which have governed how organizations handle personal information for decades, are increasingly strained. Principles that focus on minimizing data collection and specifying exactly how information can be used are challenged by a world awash in data. More problematically, giving consumers adequate notice and choice has largely devolved into endless privacy policies that no one reads or understands. In a world of drones, smart lighting, and wearables, traditional notices are less and less practical. The path forward will require a flexible approach, and companies will need to be more creative about how they communicate with consumers and involve them in the new data economy.

Make no mistake: in sector after sector, society is seeing how the use of data provides real value to consumers and to society. Location information can be used to improve traffic flows, reduce wait times in store lines, help the blind navigate airports, and provide essential navigation information. Wearable devices can give us more granular information about ourselves than anyone could have imagined, and the spread of low-cost sensors is making everything from our living rooms to our cities “smart.”

Yet, with all of these new data-driven innovations, we will see a backlash if companies do not make a better case that all this information is being used in ways that help consumers and society and that personal data is being adequately secured and protected.

Critics worry that the same beneficial information could also be used for unfair profiling, new forms of discrimination, and an Orwellian future. These concerns need to be addressed.

One important tool will be to rely on de-identification. Although researchers have been able to re-identify information from supposedly anonymized datasets in some cases, it would be a mistake to conclude that it is always easy to re-identify data or that de-identification is not a useful, privacy-protective practice. Data that is sufficiently aggregated or scrubbed of direct and indirect identifiers reduces the risk that personally identifiable information will be used for unauthorized, malicious, or otherwise harmful purposes. However, there remains little consensus on the science or law of de-identification, and many companies rely on detailed datasets that critics consider personal. Progress and consensus on effective de-identification will be critical to the success of the Internet of Everything.
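For readers who want a concrete picture, the short Python sketch below illustrates one hypothetical way to scrub and aggregate records before sharing them: direct identifiers are dropped, indirect ones are generalized, and small groups are suppressed. The field names, generalizations, and threshold are our own illustrative assumptions, not a standard or a specific company's practice.

    # Illustrative sketch only: scrub direct identifiers, generalize
    # quasi-identifiers, and suppress groups smaller than k (a basic
    # k-anonymity-style check). Field names and k=5 are hypothetical.
    from collections import Counter

    def deidentify(records, k=5):
        scrubbed = []
        for r in records:
            scrubbed.append({
                # Direct identifiers (name, email, device ID) are omitted entirely.
                "age_band": f"{(r['age'] // 10) * 10}s",  # e.g., 37 -> "30s"
                "zip3": r["zip"][:3],                     # 5-digit ZIP -> 3-digit prefix
                "category": r["purchase_category"],
            })
        # Drop any record whose quasi-identifier combination is shared by fewer than k people.
        counts = Counter((r["age_band"], r["zip3"]) for r in scrubbed)
        return [r for r in scrubbed if counts[(r["age_band"], r["zip3"])] >= k]

    sample = [{"age": 37, "zip": "20036", "purchase_category": "wearables"}] * 6
    print(deidentify(sample))

Even a simple pipeline like this involves judgment calls (how coarse the age bands are, how small a group must be before it is suppressed), which is precisely why the science and law of de-identification remain unsettled.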

Increasingly, having some sort of data ethics policy will be essential. In many cases, the collection and use of personal information is perfectly legal, but product improvements and valuable research may involve testing and experimenting on consumers. In recent years, we have seen public flaps over A/B testing, leading the founder of the dating service OkCupid to declare, “We experiment on human beings!” In response, academics have called for companies to learn from the biomedical and behavioral sciences and establish ethical review processes to monitor and approve new and surprising uses of personal information.

Formalizing an ethical review process will give companies an outlet to weigh the benefits of data use against a larger array of risks. It provides a mechanism to formalize data stewardship and move away from a world where companies are largely forced to rely on the “gut instinct” of marketers or the C-suite. An ethics policy can also capture concerns that go beyond privacy and data protection, and help ensure that the benefits of a future of smart devices outweigh the risks.

Jules Polonetsky is the executive director of the Future of Privacy Forum.


Joseph Jerome is policy counsel for the Future of Privacy Forum.


The views expressed here are those of the authors and do not necessarily reflect those of the U.S. Chamber of Commerce Foundation, U.S. Chamber of Commerce, or their affiliates.