Guarding Against Data Discrimination in the Internet of Everything

September 22, 2015
Vice President and Chief Research and Policy Officer, Multicultural Media, Telecom, and Internet Council

Takeaways

Aggregated data can be immensely useful, but not when it's used to prey on vulnerable populations.

This essay is part of a series of articles relating to the Internet of Everything project. Read more at uschamberfoundation.org/ioe


By Nicol Turner-Lee

The Internet of Everything (IoE) is dominating the digital landscape through the proliferation of broadband-enabled sensors and devices, including wearables. As IoE data is systematically retrieved, processed, and archived, it is imperative that the private sector understand the implications for consumer privacy and recognize where data profiling can emerge, particularly as the predictive analytics embedded in Big Data can introduce systematic bias against certain groups.

Big Data, the aggregation of online consumer behaviors, does not always have negative consequences. Gathering trends on children’s eating behaviors can generate solutions for tackling obesity. Information collected about household energy use helps consumers and industry conserve an already strained power grid. In both cases, Big Data can be leveraged to solve real problems.

Aggregated data collection is far less benign when it is used to prey on vulnerable populations, especially people of color, the poor, seniors, people with disabilities, and even millennials.

In a 2013 report, the Senate Commerce Committee identified unregulated data brokers as key offenders because they aggregate and, in many cases, profile consumer information to sell or share with others. Focusing on financial vulnerabilities, the report found that data brokers sorted low-income consumers into categories that included “Rural and Barely Making It,” “Ethnic Second-City Strugglers,” and “Credit-Crunched: City Families.”

The Federal Trade Commission (FTC) recently cracked down on predatory data brokers, including those marketing segments such as “Urban Scramble” and “Mobile Mixers,” which combine and analyze consumer information to make potentially stereotypical inferences about preferences, habits, and lifestyles based on race and income. To avoid regulatory consequences, businesses should ensure that their data is used primarily for marketing and product development and not for fueling negative consumer profiles.

Online vulnerabilities can also be attributed to the persistent digital divide that afflicts certain populations. Fifty-five million Americans still lack access to broadband, and they are disproportionately lower-income people of color, including African Americans, Hispanics, American Indians, and Alaska Natives. According to new data from the Pew Research Center, one in five African Americans and 18% of Hispanics do not use the Internet, compared with 14% of Whites and just 5% of English-speaking Asian Americans. Recognizing the entry-level experiences of these online users should prompt businesses to be even more transparent when requesting and curating consumer data.

Both the private and public sectors can thwart many of the consumer privacy and data profiling concerns of IoE by understanding its intersection with Big Data.

For example, data compiled from a wearable fitness tracker may seem harmless when it is used to support the product’s own innovation, but it could also raise the cost of the user’s health or life insurance if reporting the results to those companies is mandatory.

In a more extreme case, the same data might be used to infer the user’s creditworthiness or employability, since both discipline and conscientiousness are perceived traits of more “valuable” employees. Energy data collected in low-income, more transient communities can provide a more accurate assessment of consumption trends, but it can also be used to justify surcharges and other fees on already impoverished consumers for their lack of efficiency.

These discriminatory practices, and many others like them, can potentially lead to violations of civil and human rights. The national policy debate is energized around this long-term concern, and industry should be too. In a report issued this year, the FTC offered policy recommendations to monitor the civil rights consequences of IoE. Its main proposals focus on increased “privacy by design” measures that build security into IoE devices and sensors, reasonable oversight of unregulated data brokers, enhanced security measures and notifications during data breaches, greater transparency about data purposes and retention, and data minimization.

As the FTC looks to appropriately protect consumers online, it is up to advocates for communities of color to work with the agency and with industry to make sure every community has fair, respectful, and appropriate representation in the unfolding interconnected world and its economy. At a high level, many of the agency’s recommendations serve to curb potential infractions, and industry should take heed as it works to make the innovation and expansion of emerging IoE technologies more inclusive.

Dr. Nicol Turner-Lee is Vice President and Chief Research and Policy Officer for the Multicultural Media, Telecom and Internet Council (MMTC).

The views expressed here are those of the author and do not necessarily reflect those of the U.S. Chamber of Commerce Foundation, U.S. Chamber of Commerce, or their affiliates.