Decision Making, Data Processing, and Job Creation in America

August 15, 2013

In a newly released report, Banning Garrett at the Atlantic Council looks at how algorithms are shaping decision making, data processing, and job creation in today’s America.

Algorithms are, according to Garrett’s definition, “Sets of rules for processing data to produce outcomes.” They are pervasive and significant forces in the “digital ecosystem” and the “list of systems in modern society depending on algorithms is virtually inexhaustible.”
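
To make that definition concrete, here is a minimal sketch of an algorithm in Garrett’s sense: a fixed set of rules that takes a piece of data in and produces an outcome. The rules, thresholds, and field names below are invented purely for illustration.

```python
# A minimal, hypothetical illustration of Garrett's definition: a fixed set of
# rules that processes a piece of data and produces an outcome. Field names
# and thresholds are invented for the example.

def score_transaction(transaction):
    """Apply three simple rules to a transaction record and return a decision."""
    score = 0
    if transaction["amount"] > 10_000:              # rule 1: large transfers score higher
        score += 2
    if transaction["country"] not in {"US", "CA"}:  # rule 2: unfamiliar jurisdictions
        score += 1
    if transaction["hour"] < 6:                     # rule 3: unusual hours
        score += 1
    return "review" if score >= 3 else "approve"

print(score_transaction({"amount": 15_000, "country": "DE", "hour": 3}))  # -> review
```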

While microprocessors have made tremendous gains since the start of the information revolution, they are practically left in the dust compared to advancements in algorithms. “While processor speeds improved by a factor of 1,000 [between 1988 and 2003],” Garrett explains, “algorithm performance improved by an astounding 43,000-fold over that same period.”
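
If those two gains compound, the combined improvement is simply the product of the two cited factors, as a quick back-of-the-envelope check shows.

```python
# Back-of-the-envelope arithmetic: if hardware and algorithmic gains compound,
# the combined speedup over 1988-2003 is the product of the two cited factors.
processor_gain = 1_000    # factor cited for processor speeds
algorithm_gain = 43_000   # factor cited for algorithm performance
combined = processor_gain * algorithm_gain
print(f"Combined improvement: {combined:,}x")  # -> Combined improvement: 43,000,000x
```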

Algorithms are crucial to Big Data, working to sort through vast troves and streams of information to determine what’s most valuable and what should instead be tossed aside. They are especially useful for finding unforeseen connections and determining the best applications of what's found—all of it fully automated, of course.
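
As a rough sketch of what that automated connection-hunting looks like, the toy example below scans every pair of columns in a made-up dataset and surfaces the strongest relationships. Everything here, including the one planted relationship, is invented for illustration.

```python
# A toy sketch of automated correlation hunting: scan every pair of columns
# in a dataset and surface the strongest links. The data is random noise,
# except for one deliberately planted relationship (A drives F).
import itertools
import random
import statistics

random.seed(0)
data = {name: [random.gauss(0, 1) for _ in range(500)] for name in "ABCDE"}
data["F"] = [x * 0.8 + random.gauss(0, 0.3) for x in data["A"]]  # planted relationship

def pearson(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

pairs = sorted(
    ((a, b, pearson(data[a], data[b])) for a, b in itertools.combinations(data, 2)),
    key=lambda t: abs(t[2]),
    reverse=True,
)
for a, b, r in pairs[:3]:
    print(f"{a} ~ {b}: r = {r:+.2f}")   # the planted A ~ F link rises to the top
```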

The other key application is the so-called “Internet of Things” (IoT), which the McKinsey Global Institute estimates could have an economic impact of between $2.7 trillion and $6.2 trillion by 2025. The IoT connects everyday objects, large or small, to computer networks. Its development is driven by a new Internet address standard, IPv6, which offers roughly 100 IP addresses for every atom on the Earth’s surface. Once objects are assigned IP addresses and fitted with sensors, streams of data will begin pouring forth for algorithms to mine.
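
The atoms comparison comes from IPv6’s 128-bit address space, and it is easy to sanity-check with rough numbers; the surface-atom count below is an assumed order-of-magnitude estimate used only for this illustration.

```python
# Rough sanity check of the "100 IP addresses per atom" comparison. IPv6 uses
# 128-bit addresses; the surface-atom count below is an assumed
# order-of-magnitude figure used only for this illustration.
ipv6_addresses = 2 ** 128        # about 3.4e38 possible addresses
surface_atoms = 3.4e36           # assumed estimate of atoms on the Earth's surface
print(f"Addresses per surface atom: ~{ipv6_addresses / surface_atoms:.0f}")  # ~100
```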

In an algorithm-driven world like this, privacy concerns will only increase. There is also the chance of deliberate misuse. As Garrett notes, “Spurious correlations can affect an individual’s future.” Here is what he later identifies as a key limitation of algorithms in Big Data:

The results are based on correlations, not causality. That is, certain factors are correlated with certain outcomes but without discovering a causal relationship. The algorithms analyzing the data do not explain why things happen but only that they correlate with each other. “Of course, causality is nice when you can get it,” according to Viktor Mayer-Schönberger and Kenneth Cukier, but, they insist, correlations are often good enough and they can be found “far faster and cheaper than causality.” The authors of Big Data note that algorithm driven big data analysis correlations can predict flu outbreaks, translate languages, and enable cars to drive themselves. But correlations can also lead to mistaken analyses and harmful decisions based on the “fallacy of numbers” and the “dictatorship of data.” [emphasis mine]
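
It is worth seeing how easily such spurious correlations arise. In the toy example below, forty series of pure random noise are scanned pairwise and the “strongest” relationship is reported; with enough variables, something always looks linked. The setup is entirely illustrative.

```python
# How spurious correlations arise: generate forty series of pure random noise,
# scan every pair, and report the strongest apparent relationship. Nothing
# here is causally (or even genuinely) related; the "link" is chance.
import itertools
import random
import statistics

random.seed(42)
noise = {f"v{i}": [random.gauss(0, 1) for _ in range(30)] for i in range(40)}

def pearson(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

best = max(
    ((a, b, pearson(noise[a], noise[b])) for a, b in itertools.combinations(noise, 2)),
    key=lambda t: abs(t[2]),
)
print(f"Strongest 'link' among pure noise: {best[0]} ~ {best[1]}, r = {best[2]:+.2f}")
```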

Job creation is another challenge that Garrett identifies—and it lacks an easy answer:

Algorithms are eliminating many jobs, and this is expected to be a long-term, structural trend. The jobs that will be lost are not only those of assembly-line workers and those doing menial, repetitive tasks, but also professionals such as lawyers, doctors, psychiatrists, and writers as well as those of truck drivers and musicians. … The long-term possible implications are the elimination of jobs by algorithms with a widening gap between the top and the bottom of society and increasing “technological unemployment” as many well-paying jobs just disappear. Whether they will be replaced by entirely new, well-paying jobs is uncertain and worrisome to many analysts. [emphasis mine]

Security is a final key concern with algorithms, especially for America’s critical infrastructure. The chance that hostile actors will undermine our electronic infrastructure is very real. Algorithms also introduce greater fat-tail risk: most disruptions will be minor, but the rare severe ones can see damage accumulate rapidly and quickly get out of hand.
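
To illustrate what a fat tail means in practice, the toy comparison below contrasts a thin-tailed model of disruption sizes with a heavy-tailed one. The distributions, parameters, and threshold are assumptions chosen purely for illustration, not figures from the report.

```python
# Toy contrast between a thin-tailed and a heavy-tailed model of disruption
# sizes. Both produce mostly small losses, but extreme losses are far more
# likely under the heavy-tailed model. Distributions, parameters, and the
# "catastrophic" threshold are illustrative assumptions only.
import random

random.seed(1)
N = 200_000
thin_tailed = [abs(random.gauss(0, 1)) for _ in range(N)]          # normal-like losses
heavy_tailed = [random.paretovariate(1.5) - 1 for _ in range(N)]   # Pareto losses

threshold = 10  # an arbitrary "catastrophic" loss level
for name, losses in (("thin-tailed", thin_tailed), ("heavy-tailed", heavy_tailed)):
    p_extreme = sum(loss > threshold for loss in losses) / N
    print(f"{name:>12}: P(loss > {threshold}) ≈ {p_extreme:.5f}")
```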

The response to this concern should be to build greater resilience into our critical infrastructure. One way to do that is to purposely set up unconnected systems that “require a separate infrastructure to instrument and monitor them.” Not only would such systems act as a fail-safe, but they would also enable continuous testing and checkups of the radically complex software programs running these algorithms. The irony, of course, is that precisely because algorithms are so effective at drawing connections, they require even more investment in systems they cannot connect to.

The gains from algorithms, from realizing the full potential of basic software systems to enabling the spread of our most advanced networks, are so large that I doubt any of these concerns will stand in their way. Our modern world is supported by algorithms, and there is no turning back.