Humankind has used technology to do more with less labor for as long as we have existed. Somewhere between 2.5 and 3.4 million years ago, our early hominin ancestors began to make and use crude stone tools. It took at least another million years before they first learned to control fire; widespread use of fire for cooking is relatively recent, arising somewhere between 50,000 and 100,000 years ago.
It was about 12,000 years ago that hunter-gatherers first turned to cultivated agriculture for food. Some 11,000 years later, in the 9th century, gunpowder and early machinery marked a new technological era. A thousand years after that, in the late 18th and early 19th centuries, the industrial revolution vastly accelerated our use of technology for the production of goods. Mechanical calculators became commercially viable in the mid-1800s; early analog computers arrived about 100 years later.
I remember first using a time-share computer terminal in my high school in 1968. Personal computers became commercially viable in the early-to-mid 1980s, with the World Wide Web coming into its glory in the following decades. The new millennium, beginning only 17 years ago, was marked by widespread adoption of the smartphone, which seemed like a tremendous upgrade from the old PDAs. We bragged that our phones now had more computing capacity than the computers that guided moonshots. Computer networks, along with new and cheaper storage and processing, drove an explosion in computing capacity. Now we are on the cusp of a new revolution based on quantum computing.
Throughout this several-million-year history, each technological revolution was met with both
excitement and dread. Each time, the social, economic, and environmental impacts were
profound. Disruptions were real. Family life changed. Work changed. Community changed.
Political structures changed. Wars grew in scope and devastation. People migrated. There
were winners and losers. And yet, somehow, the species adapted. It took time to adjust, but we
survived and thrived.
Today’s artificial intelligence revolution isn’t leaving us much adjustment time.
In 1988, Shoshana Zuboff, a Harvard Business School professor, wrote In the Age of the Smart Machine: The Future of Work and Power. She described how the new machines could be used to empower—she called that “informate”—or to control, demean, and impoverish. Her book
echoed the challenges faced in all our prior technological revolutions and foreshadowed the
artificial intelligence debate today: Will technology create opportunities that we cannot even
imagine and free us to pursue more lofty ambitions or take away our jobs and livelihood, our
privacy, and our very dignity?
We have already experienced significant manufacturing job losses due to automation. A Ball
State University study attributes 85 percent of U.S. manufacturing job losses to robotics and other
productivity improvements, and only 13 percent to trade.
The impacts are now moving to technical, analytical, and even managerial roles. According to a
study from Oxford University, 47 percent of all U.S. jobs are vulnerable to automation over the next 10 to 20 years. Consider this observation by software developer Martin Ford, quoted in a
December New Yorker review of his recent book: “A computer doesn’t need to replicate the
entire spectrum of your intellectual capability in order to displace you from your job; it only
needs to do the specific things you are paid to do.”
James Manyika, a senior partner at McKinsey and Company, agrees: almost half of the activities for which we pay people about $16 trillion in wages in the global economy could be automated using currently demonstrated technology. He believes that the most
automatable activities involve data collection and manipulation as well as physical work in
predictable environments including manufacturing, food services, transportation, warehousing,
and retail. These sectors make up 51 percent of U.S. employment activities and $2.7 trillion in U.S.
wages. And he adds that about 25 percent of what CEOs do, such as analyzing reports and data, could be automated.
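To see the scale of those figures side by side, here is a quick back-of-envelope sketch in Python. The inputs are only the numbers quoted above (the roughly $16 trillion in global wages, Manyika’s “almost half,” and the $2.7 trillion in U.S. sector wages); the calculation is illustrative, not part of the McKinsey analysis itself.

# Back-of-envelope arithmetic using only the figures quoted in this post.
global_wages_trillions = 16.0    # wages paid globally for the activities studied
automatable_share = 0.5          # Manyika's "almost half"
us_sector_wages_trillions = 2.7  # wages in the most-automatable U.S. sectors
us_employment_share = 0.51       # those sectors' share of U.S. employment activities

automatable_wages = global_wages_trillions * automatable_share
print(f"Potentially automatable global wages: ~${automatable_wages:.0f} trillion")
print(f"Most-automatable U.S. sectors: {us_employment_share:.0%} of U.S. employment "
      f"activities, ${us_sector_wages_trillions} trillion in U.S. wages")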
Manyika believes that in the short to medium term, more jobs will be changed than fully automated away. And several other experts observe that even the eliminated jobs may be balanced by those created in enterprises yet to be invented. Consider, for example, that
Google didn’t exist 20 years ago, and now its parent company Alphabet has a market cap of
over $575 billion.
Elizabeth Kolbert comments in her New Yorker book review: “Picture the entire industrial
revolution compressed into the life span of a beagle.” Is this a good thing? More specifically, to
paraphrase Zuboff, will the AI revolution “informate” or denigrate? That question will be the topic
of the next blog in this series.
—Barton Alexander, Principal,
Alexander & Associates LLC