Regardless of the economic restoration or return to national greatness that politicians have been promising recently, the coming years will be ones of profound disruption and displacement. The macroscopic trends taking hold all around the world won't be undone; they are here to stay. These changes, ushered in by technology's growth in leaps and bounds, promise to make the rest of this century an interesting one.
Our efforts to prop up protectionist policies and other counterproductive measures are, so to speak, futile: they will serve only as temporary plugs in the fractures of a dam that's about to burst. To continue the metaphor, the embankment is our current way of living, thinking, and being, while the changes of the 21st century are the swelling currents that threaten the integrity of the whole foundation.
In Thank You for Being Late, Thomas Friedman tackles these dizzying changes head-on. He argues that humans' ability to adapt and cope is being outpaced by a "supernova" powered by a trio of accelerating forces: technology, the market, and climate change. Together, they have put us squarely in an "Age of Accelerations."
Moore's Law—the observation that the number of transistors on a computer chip, and with it computing power, doubles roughly every two years—is the key here, for technology. Gordon Moore, the Intel co-founder for whom the law is named, couldn't have known how accurate his conjecture would prove: half a century later, computers are far higher performing, more energy efficient, and cheaper.
This sort of hyper-growth is happening across technology. As of 2015, there were about 4.9 billion devices connected to the Internet of Things (IoT), a term that refers to the interconnection of common objects—cars, washers, and other smart devices—with each other and the Internet. By 2020, that number is projected to reach 12.2 billion, meaning there will be more internet-connected devices than humans.
The most profound disruption will come at the hands of what seems futuristic now, but will be real soon enough. Within this century, the hypothesis goes, artificial intelligence will outstrip the intelligence humanity has accumulated over millennia. At that point, computers will become so super-intelligent that we cannot comprehend what their ascendance will mean for human civilization. Stephen Hawking is not upbeat about what it means for us. Nor are Elon Musk or Bill Gates, the latter ominously warning: "I don't understand why some people are not concerned."
And this is to say nothing of the near term. When we marvel over the newest self-driving cars and über-smart refrigerators, we forget what this means for our livelihoods. "They're taking our jobs!", a central animus that propelled the candidacy of Donald Trump, is a straw man: automation, not immigration or trade, has been the central reason for the decline in U.S. manufacturing jobs. Even more highbrow professions—doctors, lawyers, and accountants—are in the crosshairs of technology.
The fire and brimstone and doom and gloom of this column evoke images from Roland Emmerich's 2012 or the android dystopia that Will Smith takes on in I, Robot. But as Friedman also notes, these forces can offset one another for the better. Technology, for example, has extended lifespans and helped bring global poverty to unprecedented lows, despite climate change and a climbing world population.
Furthermore, robots could step in to pick up some of the slack left behind by the hollowing out of our workforce. Futurists and many in the technology industry are optimistic that the "Singularity," the moment when computer smarts best human smarts, can be harnessed for good.
What's certain is that these changes will bring disruption and displacement. They will test humanity's adaptability, especially since we cannot yet envision or conceptualize what they will mean. Despite the pretensions of some policymakers, no flip of a switch can reverse our course. We are on autopilot now, along for the ride.
Featured Image by Meg Dolan / Heights Editor