Karl Marx would have welcomed the advent of our new robot overlords as a trigger for revolution, though one more upscale than he’d hoped for: A rising not of, or for, the working class, but by the well educated and ambitious, furious at being denied what they see as their fair share of the pie. The meek will never inherit the earth; clever people with a grudge just might.
To understand why “robots” — sexy, sinister shorthand for the increasing automation of work — might drive them to try, “elite overproduction” (a phrase coined by the University of Connecticut’s Peter Turchin) is an excellent place to start. To put it more crudely than Professor Turchin ever would, this occurs when members of the elite (or those with the talents to join it) become too numerous for society to accommodate their aspirations.
Turchin can stretch this concept too far, but he’s correct that it can be a useful indicator of trouble to come. Thus, as he noted in 2012, the Arab Spring was preceded by “a remarkable expansion of the numbers of university-educated youths without job prospects” — in other words, by elite overproduction.
According to Turchin, elite overproduction can cause such fierce competition within the elite that the old order risks being pulled apart. Perhaps that’s so, but there may be a simpler way to look at this. Oppressed masses generally stay oppressed. They may smolder, but it takes the bright to spark a revolution. And if the bright feel they are missing out, that’s what they will be tempted to do.
After the Arab Spring, Occupy: Many of its activists were young and university-educated (“elite aspirants,” in Turchin’s terminology) and enraged by the shambles that (as they saw it) greedy bankers had created, a shambles that threatened their chances of a comfortable future — not that they would have put it quite so selfishly. Even the name “Occupy” evoked a struggle for territory, a struggle that took physical form in places such as Manhattan’s Zuccotti Park, but the tent cities downtown were little more than metaphor. The Occupiers’ ambitions went beyond a scrap of real estate: They wanted to take over the political and economic space allegedly held by the “1 percent” they demonized so effectively. Stripped of the revolutionary rhetoric, this was a contest to define the next elite, a contest intended to move from the streets to the legislature — an option, of course, available in America’s democracy but elusive in Egypt.
Occupy’s demonstrations soon faded, but the ideas they represented live on, their persistence testimony to deeper fears about what lies ahead. The suspicion that the American economy is faltering — stagnant incomes, growing structural unemployment, and all the rest — is not new, but up to now those on the way up, or those who were already doing well, have reassured themselves that blue-collar woe was nothing to do with them. Joe Lunchbucket — that slowpoke — just hadn’t kept up. But that complacency is fading, and with reason. To be sure, the college-educated have an edge in the workplace, and that advantage has grown; but, as a benchmark, high school these days is a low bar. Over a third of 25- to 32-year-olds in 2013 had a bachelor’s degree (or above), up from one-eighth in 1965.
A degree is still a route to higher earnings, but it’s not a guarantee. The labor market is not Lake Wobegon: If a third of new entrants to the work force are university graduates, they won’t all be above average, especially those who attended one of academe’s less leafy groves. Their degrees will be the equivalent of the high-school diplomas of half a century ago, a ticket to the ballpark, not the VIP suite. For many graduates, gently shepherded through often undemanding schoolwork and gently burdened with a monstrous debt, dreams will turn into nightmares. There will be no place for them on the track to success. Their expectations were unrealistic, but their disappointment will be real. If their teachers haven’t already radicalized them, life may do the trick.
They will probably find work, but very possibly not of the type they were hoping for. The New York Fed concluded that in 2012 nearly half of all recent graduates were in jobs for which they were, in theory, overqualified. The lingering aftermath of the Great Recession hasn’t helped, but underemployment among recent graduates, the cohort that first Occupied and then felt the Bern, has been on a rising trend since 2000. The New York Fed recounted how “during the first decade of the 2000s, many college graduates were forced to move down the occupational hierarchy to take jobs typically performed by lower-skilled workers.”
Rubbing salt into Millennial wounds, there’s more “under” nowadays in underemployment. The New York Fed divided “non-college” jobs into “good” (“career-oriented, relatively skilled, and fairly well compensated”) and “low-wage.” The share of those stuck in the latter, such as the college-educated barista of contemporary cliché, has risen. Put this all together and it looks a lot like elite overproduction, and the “gig economy,” a hipster euphemism for part-time piece-work, won’t fill the gap.
It’s not clear what will. The information-technology revolution, once seen as a cornucopia of new, well-paid employment, rolls on, and, as revolutions do, it is eating its own. For instance, many IT jobs have disappeared into the Cloud. In his terrifying Rise of the Robots (2015), Martin Ford tells how, thanks to Facebook’s Cyborg software, “a single technician [can] manage as many as 20,000 computers.” Ford points to a 2013 analysis by the Economic Policy Institute that showed that “the number of new graduates with engineering and computer science degrees exceeds the number of graduates who actually find jobs in these fields by 50 percent.” If education — that perpetual panacea — is no longer the answer, what is?
In Player Piano (1952), Kurt Vonnegut depicts an America in which most jobs have been automated away. The country is split between a large underclass and an elite made up of “managers and engineers and civil servants and a few professional people.” In one passage, a member of the elite explains “how the First Industrial Revolution devalued muscle work, then the second one devalued routine mental work.” He is asked whether there will be a third. He replies that it’s under way and that it involves “thinking machines . . . machines that devaluate human thinking . . . the real brainwork.” He doesn’t, he adds, want to be around to see where it will lead. He would not like the looks of 2016.
By replacing brain as well as brawn, technology is encroaching into ever more elevated areas of employment, menacing those who have good jobs as well as those who are merely searching for them. Real brainwork will be industrialized, subdivided into discrete parts that can be either performed more efficiently or, with the help of algorithms, automated altogether. Ask securities traders what this has meant for them. Despite the strong recovery in financial markets since the late unpleasantness, Wall Street employs fewer people than it did in 2007, and many of the jobs that remain are at risk. At its core, the business of finance is about the organization, manipulation, and exploitation of data, and that’s what software is for. As those algorithms increase in sophistication, they will supply not only intelligence but judgment, vetting customers, spotting opportunities, managing portfolios. Wall Street culls used to be focused mainly on clerical staff; now the “front office” is sharing in much more of the pain.
Lawyers are facing a similar fate. Search engines long ago simplified the trudge through case law. Now other technologies are coming into play: predictive coding, which speeds up the discovery process by determining the relevance (or otherwise) of a particular document; software that prepares basic documentation; and, soon, programs that advise on the winnability of simple lawsuits. Unemployment among law-school graduates is bad enough as it is. Either it will get worse or the Paper Chase will have far fewer participants: Another gateway to the elite narrows.
And doctors shouldn’t feel smug. Ever more sophisticated data-sorting technology is already leading to more accurate diagnoses, and if it is not yet suggesting more effective treatments, it soon will be. Thus IBM’s Watson, a “cognitive system” that has long since moved on from its Jeopardy! triumph, has now branched out into areas that include medicine. IBM Watson Health, a smarter-than-Sherlock Doctor Watson, is, claims IBM, “pioneering a new partnership between humanity and technology with the goal of transforming global health.” Initially, such advances will deliver no more than an electronic — and unusually erudite — second opinion, but ultimately? And in the meantime, increasing reliance on technology will see a gradual de-skilling of a profession that has long ranked high in the social scale. A decline in pay will not be far behind.
If medicine, finance, and law, three great pillars of the modern elite, are coming under siege from the machines, it’s not unreasonable to ask how much room is going to be left at the top. An additional twist of the knife comes from communications technology. Not only will brainwork be industrialized, but much of it could easily be “exported” to telecommuters based in, say, China and India. Even the possibility that this might happen will drag the wages of the formerly valuable still farther down.
It’s no secret that inequality has widened throughout much of the West (and that automation has contributed to this). What’s less well known is how that inequality is sharpening at the top. In The Second Machine Age (2014), Erik Brynjolfsson and Andrew McAfee cite research showing that the top 5 percent took 80 percent of the increase in America’s wealth between 1983 and 2009, but the top 1 percent took “over half of that, and so on for ever-finer subdivisions of the wealth distribution.” The middle classes are trailing the upper middle classes, the upper middle classes are falling farther behind the rich, and the rich are lagging the very rich, a process that is likely to accelerate. This is more than a matter of technology eliminating or downgrading previously lucrative work. Technology also broadens access to the skills of the most talented. Their rewards rise. But it reduces demand for the services of the runners-up, the able but not quite able enough. Their rewards fall. TurboTax, for example, has enriched its creators, but has been rather less than splendid news for your local CPA.
Of course, new technology frequently requires significant capital investment. Much of the wealth it generates will go to those who can provide the cash. “For whosoever hath, to him shall be given,” as someone once said. And for “whosoever hath not, from him shall be taken away even that he hath.” The winner’s circle will shrink, leaving growing numbers of the talented stranded outside.
If the alarm bells are ringing, they are, so far, being heard by comparatively few. A 2015 survey by the Pew Research Center revealed that “65% of Americans expect that within 50 years robots and computers will ‘definitely’ or ‘probably’ do much of the work currently done by humans,” but “an even larger share (80%) expect that their own jobs or professions will remain largely unchanged.” Younger (18- to 29-year-old) Americans — iCocooned perhaps — are even more optimistic despite their deteriorating employment outlook, as are the better paid, and those working in the “government, education and nonprofit sectors.” They are all in for a nasty surprise, and in rather less than 50 years.
When Americans do finally grasp what automation is doing to their prospects, rage against the machines (or, more specifically, their consequences) will blend with existing discontent to form a highly inflammable mix. This broader economic unease is already spreading beyond left-behinds and Millennials, but when we reach the point where even those who are still doing well see robots sending proletarianization their way, there’s a decent chance that something akin to “middle-class panic” (a phenomenon identified by sociologist Theodor Geiger in, ominously, 1930s Germany) will ensue. Many of the best and brightest will face a stark loss of economic and social status, a blow that will sting far more than the humdrum hopelessness that many at the bottom of the pile have, sadly, long learned to accept. They will resist while they still have the clout to do so, and the media, filled with intelligent people who have already found themselves on the wrong side of technology, will have their back.
The endangered upper-middles will not only be talking to themselves. Tough times, and an acute awareness of how well those at the top are making out, have left the battered American working class open to a more radical rearrangement of the status quo. Technology is not solely to blame for what’s happening — far from it — but its capacity to disrupt the workplace is set to increase at an exponential rate. One Oxford study predicts that “about 47 percent of total U.S. employment is at risk” from technological change within the next couple of decades, an estimate that is less of an outlier than might be hoped. Both number and timetable have been challenged, but they give a clue about what may be at stake — and how soon. The implications aren’t pretty. Trump and Sanders may prove to be no more than canaries in the coal mine.
Every revolution, whether at the polling station or on the street, needs foot soldiers drawn from the poor and the “left behind.” Still, it’s the leadership that counts. Add the impact of automation to the effects of existing elite overproduction and the result will be that the upheaval to come will be steered by a very large “officer class” — angry, effective, efficient, a “counter-elite” (to borrow another term from Turchin) looking to transform the social order of which, under happier circumstances, it would have been a mainstay.
Some people point out (correctly) that humanity has been able to weather earlier episodes of technological transformation, and argue that it will do so again. But first they need to rebut the argument that this metamorphosis — the replacement of “brain” — really is, as none other than Charles Murray has insisted, different. Past is not always prologue: Google, that colossus of our time, now employs more than 60,000 people worldwide, still considerably fewer than the 80,000 who worked for General Motors in or around Flint, Mich., alone, in the mid 1950s. Needless to say, Google now is not strictly comparable with Flint then (a techie is more than an updated assembly-line worker), but putting those two numbers side by side acts as a poignant reminder that today’s new technology-intensive businesses do not generate jobs in the numbers that the old manufacturers used to do.
It’s also worth adding that past technological transformations sometimes led to more lasting collateral damage than we now remember. We comfort ourselves with the knowledge that the Luddites were proved wrong, but we forget that proof of that was quite a while in coming. Economic historian Robert C. Allen refers to the decades that it took for real wages to rise in Britain after the technological changes of the early 19th century as “Engels’ Pause.” That’s the same Engels who argued in The Condition of the Working Class in England (1845) that the industrial revolution had made workers worse off. Over the long term, things changed for the better, but what happened in the interim should concern those worried about the political consequences of this latest technological revolution. These were the years not just of the Luddites, but also of the Peterloo Massacre, the Swing Riots, the Tolpuddle Martyrs, and the 1842 General Strike. By the time of the Chartists, a mass movement of the working class, an explicitly political agenda had evolved alongside struggles over pay. Engels took things even further. In 1848 he co-wrote The Communist Manifesto with Karl Marx, not an encouraging thought. The robots might one day deliver almost unlimited bounty, but the road to the Star Trek economy could be very rocky indeed.
We are on a conveyor belt to what Marx described as a “plastic moment,” when old assumptions crumble and everything is up for grabs. There will be no red flag over the White House, but, writes Martin Ford, “we are ultimately headed for a disruption that will demand a far more dramatic policy response.”
That “policy response,” shaped by the demands of that “surplus” elite, will be focused on a largely fruitless (but for a few, fruitful) “war against inequality” centered on a drastic redistributive effort. Taxes will rise steeply, on capital gains as well as income, and, given time, on the mere ownership of capital: We can expect a wealth tax on the living, a foretaste of death taxes to come.
Spending will doubtless soar, on infrastructure (occasionally even sensibly) and on retraining schemes for jobs that will never be. Health care will grow ever closer to single-payer. For the upper middle class squeezed by automation, reinvented as Robin Hoods on the make, all this will combine power play (the opportunity to redistribute away the gains of their more successful competitors) with marvelous career opportunities (someone has to operate the machinery of redistribution) and, of course, claims to the moral high ground.
In all probability, the politics of redistribution will also include ever noisier calls for a universal basic income (UBI), a guaranteed payment from the state to everyone. Finland will start testing a variant of this next year, although the reliably cautious Swiss recently rejected a version of UBI in a referendum in which the effect of technology on employment played a notable role in the debate. To be fair, UBI (with careful caveats) has its supporters on the right, from Friedrich Hayek to Charles Murray, with the latter citing the rise of the robots as part of his justification: “A UBI will be an essential part of the transition to [an] unprecedented world.”
Whatever the arguments in its favor, there’s an obvious danger that a UBI could shatter what’s left of the American ideal of self-help while handing immense and unhealthy power to a state on which too many will depend for too much. Who will fix the level at which the UBI is set? Who will decide who is to pay for it? Viewed from the right, the UBI may be nothing better than the price to be paid to maintain the peace, the lesser of two upheavals. Not every revolution needs blood in the streets.
At the same time, conservatives have to face the possibility that technology will build a world in which wealth will be ever more concentrated, most of the most talented will be cast aside, and unemployment lines will lengthen relentlessly, a dark trifecta that could trash social cohesion and take democracy down with it. Hoping for the best is not the way to head off catastrophe, nor is “standing athwart history.” As to what is, I simply don’t know.