For the last twelve years, and to some extent for the last quarter-century, the United States has been living under a regime of artificially low interest rates. The Federal Reserve’s policy of easy money has inflated asset prices and widened inequality. Most important, as has been the case in other countries that have done the same thing, it has caused productivity growth to languish, making us poorer and making our economic problems harder to solve. It is time for this policy to be reversed, and rates returned to their “natural” level of about 2 percent above inflation.
Most supporters of a free economy would agree that price fixing is bad. They cite the example of Gosplan, the Soviet planning agency, whose activities caused huge shortages of goods and massive economic inefficiency. Yet when it comes to interest rates, the most important price in the economy, they support a system under which an unaccountable central bank sets rates at the level it wants, without reference to market supply and demand. Since the dollar lacks the fixed anchor it had under the gold standard, its value floats against other currencies and the central bank is free to set rates wherever it wants them.
Since high interest rates slow economic activity in the short term and low rates stimulate it, many central banks have a bias toward setting rates as low as possible. These days, in many countries, they will even distort the banking system so as to set interest rates below zero, and some are planning ways of pushing all payments onto electronic mechanisms, thus removing the alternative of holding physical cash, an alternative to which the populace turns whenever rates go more than marginally negative.
In the United States, the move to ultra-low interest rates began with Alan Greenspan in February 1995, when he abandoned a money-tightening phase and began to ease. Stock prices immediately began to soar and went on doing so. In December 1996, Greenspan decried the “irrational exuberance” in the market, but then lowered rates further. After 2000, President George W. Bush wanted to shorten the natural recession that followed the dot-com bubble, so Greenspan held interest rates artificially low, at the then-unprecedented level of 1 percent. The result was a gigantic housing bubble. After 2008 Ben Bernanke, a more committed low-rates man than Greenspan, inaugurated the policy of zero interest rates and asset purchases. We have been living under that regime ever since, except for a brief period in 2017–19 when Fed chairman Jay Powell attempted to move back to normal policies.
Artificially low interest rates cause inflation, as has been understood for many decades and was demonstrated in the 1970s. Yet not all inflation appears in official price statistics. In 1995–2000, inflation went into stocks, which reached valuations never before dreamed of. In 2002–07, it went into housing. Since 2009, it has gone into assets of all kinds. To show the level of distortion: If the Dow Jones Industrial Average were to trade at the same level as in February 1995, adjusted for both consumer price inflation and economic growth, it would today trade around 11,000. The difference between that and the Dow’s current level of 27,000 reflects the effect of ultra-low interest rates over a quarter-century.
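The adjustment described above can be sketched with rough numbers. All inputs here are illustrative assumptions rather than official figures: the Dow stood near 4,000 in February 1995, and cumulative consumer-price inflation and real economic growth since then are each taken to be on the order of 65–70 percent.

```python
# Back-of-the-envelope version of the Dow adjustment described above.
# Every input is an approximate, illustrative assumption:
dow_feb_1995 = 4_000        # rough index level, February 1995
cpi_factor = 1.70           # assumed cumulative consumer-price inflation since 1995
real_growth_factor = 1.65   # assumed cumulative real economic growth since 1995

adjusted_dow = dow_feb_1995 * cpi_factor * real_growth_factor
print(round(adjusted_dow))  # lands in the neighborhood of 11,000
```

With those assumed inputs the adjusted level comes out near 11,000, consistent with the figure quoted; more precise official series would shift the result somewhat but not the order of magnitude.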
So far, it may seem that low rates have done little harm. Yes, real-estate and asset prices are much higher than in 1995, and that has made inequality much worse, but inflation remains modest and few other ill effects are immediately apparent. However, the negative consequences of artificially low interest rates are all too visible when you look at productivity growth. By keeping rates far below their natural level for a decade or more, central bankers have encouraged a mass of ill-advised investment. Not only does this suck up resources; it also starves entrepreneurs and small businesses, which have less access to cheap capital than big businesses do.
From the late 18th century to 2007, Western economies could rely on a steady, even increasing, rate of productivity growth. At less than 1 percent a year in the decades before 1850, productivity growth gradually accelerated in the late 19th and early 20th centuries, reaching a high of 2.8 percent annually in the United States in the quarter-century before 1973. I suspect that the arrival of the EPA and OSHA on the scene in the early 1970s at least contributed to a sharp downturn in the exuberant growth in U.S. productivity, but even that doesn’t explain the full decline to 1.8 percent annual growth over the 1973–2010 period.
Other countries increased their productivity at rates similar to the U.S. rate in 1990–2007. Britain did slightly better at 2 percent annually, while Japan, despite its slump from 1990, did better still, increasing productivity at 2.3 percent annually. Thus, there was no sign of secular slowing in productivity growth except perhaps in the European Union, where annual productivity growth stood at 1.4 percent.
After 2007, the picture changed drastically. After a couple of years of recovery from the financial crisis, U.S. productivity growth in 2010–19 was a sluggish 0.9 percent annually. It fell to 0.6 percent in the zero-rate years to 2016, then improved to 1.4 percent annually in 2017–19, as interest rates were brought closer to normal. Even so, that beat everywhere else. Euro-zone productivity growth in 2007–19 was 0.6 percent annually, and British productivity growth was even lower at 0.4 percent, while Japanese productivity growth was an appalling minus 0.4 percent annually.
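The stakes in these small-looking percentages come from compounding. A quick sketch, using the growth rates quoted above and a hypothetical 25-year horizon with productivity normalized to 100 at the start, shows how far apart the pre-1973 and post-2010 paces pull:

```python
# Illustrative compounding of the annual productivity growth rates quoted
# in the text. The 25-year horizon and starting level of 100 are arbitrary
# choices for the sake of the comparison.
def compound(rate_pct: float, years: int, start: float = 100.0) -> float:
    """Level reached after compounding an annual growth rate for some years."""
    return start * (1 + rate_pct / 100) ** years

fast = compound(2.8, 25)  # pre-1973 U.S. pace: roughly doubles
slow = compound(0.9, 25)  # 2010-19 U.S. pace: roughly a quarter gained
print(round(fast), round(slow))  # 199 125
```

At 2.8 percent a year, living standards roughly double in a generation; at 0.9 percent, they gain only about a quarter, which is the gap the following paragraph draws its lesson from.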
The lesson is clear. Apart from their unpleasant social effects, artificially low interest rates kill productivity growth and thus prevent the improvements in living standards on which we have come to rely. Far from keeping rates at zero until the end of 2022, as it has promised, the Fed must quickly raise rates to their historically normal level, some 2 percent above the rate of inflation. Meanwhile, as I wrote a few weeks ago, that rate of inflation may provide a nasty shock of its own.