The Corner

Where Deficits Come From

I’ve been writing and talking a fair amount about President Eisenhower lately, and the question of Eisenhower-era tax rates always comes up, as indeed it did today with a claim from a lightly informed commenter. 

In brief:

Assuming you did not take any of the ample opportunities to game the Eisenhower-era system, people with very high incomes did indeed pay a higher effective tax rate in the 1950s than they would today. (Interesting tax simulator here.) At $100,000 a year, the 2012 effective federal rate (about 21 percent) is not radically different from the 1955 effective rate (about 28 percent) on the inflation-adjusted equivalent income, though the 1955 rate is significantly higher. At $10 million in 2012 dollars, you see a large difference: 35 percent vs. 90 percent. Strangely, when Ike got his million-dollar payday for his memoirs (around $10 million in real terms), he did not pay the 90 percent rate for which he might theoretically have been liable; like many high earners of the era, he saw to it that his windfall was taxed as a capital gain. Similarly, if you jacked up income-tax rates to 70 or 90 percent today, you could be confident that people earning $100 million would find a way to restructure their income as capital gains, become overseas-domiciled corporations, or, like Tina Turner and a record number of Americans last year, become citizens of other countries. (Tina Turner, American icon, is a Swiss national.)

If you think high top tax rates are a good thing in and of themselves, you should make that argument, but don’t pretend that lower statutory rates are why we have deficits today.  

To get a better idea of how much we were and are paying in taxes, forget the statutory top rate, which ignores a million variables (among them, the fact that only 10,000 taxpayers out of 46 million in 1958 paid the 91 percent top rate or the 81 percent second-highest rate) and look instead at the actual revenue collected as a share of GDP. In 1955 and in 2013, it was nearly the same: about 16.5 percent in 1955 vs. an expected 17 percent in 2013, according to the CBO. (In case you read that too quickly: We’re collecting more revenue today as a share of GDP than we did when the top rate was 90 percent.) The difference is that federal spending was about 17 percent of GDP in 1955 (small deficit) but will be about 21 percent of GDP in 2013 (large deficit). Our spending habits were not exactly spartan in the 1950s: the Cold War military buildup and the growing welfare state ensured that spending was far higher than it had been in the pre-war era, though lower than during the war years themselves.

Which is to say, if we had Eisenhower-era revenue today but 2013 levels of spending, we could expect a deficit of about $720 billion, about what the White House is expecting this year. Similarly, the Reagan-era deficits corresponded with relatively high levels of spending (by Democratic Congresses) rather than with remarkably low levels of tax collection. The deficit in 1983, the largest of the Reagan years, was 6 percent of GDP; spending was 23.5 percent and taxes were 17.5 percent. Which is not to say revenue variations do not matter: The 2000 surplus resulted from the Gingrich Congress cutting spending from 21 percent of GDP in 1994 to 18.2 percent in 2000, while tax collections rose from 18 percent to 20.6 percent. Those higher tax collections were the result of many factors, but a 90 percent top rate was not among them.
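The back-of-the-envelope arithmetic above is easy to check: a deficit (or surplus) is just spending minus revenue, each measured as a share of GDP. A minimal sketch, with the caveat that the roughly $16 trillion figure for 2013 GDP is an assumption supplied here and not stated in the text:

```python
def deficit_share(spending_pct, revenue_pct):
    """Deficit (positive) or surplus (negative) as a percentage of GDP."""
    return spending_pct - revenue_pct

# 2013 spending (~21% of GDP) paired with Eisenhower-era revenue (~16.5%)
gap_pct = deficit_share(21.0, 16.5)        # 4.5 percentage points
gdp_2013 = 16e12                           # assumed ~$16 trillion 2013 GDP
print(gap_pct / 100 * gdp_2013 / 1e9)      # roughly 720, i.e. ~$720 billion

# 1983: 23.5% spending, 17.5% revenue -> 6-point deficit
print(deficit_share(23.5, 17.5))

# 2000: 18.2% spending, 20.6% revenue -> negative, i.e. a surplus
print(deficit_share(18.2, 20.6))
```

The same subtraction reproduces each figure in the paragraph: the hypothetical $720 billion gap, the 6 percent 1983 deficit, and the 2000 surplus.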

Conclusion: The most important variable in play is not the top tax rate, but the level of spending. 

Kevin D. Williamson is a former fellow at National Review Institute and a former roving correspondent for National Review.