A Wicked Orthodoxy
Global-warming alarmism is not merely irrational. It is wicked.



There is, indeed, an accepted scientific theory that I do not dispute and that, the alarmists claim, justifies their belief and their alarm.

This is the so-called greenhouse effect: the fact that the earth’s atmosphere contains greenhouse gases (of which water vapor is overwhelmingly the most important, but carbon dioxide is another) that, in effect, trap some of the heat we receive from the sun and prevent it from escaping back into space.

Without the greenhouse effect, the planet would be so cold as to be uninhabitable. But, by burning fossil fuels — coal, oil, and gas — we are increasing the amount of carbon dioxide in the atmosphere and thus, other things being equal, increasing the earth’s temperature.

But four questions immediately arise, all of which need to be addressed, coolly and rationally.

First, other things being equal, how much can increased atmospheric CO2 be expected to warm the earth? (This is known to scientists as climate sensitivity, or sometimes the climate sensitivity of carbon.) This is highly uncertain, not least because clouds have an important role to play, and the science of clouds is little understood. Until recently, the majority opinion among climate scientists had been that clouds greatly amplify the basic greenhouse effect. But there is a significant minority, including some of the most eminent climate scientists, who strongly dispute this.
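To put a number on what is being disputed (a standard textbook formulation, not part of the original argument): the radiative forcing from raising the CO2 concentration from C₀ to C is commonly approximated by a logarithmic law, and climate sensitivity is the equilibrium warming that results from a doubling:

\[
\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
\qquad
\Delta T_{\mathrm{eq}} = \lambda\, \Delta F ,
\]

so that a doubling of CO2 yields \(\Delta F \approx 5.35 \ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}\). The disagreement described above is over the feedback parameter \(\lambda\): whether clouds and water vapor make it large (strong amplification of the basic effect) or small.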

Second, are other things equal, anyway? We know that, over the millennia, the temperature of the earth has varied a great deal, long before the arrival of fossil fuels. To take only the past millennium: a thousand years ago we were benefiting from the so-called medieval warm period, when temperatures are thought to have been at least as warm as, if not warmer than, they are today. And during the Baroque era we were grimly suffering the cold of the so-called Little Ice Age, when the Thames frequently froze in winter and substantial ice fairs were held on it, as immortalized in contemporary prints.

Third, even if the earth were to warm, so far from that necessarily being a cause for alarm, does it matter? It would, after all, be surprising if the planet were on a happy but precarious temperature knife-edge, from which any change in either direction would be a major disaster. In fact, we know that, if there were to be any future warming (and, for the reasons already given, “if” is correct), there would be both benefits and what economists call disbenefits. I shall discuss later where the balance might lie.

And fourth, to the extent that there is a problem, what should we, calmly and rationally, do about it?

It is probably best to take the first two questions together.

According to the temperature records kept by the UK Met Office (and other series are much the same), over the past 150 years (that is, from the very beginnings of the Industrial Revolution), mean global temperature has increased by a little under a degree centigrade — according to the Met Office, 0.8 degrees Celsius. This has happened in fits and starts, which are not fully understood. To begin with, to the extent that anyone noticed it, it was seen as a welcome and natural recovery from the rigors of the Little Ice Age. But the great bulk of it — 0.5 degrees Celsius out of the 0.8 degrees Celsius — occurred during the last quarter of the 20th century. It was then that global-warming alarmism was born. 
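For scale (my arithmetic from the figures just quoted, not the Met Office’s): that late-20th-century burst corresponds to a warming rate of

\[
\frac{0.5\ ^{\circ}\mathrm{C}}{25\ \text{years}} = 0.2\ ^{\circ}\mathrm{C}\ \text{per decade},
\]

a figure worth keeping in mind when the more recent trend is quoted below.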

But since then there has been no further warming at all. This is wholly contrary to the expectations of the overwhelming majority of climate scientists, who had confidently predicted that global warming would not merely continue but accelerate, given the unprecedented growth of global carbon emissions as China’s coal-based economy grew by leaps and bounds. To be precise, the latest report of the Intergovernmental Panel on Climate Change (IPCC), a deeply flawed body whose non-scientist chairman is a committed climate alarmist, reckons that global warming has latterly been occurring at the rate of — wait for it — 0.05 degrees Celsius per decade, plus or minus 0.1 degrees Celsius per decade. Their figures, not mine. In other words, the observed rate of warming is less than the margin of error.
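Spelled out (again my arithmetic, using the IPCC’s own numbers), the quoted trend and its margin of error define an interval that includes zero:

\[
0.05 \pm 0.1\ ^{\circ}\mathrm{C}\ \text{per decade}
\;\Longrightarrow\;
\left[-0.05,\; +0.15\right]\ ^{\circ}\mathrm{C}\ \text{per decade},
\]

which is to say that, on the IPCC’s own figures, no warming at all lies comfortably within the stated uncertainty.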

And that margin of error, it must be said, is implausibly small. After all, calculating mean global temperature from the records of weather stations and maritime observations around the world, of varying quality, is a pretty heroic task in the first place. Not to mention the fact that there is a considerable difference between daytime and night-time temperatures. In any event, to produce a figure accurate to hundredths of a degree is palpably absurd.

The lessons of the unpredicted 15-year global temperature standstill (or “hiatus,” as the IPCC calls it) are clear. In the first place, the so-called Integrated Assessment Models that the climate-science community uses to predict the global temperature increase likely to occur over the next 100 years are almost certainly mistaken: climate sensitivity is significantly lower than was once thought, and the models therefore exaggerate the likely temperature rise.

But the need for a rethink does not stop there. As the noted climate scientist Professor Judith Curry, chairwoman of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, recently observed in written testimony to the U.S. Senate:

Anthropogenic global warming is a proposed theory whose basic mechanism is well understood, but whose magnitude is highly uncertain. The growing evidence that climate models are too sensitive to CO2 has implications for the attribution of late-20th-century warming and projections of 21st-century climate. If the recent warming hiatus is caused by natural variability, then this raises the question as to what extent the warming between 1975 and 2000 can also be explained by natural climate variability.

It is true that most members of the climate-science establishment are reluctant to accept this, and argue that the missing heat has for the time being gone into the (very cold) ocean depths, only to be released later. This is, however, highly conjectural. Estimates of the mean temperature of the ocean depths are — unsurprisingly — far less reliable even than the surface temperature record. And in any event, most scientists reckon that it will take thousands of years for this “missing heat” to be released to the surface.

In short, the CO2 effect on the earth’s temperature is probably less than was previously thought, and other things — that is, natural variability and possibly solar influences — are relatively more significant than has hitherto been assumed.


