Mathematical models, that is.
Over at the Prudent Bear, Martin Hutchinson makes an interesting analogy between the financial crisis and climate alarmism:
We’ve now lived through the same new disaster twice. Computer simulations, more or less universally adopted as the solution to a major problem, turned out to have been based on flawed assumptions and faulty data. As a result policy or markets became heavily skewed in an inappropriate direction. Wall Street’s risk managers and climate change scientists both acted as super-salesmen for a paradigm that turned out to be flawed. . . .
The possibility that excess carbon dioxide, through a “greenhouse effect,” might cause a global rise in temperature is based on well-established chemistry and physics. Deniers of the possibility of global warming are thus being as irrational as the extreme eco-alarmists; global warming is indeed possible because of physical and chemical processes that are perfectly well understood, indeed fairly elementary.
The difficulty arises in estimating whether it is actually happening. The rise in temperatures so far observed is well within the level of “noise” in global temperatures over a period of a century or so, let alone the more extreme fluctuations that have taken place when the observation period is extended to millennia. It is thus necessary to match the very limited temperature data we have, stretching back no more than a century on a worldwide basis, with secondary observations of such things as tree rings and ice cores, synthesizing the result with a computer model of what is believed to be the carbon forcing process in order to predict the range of possible future warming effects.
This is of course a very similar process to that undertaken by Wall Street’s rating agencies and risk managers. Assumptions and simplifications are made, without which it would be impossible to construct a model. Then the model is matched up against a few years’ observations in real time, being “tweaked” as real data comes in that does not quite fit with it. By the time this has been done, careers have been invested in the model, institutions have been built around its predictions and eminent people have become enthralled by its results. It thus takes on the appearance of a scientific reality as solid as Newtonian mechanics.
The shakiness of the mathematics underlying the global warming “consensus” was highlighted by the recent “Climategate” e-mails and computer tapes. Like Wall Street risk managers, climate scientists pooh-poohed the obvious flaws in the assumptions underlying their mathematical models. Like Wall Street bankers, they asserted a consensus behind those models – in Wall Street’s case, to win from regulators a profitable loosening of their leverage limits; in climate scientists’ case, to persuade politicians to provide them with hugely profitable research opportunities and capital for their “new energy” start-ups. Like Wall Street traders, they rejected any modifications of the models that had served them well, and pushed those models to their outer limits, to trade ever more exotic derivatives, or to justify ever more alarmist predictions of climate change.
The denouement in both cases may also turn out to be similar. In Wall Street’s case, the faulty models have led to losses in the financial system totaling in excess of $1 trillion. In the climate scientists’ case, the precise degree of error in their assumptions is not yet apparent. It is only clear that dubious methods were used to cover up the flaws in their models and observations, and that the more extreme predictions (“6 degrees Celsius by 2100”) were made up out of whole cloth to justify gargantuan economy-destroying projects of government control.
There’s a decent case to be made for doing something to curb carbon-dioxide emissions, if only as an insurance policy. But when we try to work out how high a premium we should be prepared to pay, it is vital to recognize what we don’t know as well as what we do.