BrianGarst.com

Malo periculosam, libertatem quam quietam servitutem.

Sunday, 29 November 2009

Climategate, Part II

Posted in Energy and the Environment

The next round of Climategate is upon us. The Climategate emails exposed the unscientific behavior of those behind the so-called global warming consensus. But the emails were only part of the story. Also released was a wealth of data and computer programs, which take longer to analyze and digest. Now they are starting to paint a similarly disturbing picture:

Since these documents are more technical than the emails, however, analysis has been slower in coming. And, as in the case of the emails, there’s unmistakable evidence of fudging and book-cooking, all designed to give the impression that the warming in the twentieth century is unprecedented. The evidence is all the more damning because of the expletive-laced complaints of programmers tasked with altering code to corral unruly, unreliable, and sometimes cherry-picked data in a pre-determined direction. At one point, a poor, exasperated programmer, “Harry,” bemoans “the hopeless state of our databases.” (See telling examples and good analysis of these code notes here, here, here, and here.)

Hiding and manipulating data and code are especially serious in climate science because, as Willis Eschenbach has pointed out, “unlike all other physical sciences, [climate science] does not study things—instead it studies averages . . . This is because climate by definition is the average of weather over a suitably long period of time (typically taken as a minimum of 30 years).” So without the background information, it’s almost impossible for other scientists to verify—or falsify—your results.

…We may just now be seeing the potential for this new way of transferring and analyzing information. In Memogate, remember, we were talking about a single one-page Word document. With Climategate, we’re dealing with thousands of detailed, often technical documents. They may even have been compiled internally at the CRU in response to a Freedom of Information request and were then leaked instead. So the revenge of the nerds could be especially brutal and prolonged. Already, insights and analyses are proliferating on the climate blogosphere so quickly that it’s becoming impossible for even the best consolidators to keep up.
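Eschenbach's point about averages is easy to see in miniature. The sketch below is plain illustrative Python with made-up numbers, not CRU code or data: it builds a 30-year climatological baseline of exactly the kind his definition describes. The takeaway is that the published figure is just an average, and once the raw readings behind it are gone, nobody else can recompute or check it.

    # Toy example only: a 30-year climate "normal" computed from hypothetical
    # monthly temperature readings. The published product is an average; the
    # raw readings are what make it checkable.
    import random

    random.seed(42)

    years = range(1961, 1991)  # 30 years, the conventional baseline period
    raw_monthly = {
        (year, month): 10.0 - 8.0 * abs(month - 7) / 6.0 + random.gauss(0, 1.0)
        for year in years
        for month in range(1, 13)
    }  # hypothetical raw station data, degrees C

    # The 30-year baseline is simply the mean of all 360 monthly readings.
    baseline = sum(raw_monthly.values()) / len(raw_monthly)
    print(f"30-year baseline: {baseline:.2f} C from {len(raw_monthly)} readings")

    # A published "anomaly" is a reading minus the baseline for that month.
    jan_baseline = sum(v for (y, m), v in raw_monthly.items() if m == 1) / 30
    print(f"Jan 1990 anomaly: {raw_monthly[(1990, 1)] - jan_baseline:+.2f} C")

    # Discard raw_monthly and keep only the averages and anomalies, and none
    # of the above can be reproduced or verified by anyone else.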

On top of this comes the revelation that the raw data used to construct the temperature record has been destroyed, and all that remains are the adjusted numbers.

SCIENTISTS at the University of East Anglia (UEA) have admitted throwing away much of the raw temperature data on which their predictions of global warming are based.

It means that other academics are not able to check basic calculations said to show a long-term rise in temperature over the past 150 years.

The UEA’s Climatic Research Unit (CRU) was forced to reveal the loss following requests for the data under Freedom of Information legislation.

Some manner of adjustment of the data is expected and part of the process, but without the raw data to compare against, we can never verify the accuracy of the methodology used, or even determine whether it was a good-faith effort to correct for flaws in the raw measurements or simply cynical manipulation of the data to achieve the desired result. And on the cynical manipulation front, the evidence against the warmists is piling up.
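To make that verification point concrete, here is a minimal sketch, using hypothetical numbers rather than CRU's data or methods, of the audit that becomes possible only when the raw series survives: recover the adjustment that was actually applied and compare it to the correction that was claimed.

    # Hypothetical example: auditing a published adjustment when the raw
    # series still exists. The claimed correction is invented for
    # illustration -- a +0.3 C step from 1980 on, say for a station move.

    raw      = {1978: 9.1, 1979: 9.0, 1980: 8.6, 1981: 8.7, 1982: 8.8}  # stand-in for the discarded raw series
    adjusted = {1978: 9.1, 1979: 9.0, 1980: 8.9, 1981: 9.0, 1982: 9.1}  # the adjusted series that survives

    claimed_step, change_year = 0.3, 1980  # the documented correction

    for year in sorted(raw):
        implied = adjusted[year] - raw[year]  # adjustment actually applied
        expected = claimed_step if year >= change_year else 0.0
        verdict = "OK" if abs(implied - expected) < 0.001 else "MISMATCH"
        print(f"{year}: applied {implied:+.2f} C, documented {expected:+.2f} C -> {verdict}")

    # With the raw dictionary gone, the loop above cannot even be run:
    # there is no way to tell a documented correction from an arbitrary
    # rewrite of the record.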