Steve McIntyre, the statistician who called into question much of the methodology behind Mann's Hockey Stick chart, has some observations on the adjustments to US temperature records that I discussed here and here.
Eli Rabett and Tamino have both advocated faith-based climate
science in respect to USHCN and GISS adjustments. They say that the
climate "professionals" know what they're doing; yes, there are
problems with siting and many sites do not meet even minimal compliance
standards, but, just as Mann's "professional" software was able to
extract a climate signal from the North American tree ring data, so
Hansen's software is able to "fix" the defects in the surface sites.
"Faith-based" because they do not believe that Hansen has any
obligation to provide anything other than a cursory description of his
software or, for that matter, the software itself. But if they are
working with data that includes known bad data, then critical
examination of the adjustment software becomes integral to the
integrity of the record - as there is obviously little integrity in
much of the raw data.
While acolytes may call these guys "professionals", the process of
data adjustment is really a matter of statistics and even accounting.
In these fields, Hansen and Mann are not "professionals" - Mann
admitted this to the NAS panel explaining that he was "not a
statistician". As someone who has read their works closely, I do not
regard any of these people as "professional". Much of their reluctance
to provide source code for their methodology arises, in my opinion,
because the methods are essentially trivial and they derive a certain
satisfaction out of making things appear more complicated than they
are, a little like the Wizard of Oz. And like the Wizard of Oz, they
are not necessarily bad men, just not very good wizards.
He goes on to investigate a specific station that the "professionals" cite as a positive example, demonstrating that they appear to have a Y2K error in their algorithm. This is difficult to do because, like Mann, the government scientists maintaining a government temperature database, built from government sites paid for with taxpayer funds, refuse to release their methodology or algorithms for inspection.
In the case cited, the "professionals" also make adjustments that imply the site has experienced decreasing urbanization over the last 100 years, something I am not sure one can say about any site in the US except perhaps a few Colorado ghost towns. The "experts" also fail to take the basic step of analyzing the site itself which, if visited, would reveal recently installed air conditioning units venting hot air onto the temperature instrument.
A rebuttal, arguing that poor siting of temperature instruments is OK and does not affect the results, is here. I find rebuttals of this sort really distressing. I studied physics for a while before switching to engineering, and even small procedural mistakes in measurement could easily invalidate one's results. I find it amazing that climate scientists seek to excuse massive measurement errors. I'm sorry, but in no other branch of science are results considered "settled" when the experimental noise is greater than the signal. I would really, really, just for once, love to see an anthropogenic global warming promoter say, "Well, I don't think the siting will change the results, but you are right, we really need to go back and take another pass at correcting historical temperatures based on a more detailed analysis of the individual sites."
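The noise-versus-signal point above can be illustrated with a small simulation (my own toy example, with invented numbers): if site-induced measurement error is an order of magnitude larger than the trend being estimated, the trend fitted from any one station's record scatters wildly around the true value.

```python
# Toy illustration of noise swamping signal in a single station record.
# The trend magnitude and noise level are invented for illustration only.
import random

random.seed(1)

TRUE_TREND = 0.01   # deg C per year -- the "signal"
SITE_NOISE = 1.0    # deg C of site-induced scatter -- the "noise"

def simulated_record(years=30):
    """One station's measured anomalies: a small trend plus large noise."""
    return [TRUE_TREND * t + random.gauss(0, SITE_NOISE) for t in range(years)]

def fitted_trend(series):
    """Ordinary least-squares slope of the series against time index."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# On a clean series the fit recovers the trend exactly; on noisy single
# stations the estimates scatter widely around 0.01 deg C/year.
clean_slope = fitted_trend([TRUE_TREND * t for t in range(30)])
noisy_slopes = [fitted_trend(simulated_record()) for _ in range(5)]
```

Averaging many stations narrows that scatter, which is the usual defense, but only if the site errors are random rather than systematic (like air conditioners that all blow warm air).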