So Where Is The Climate Science Money Actually Going If Not To Temperature Measurement?

You are likely aware that the US, and many other countries, are spending billions and billions of dollars on climate research.  After drug development, it probably has become the single most lucrative academic sector.

Let me ask a question.  If you were concerned (as you should be) about lead in soil and drinking water and how it might or might not be getting into the bloodstream of children, what would you spend money on?  Sure, better treatments and new technologies for filtering and cleaning up lead.  But wouldn't the number one investment be in more and better measurement of environmental and human lead concentrations, and how they might be changing over time?

So one would suppose that if we were worried about the global rise in temperatures, the number one investment would be better and more complete measurement of those temperatures.  Hah!  You would be wrong.

There are three main global temperature histories: the combined CRU-Hadley record (HADCRU), the NASA-GISS (GISTEMP) record, and the NOAA record. All three global averages depend on the same underlying land data archive, the Global Historical Climatology Network (GHCN). Because of this reliance on GHCN, its quality deficiencies will constrain the quality of all derived products.

The number of weather stations providing data to GHCN plunged in 1990 and again in 2005. The sample size has fallen by over 75% from its peak in the early 1970s, and is now smaller than at any time since 1919.

Well, perhaps they have focused on culling a large, poor-quality network down to fewer, higher-quality locations?  If so, there is little or no record of it.  To outsiders, it looks like stations just keep turning off.  And in fact, by certain metrics, the quality of the network is falling:

The collapse in sample size has increased the relative fraction of data coming from airports to about 50 percent (up from about 30 percent in the 1970s). It has also reduced the average latitude of source data and removed relatively more high-altitude monitoring sites.

Airports, located by and large in the middle of urban areas, are terrible temperature measurement points, subject to a variety of biases such as the urban heat island effect.  In an old school project, my son and I measured a difference of more than 10 degrees Fahrenheit between the Phoenix airport and the outlying countryside.  Folks who compile the measurements claim that they have corrected for these biases, but many of us have reason to doubt that (consider this example, where an obviously biased station still showed up in the corrected data as the #1 warming site in the country).  I understand why we have spent 30 years correcting screwed-up, biased stations: we need some stations with long histories, and these are what we have (though many long-lived stations have been allowed to expire).  But why haven't we been building a new, better-sited network?

There has been one major investment effort to improve temperature measurement, and that is satellite measurement.  We now use satellites for official measures of cloud cover, sea ice extent, and sea level, yet the global warming establishment has largely ignored satellite measurement of temperatures.  For example, James Hansen (Al Gore's mentor, often called the father of global warming) strongly defended 100+ year old surface temperature measurement technology over satellites.  Ironically, Hansen was for years the head of NASA's Goddard Institute for Space Studies (GISS), so one wonders why he resisted space technology in this one particular area.  Cynics among us would argue that it is because satellites give the "wrong" answer, showing a slower warming rate than the heavily manually adjusted surface records.

  • J K

    I'm a pilot; I have noticed that the weather I see from large airports is almost always an exaggeration of what's going on from weather stations downtown or in suburbia: if it's warm, the airport is sweltering; if it's cold, the airport is frigid; the winds are almost always at least 5-10 knots more at the airport than anywhere else, a thunderstorm in suburbia will display as just "thunderstorm" regardless of severity but the same storm reported from the airport will often be reported as "heavy thunderstorm", etc. Not sure if the scientists are looking at the same data I am, though.

  • joshv

    There has been such an effort, it's called USCRN - https://www.ncdc.noaa.gov/crn/

  • Joel

    Well, I surveyed 20 of the roughly 23 stations in Georgia for Watts. I was his go-to guy in my state. I saw they were crap and expected that everything I'd ever seen was screwed up.

    But I was wrong. The network appears to be about right. See Berkeley Earth.

    I do hold the satellite record as the gold standard, but they both say about the same thing -- 0.8C in a hundred years.

    And I have the same conclusion for both -- big F*%ing deal.

  • Jim Collins

    I think that I might have mentioned this before. In 1990 I worked at a small airport. One of our duties was to take weather readings three times per day. The readings were written in ledger books. Once a month the secretary would copy the readings on to a form and mail it to NOAA. In 1991 they installed an AWOS station and we no longer had to take the readings. They threw away the ledger books that went back to 1929. I would love to have those books and plot the temperatures on a spreadsheet. The thermometer was kept in a white vented box that was on the grass and away from the pavement and buildings. There are pictures of that box in the same location in the 1920's. The thermometer was a mercury thermometer that had a stamping stating that it was manufactured in 1929.

  • Peabody

    How dare you question this!? The debate is over! The time for measurements is past! It is time for action! (Action is defined as massive transfers of wealth that will do little to actually change what is claimed the fight is for)

  • joe - the non climate scientis

    "They threw away the ledger books that went back to 1929. I would love to have those books and plot the temperatures on a spreadsheet."

    That is the data that should be used. Best I can tell, using the raw data, the warmest period in the US (only 2% of the world's area) was during the 1920s/30s, yet the adjusted data shows that period to be much cooler than the last 20 years in the US.

  • 4Kx3

    I agree with Coyote; I do physics and am disgusted with "temperature anomalies".
    Apparently USCRN does not record pressure or wind, so one cannot accurately calculate volumetric energy. https://aea26.wordpress.com/2017/02/23/atmospheric-energy/
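The commenter's point can be sketched numerically. The snippet below is an illustration using standard textbook formulas (Tetens saturation vapor pressure, moist-air density via virtual temperature, sensible plus latent enthalpy), not anything from the linked post; the scenario numbers are made up. Two air masses with the same thermometer reading can carry very different energy per cubic meter once humidity and pressure enter the calculation:

```python
# Sketch: why temperature alone understates atmospheric energy content.
# Standard textbook constants; illustrative inputs only.

R_D = 287.05    # J/(kg*K), specific gas constant for dry air
C_P = 1005.0    # J/(kg*K), specific heat of dry air at constant pressure
L_V = 2.501e6   # J/kg, latent heat of vaporization of water

def saturation_vapor_pressure(t_c):
    """Tetens formula, in Pa (good near surface temperatures)."""
    return 610.78 * 10 ** (7.5 * t_c / (t_c + 237.3))

def moist_energy_density(t_c, p_pa, rh):
    """Approximate enthalpy (sensible + latent) per m^3 of moist air."""
    t_k = t_c + 273.15
    e = rh * saturation_vapor_pressure(t_c)      # vapor pressure, Pa
    q = 0.622 * e / (p_pa - 0.378 * e)           # specific humidity, kg/kg
    t_virtual = t_k * (1 + 0.608 * q)            # virtual temperature, K
    rho = p_pa / (R_D * t_virtual)               # moist-air density, kg/m^3
    return rho * (C_P * t_k + L_V * q)           # J/m^3

# Same 30 C thermometer reading, very different energy content:
dry = moist_energy_density(30.0, 101325.0, 0.10)    # desert-like, 10% RH
humid = moist_energy_density(30.0, 101325.0, 0.90)  # tropical, 90% RH
print(f"dry:   {dry:.3e} J/m^3")
print(f"humid: {humid:.3e} J/m^3")
```

The humid parcel comes out roughly 15% more energetic than the dry one at an identical temperature, which is the commenter's complaint: without pressure and humidity, a station temperature by itself pins down neither density nor latent heat.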

  • An Inquirer

    I have looked at Berkeley Earth, and I do not come to the same conclusion as you do. Berkeley accepted the adjustments, so Berkeley is going to end up with the same results. That does not mean the adjustments are right. The adjusted temperatures are not verified by observation of weather/climate phenomena. For example, when the Great Lakes had record ice levels and the actual temperatures showed a cold winter, the adjusted temperatures showed a "normal" winter. Several other examples exist.
    Berkeley did a bait-and-switch, promising to base its conclusions on actual temperatures but, in the end, using the same adjustments.

  • johnmoore

    Ultimately, near-surface air temperature is a lousy way to measure global heat budgets. Ocean temperatures are better. Last decade they started deploying the Argo profiling floats to measure ocean temperature in the upper 2000 m of the ocean.

    Of course, when the surface temperature warming failed to appear during the hiatus, the community concluded that the heat gain was real and had gone into the oceans. The Argo data didn't show much of that, so they concluded it went deep down where it couldn't be measured. That conclusion was based, yes, on a model.

    Oh well.