
Denying the Climate Catastrophe: 8. The Lukewarmer Middle Ground

This is Chapter 8 of an ongoing series.  Other parts of the series are here:

  1. Introduction
  2. Greenhouse Gas Theory
  3. Feedbacks
  4.  A)  Actual Temperature Data;  B) Problems with the Surface Temperature Record
  5. Attribution of Past Warming:  A) Arguments for it being Man-Made; B) Natural Attribution
  6. Climate Models vs. Actual Temperatures
  7. Are We Already Seeing Climate Change
  8. The Lukewarmer Middle Ground (this article)
  9. A Low-Cost Insurance Policy

In this chapter we are going to try to sum up where we are and return to our very first chapter, when I said that we would find something odd once we returned to the supposed global warming "consensus".

First, let's return to our framework one last time and try to summarize what has been said:

[Slide: summary framework for the lukewarmer position]

I believe that this is a pretty fair representation of the median luke-warmer position.  Summarized, it would be:

  • Manmade CO2 warms the Earth, though by much less than most climate models claim, because those models assume unrealistic levels of positive feedback that overstate future warming.  One degree C of warming, rather than four or five, is a more realistic projection of man-made warming over the next century
  • The world has certainly warmed over the last century, though by perhaps a bit less than the 0.8C in the surface temperature record due to uncorrected flaws in that record
  • Perhaps half of this past warming is due to man, the rest due to natural variability
  • There is little evidence that weather patterns are "already changing" in any measurable way from man-made warming

The statements I just wrote above, no matter how reasonable, are enough to get me and many others vilified as "deniers".  You might think that I am exaggerating -- that the denier tag is saved for folks who absolutely deny any warming effect of CO2.  But that is not the case, I can assure you from long personal experience.

The Climate Bait and Switch

Of course, the very act of attempting to shut people up who disagree with one's position on a scientific issue is, I would have thought, obviously anti-science.   The history of science is strewn with examples of the majority being totally wrong.   Even into the 1960's, for example, the 97% consensus in geology was that the continents don't move and that the few scientists who advocated for plate tectonics theory were crackpots.

But that is not how things work today.  Climate action advocates routinely look for ways to silence climate skeptics, up to and including seeking to prosecute these climate heretics and throw them in jail.

The reason that alarmists say they feel confident in vilifying and attempting to silence folks like myself is because they claim that the science is settled, that 97% of climate scientists believe in the consensus, and so everyone who is not on board with the consensus needs to shut up.  But what exactly is this consensus?

The 97% number first appeared in a "study" by several academics who sent a survey with some climate change questions to over 10,000 scientists.  They received 3,146 responses, but decided that only 77 of these respondents "counted" as climate scientists, and 75 of those 77 (97%) answered two questions about climate change in the affirmative.
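For the curious, the arithmetic behind the headline figure is trivial to reproduce.  This is a minimal sketch using only the counts cited above:

```python
# Response counts cited above: 3,146 responses, of which 77 were
# classified as climate scientists, and 75 of those 77 agreed.
responses = 3146
counted = 77            # respondents who "counted" as climate scientists
agreed = 75             # of those, answered both questions in the affirmative

consensus_pct = 100 * agreed / counted
counted_share_pct = 100 * counted / responses
print(round(consensus_pct, 1))      # 97.4 -> the famous "97%"
print(round(counted_share_pct, 1))  # 2.4  -> share of respondents actually counted
```

Note that the famous percentage rests on barely 2% of the people who answered the survey.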

We will get to the two questions in a second, but note already the odd study methodology.  If the other 10,000 plus people sent the survey were not the targets of the survey, why were they sent a survey in the first place?  It makes one suspicious that the study methodology was changed mid-stream to get the answer they wanted.

Anyway, what is even more fascinating are the two questions asked in the survey.  Here they are:

  1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?
  2. Do you think human activity is a significant contributing factor in changing mean global temperatures?

The 97% in this survey answered the questions "risen" and "yes".

Do you see the irony here?  If you have been following along with this series, you should be able to say how I would have answered the two questions.  I would certainly have said "risen" to #1.  The answer to question 2 is a bit harder because "significant" is not defined, but in a complex system with literally thousands of variables, I would count any single variable contributing more than about 10% as significant.  Since I estimated man's effect on past warming at around 40-50%, I would have answered "yes" to #2!  In fact, most every prominent science-based skeptic I can think of would likely have answered the same.

So you heard it right -- I and many prominent skeptics are part of the 97% consensus.  Effectively, I am being told to shut up and not continue to say what I think, in the name of a 97% consensus that represents exactly what I am saying.  This is so weird as to be almost Kafka-esque.

This is what I call the climate bait and switch.  Shaky propositions, such as high positive feedback assumptions, are defended with the near-certainty that surrounds unrelated propositions such as the operation of the greenhouse gas effect.

In fact, merely arguing about whether man-made warming exists or is "significant" falls well short of what we really need in the public policy arena.  What we really should be discussing is a proposition like this:

Is manmade CO2 causing catastrophic increases in warming and warming-driven weather effects whose costs exceed those of reducing CO2 production enough to avoid these effects?

It is about at this point when I usually have people bring up the precautionary principle.  So that I am not unfair to proponents of that principle, I will use the Wikipedia definition:

if an action or policy has a suspected risk of causing harm to the public, or to the environment, in the absence of scientific consensus (that the action or policy is not harmful), the burden of proof that it is not harmful falls on those taking an action that may or may not be a risk.

The principle is used by policy makers to justify discretionary decisions in situations where there is the possibility of harm from making a certain decision (e.g. taking a particular course of action) when extensive scientific knowledge on the matter is lacking. The principle implies that there is a social responsibility to protect the public from exposure to harm, when scientific investigation has found a plausible risk. These protections can be relaxed only if further scientific findings emerge that provide sound evidence that no harm will result.

I believe that, as stated, this is utter madness.  I will give you an example.  Consider a vaccine that saves thousands of lives a year.  Let's say, as is typical of almost every vaccine, that it also hurts a few people, such that it may kill 1 person for every thousand it saves.  By the precautionary principle as stated, we would never have approved any vaccine, because the principle gives no weight to the harms avoided by taking the action.
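To make the asymmetry concrete, here is a minimal sketch with purely illustrative numbers (not from any real vaccine trial):

```python
# Hypothetical vaccine, per million doses administered: it saves about
# 1,000 lives but causes roughly 1 death from side effects.
lives_saved = 1000
lives_lost = 1

net_lives_saved = lives_saved - lives_lost
print(net_lives_saved)  # 999 -- a clear net benefit that a strict
                        # precautionary reading would still forbid
```

Any decision rule that looks only at the 1 and never at the 1,000 will reject the vaccine every time.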

So take fossil fuel burning.  Proponents of taking drastic action to curb fossil fuel use in the name of global warming prevention will argue that until there is an absolute consensus that burning fossil fuels is not harmful to the climate, such burning should be banned.  But this ignores the substantial, staggering, unbelievably positive effects we have gained from fossil fuels and the technology and economy they support.

Just remember back to that corn yield chart.

Bill McKibben wants us to stop using fossil fuels because they may cause warmer temperatures that might reduce corn yields.  But there is a near absolute certainty that dismantling the fossil fuel economy will take us back to the horrendous yields in the yellow years on this chart.  Proponents of climate action point to the possibility of warming-based problems, but miss the near certainty of problems from the elimination of fossil fuels.

Over the last 30 years, something unprecedented in the history of human civilization has occurred -- an astounding number of people have exited absolute poverty.

Folks like McKibben act like there is no downside to drastically cutting back on fossil fuel use and switching to substantially more expensive and less convenient fuels, as if protecting Exxon's profits were the only reason anyone would possibly oppose such a measure.  But the billion or so people who have exited poverty of late have done so by burning every bit of fossil fuel they can obtain, and never would have been able to do so in such numbers had such an inexpensive fuel option not been available.  We in the West could likely afford to pay $50 a month more for fuel, but what of the poor of the world?

Perhaps this will give one an idea of how central inexpensive fossil fuels are to well-being.  This is a chart from World Bank data plotting each country by its per capita CO2 production and its life expectancy.

[Chart: per capita CO2 production vs. life expectancy by country, World Bank data]

As you can see, there is a real, meaningful relationship between CO2 production and life expectancy.  In fact, each 10x increase in CO2 production is correlated with 10 years of additional life expectancy.  Of course, this relationship is not direct -- CO2 itself does not have health benefits (if one is not a plant).  But higher CO2 emissions are a byproduct of a growing technological economy, which leads to greater wealth and life expectancy.
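The "10x CO2 per 10 years of life expectancy" relationship is a log-linear fit.  As a minimal sketch of how such a slope is estimated -- using made-up data constructed to lie near that line, not actual World Bank figures -- one could do:

```python
import math

# Hypothetical illustrative data (NOT actual World Bank figures):
# (per-capita CO2 in tons/year, life expectancy in years)
data = [(0.1, 55), (0.5, 62), (1.0, 65), (3.0, 70), (10.0, 75), (20.0, 78)]

# Least-squares slope of life expectancy against log10(CO2 per capita)
xs = [math.log10(co2) for co2, _ in data]
ys = [le for _, le in data]
n = len(data)
mx = sum(xs) / n
my = sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(round(slope, 1))  # ~10 years of life expectancy per 10x CO2
```

Because the x-axis is log10 of emissions, the slope reads directly as "years of life expectancy per tenfold increase in CO2."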

The problem, then, is not that we shouldn't consider the future potential costs and risks of climate change, but that we shouldn't consider them in a vacuum, without also considering the costs of placing severe artificial limits on inexpensive fossil fuels.


People often say to me that climate action is an insurance policy -- and they ask me, "you buy insurance, don't you?"   My answer invariably is, "yes, I buy insurance, but not when the cost of the policy is greater than the risk being insured against."

As it turns out, there is an approach we can take in this country to creating a low-cost insurance policy against the risks that temperature sensitivity to CO2 is higher than I have estimated in this series.  I will outline that plan in my final chapter.

Here is Chapter 9:  A Low-Cost Insurance Policy

Denying the Climate Catastrophe: 7. Are We Already Seeing Climate Change?

This is Chapter 7 of an ongoing series.  Other parts of the series are here:

  1. Introduction
  2. Greenhouse Gas Theory
  3. Feedbacks
  4.  A)  Actual Temperature Data;  B) Problems with the Surface Temperature Record
  5. Attribution of Past Warming:  A) Arguments for it being Man-Made; B) Natural Attribution
  6. Climate Models vs. Actual Temperatures
  7. Are We Already Seeing Climate Change (this article)
  8. The Lukewarmer Middle Ground
  9. A Low-Cost Insurance Policy

Note:  This is by far the longest chapter, and could have been 10x longer without a lot of aggressive editing.  I have chosen not to break it into two pieces.  Sorry for the length.  TL;DR:  The vast majority of claims of current climate impacts from CO2 are grossly exaggerated or even wholly unsupported by the actual data.  The average quality of published studies in this area is very low compared to other parts of climate science.

Having discussed the theory and reality of man-made warming, we move in this chapter to what is often called "climate change" -- is manmade warming already causing adverse changes in the climate?


This is a peculiarly frustrating topic for a number of reasons.

First, everyone who discusses climate change automatically assumes the changes will be for the worse.  But are they?  The Medieval Warm Period, likely warmer than today, was a period of agricultural plenty and demographic expansion (at least in Europe) -- it was only the end of the warm period that brought catastrophe, in the form of famine and disease.  As the world warms, are longer growing seasons in the colder parts of the northern hemisphere really so bad, and why is it no one ever mentions such positive offsets?

The second frustrating issue is that folks increasingly talk about climate change as if it were a direct result of CO2, e.g. that CO2 is somehow directly worsening hurricanes.  This is in part just media sloppiness, but it has also been an explicit strategy, re-branding global warming as climate change during the last 20 years when global temperatures were mostly flat.  So it is important to make this point:  there is absolutely no mechanism that has been suggested by anyone wherein CO2 can cause climate change except through the intermediate step of warming.  CO2 causes warming, which then potentially leads to changes in weather.  If CO2 is only causing incremental warming, then it likely is only causing incremental changes to other aspects of the climate.  (I will note as an aside that man certainly has changed the climate through mechanisms other than CO2, but we will not discuss these here.  A great example is land use:  Al Gore claimed the snows of Kilimanjaro are melting because of global warming, but in fact it is far more likely they are receding due to precipitation changes resulting from deforestation of Kilimanjaro's slopes.)

Finally, and perhaps most frustrating, is that handling claims of various purported man-made changes to the climate has become an endless game of "whack-a-mole".  It is almost impossible to keep up with the myriad claims of things that are changing (always for the worse) due to CO2.  One reason that has been suggested for this endless proliferation of dire predictions is that if one wants to study the mating habits of the ocelot, one may have trouble getting funding, but funding is available in large quantities if one re-brands the study as the effect of climate change on the mating habits of the ocelot.  It is a rare unusual weather event or natural phenomenon (Zika virus!) that is not blamed by someone, somewhere, on man-made climate change.

As a result, this section could be near-infinitely long.  To avoid that, and to avoid a quickly tedious series of charts labelled "hurricanes not up", "tornadoes not up", etc., I want to focus more on the systematic errors that lead to the false impression that we are seeing man-made climate changes all around us.


We will start with publication bias, which I would define as a trend in the reporting of a type of event being mistaken for a trend in the underlying events themselves.  Let's start with a classic example from outside climate, the "summer of the shark".

The media hysteria began in early July, 2001, when a young boy was bitten by a shark on a beach in Florida.  Subsequent attacks received breathless media coverage, up to and including near-nightly footage from TV helicopters of swimming sharks.  Until the 9/11 attacks, sharks were the third biggest story of the year as measured by the time dedicated to it on the three major broadcast networks’ news shows.

Through this coverage, Americans were left with a strong impression that something unusual was happening — that an unprecedented number of shark attacks were occurring in that year, and the media dedicated endless coverage to speculation by various “experts” as to the cause of this sharp increase in attacks.

Except there was one problem — there was no sharp increase in attacks. In the year 2001, five people died in 76 shark attacks. However, just a year earlier, 12 people had died in 85 attacks. The data showed that 2001 actually was a down year for shark attacks.  The increased media coverage of shark attacks was mistaken for an increase in shark attacks themselves.

Hopefully the parallel with climate reporting is obvious.  Whereas a heat wave in Moscow was likely local news only 30 years ago, now it is an international story that is tied, in every broadcast, to climate change.  Every single tail-of-the-distribution weather event from around the world is breathlessly reported, leaving the impression among viewers that more such events are occurring, even when there is in fact no such trend.  Further, since weather events can drive media ratings, there is an incentive to make them seem scarier:

When I grew up, winter storms were never named.  It was just more snow in Buffalo, or wherever.  Now, though, we get "Winter Storm Saturn: East Coast Beast."  Is the weather really getting scarier, or just the reporting?


The second systematic error is not limited to climate, and is so common I actually have a category on my blog called "trend that is not a trend".  There is a certain chutzpah involved in claiming a trend when it does not actually exist in the data, but such claims occur all the time.  In climate, a frequent variation on this failure is claiming a trend from a single data point -- specifically, a tail-of-the-distribution weather event will be put forward as "proof" that the climate is changing, i.e. that there is somehow a trend to the worse in the Earth's climate.

The classic example was probably just after Hurricane Katrina.  In a speech in September of 2005 in San Francisco, Al Gore told his Sierra Club audience that not only was Katrina undoubtedly caused by man-made global warming, but that it was the harbinger of a catastrophic onslaught of future such hurricanes.  In fact, though, there is no upward trend in hurricane activity.  2005 was a high but not unprecedented year for hurricanes, and Katrina was soon followed by a long and historic lull in North American hurricane activity.

Counting hurricane landfalls is a poor way to look at hurricanes.  A better way is to look at the total energy of hurricanes and cyclones globally.  And as you can see, the numbers are cyclical (as every long-time hurricane observer could have told Mr. Gore) but without any trend:

In fact, death rates from severe weather have been dropping throughout the last century, even as CO2 levels have been rising.

Of course, it is likely that increasing wealth and better technology are responsible for much of this mitigation, rather than changes in underlying weather patterns, but this is still relevant to the debate -- many proposed CO2 abatement plans would have the effect of slowing growth in the developing world, leaving them more vulnerable to weather events.   I have argued for years that the best way to fight weather deaths is to make the world rich, not to worry about 1 hurricane more or less.

Droughts are another event where the media quickly finds someone to blame the event on man-made climate change and declare that this one event is proof of a trend.  Bill McKibben tweeted about drought and corn yields many times in 2012, for example:

It turns out that based on US government data, the 2012 drought was certainly severe but no worse than several other droughts of the last 50 years (negative numbers represent drought).

There is no upward trend at all in dry weather in the US (in fact, a slightly downward trend that is likely not statistically significant).

McKibben blamed bad corn yields in 2012 on man-made global warming, and again implied that one year's data point was indicative of a trend.

US corn yields indeed were down in 2012, but were still higher than in any year prior to 1995.

It is worth noting the strong upward trend in corn yields from 1940 to today, over the same period in which the world has supposedly experienced unprecedented man-made warming.  I might also point out the years in yellow, which predate the heavy mechanization of farming via the fossil fuel economy.  Bill McKibben hates fossil fuels and believes they should be entirely eliminated.  If so, he also must "own" the corn yields in yellow.  CO2-driven warming has not inhibited corn yields, but having McKibben return us to a pre-modern economy certainly would.

Anyway, as you might expect, corn yields after 2012 returned right back to trend and continued to hit new records.  2012 did not represent a new trend; it was simply one bad year.

I think most folks would absolutely swear, from media coverage, that the US is seeing more new high temperatures set and an upward trend in heat waves.  But it turns out neither is the case.

Obviously, one has to be careful with this analysis.  Many temperature stations in the US Historical Climate Network have only been in place for 20 or 30 years, so their all-time high for any given day is, by definition, going to be in the last 20 or 30 years.  But if one looks at temperature stations with many years of data, as done above, we can see there has been no particular uptick in high temperature records, and in fact a disproportionate number of our all-time local records were set in the 1930's.

While there has been a small uptick in heat waves over the last 10-20 years, it is trivial compared to the heat of the 1930's.

Looking at it a different way, there is no upward trend in 100 degree (Fahrenheit) days...

Or even 110 degree days.  Again, the 1930's were hot, long before man-made CO2 could possibly have made them so.

Why, one might ask, don't higher average global temperatures translate into more daytime high temperature records?  We actually gave the answer back in Chapter 4A, but as a reminder:  much of the warming we have seen has occurred at night, raising nighttime lows without as much effect on daytime highs.  As a result, we are setting more record nighttime high Tmin's than we have through much of the last century, without setting more record daytime Tmax temperatures:

We could go on all day with examples of claiming a trend from a single data point.  Watch for it yourself.  But for now let's turn to a third category.

We can measure things much more carefully and accurately than we could in the past.  This is a good thing, except when we are trying to compare the past to the present.  In a previous chapter, we showed a count of sunspots, and databases of sunspot counts go all the way back into the early 18th century.  Were telescopes in 1716 able to see all the sunspots we can see in 2016?  Or might an upward trend in sunspot counts be biased by our better ability today to detect small ones?

A great example of this comes, again, from Al Gore's movie in which Gore claimed that tornadoes were increasing and man-made global warming was the cause.  He was working with this data:

This certainly looks scary.  Tornadoes have increased by a factor of 5 or 6!  But if you look at the NOAA web site, right under this chart, there is a big warning that says to beware of this data.  With doppler radar and storm chasers and all kinds of other new measurement technologies, we can now detect smaller tornadoes that were not counted in the 1950's.  NOAA is careful to explain that this chart is biased by changes in measurement technology.  If one looks only at larger tornadoes we were unlikely to miss in the 1950's, there is no upward trend, and in fact there may be a slightly declining one.

That, of course, does not stop nearly every person in the media from blaming global warming whenever there is an above-average tornado year.

Behind nearly every media story about "abnormal" weather or that the climate is somehow "broken" is an explicit assumption that we know what "normal" is.  Do we?

We have been keeping systematic weather records for perhaps 150 years, and have really been observing the climate in detail for perhaps 30 years.  Many of our best tools are space-based and obviously only have 20-30 years of data at most.  Almost no one thinks we have been able to observe climate in depth through many of its natural cycles, so how do we know exactly what is normal?  Which year do we point to and say, "that was the normal year, that was the benchmark"?

One good example of this is glaciers.  Over the last 30 years, most (but not all) major glaciers around the world have retreated, leading to numerous stories blaming this retreat on man-made warming.  But one reason that glaciers have retreated over the last 50 years is that they were also retreating the 50 years before that and the 50 years before that:

In fact, glaciers have been retreating around the world since the end of the Little Ice Age (I like to date it to 1812, with visions of Napoleon's army freezing in Russia, but that is of course arbitrary).

A while ago President Obama stood in front of an Alaskan glacier and blamed its retreat on man.  But at least one Alaskan glacier in the area has been mapped for centuries, and has been retreating for centuries:

As you can see, from a distance perspective, most of the retreat actually occurred before 1900.  If one wants to blame the modern retreat of these glaciers on man, one is left with the uncomfortable argument that natural forces drove the retreat until about 1950, at which point the natural forces stopped just in time for man-made effects to take over.

Melting ice is often linked to sea level rise, though interestingly net ice melting contributes little to IPCC forecasts of sea level rises due to expected offsets with ice building in Antarctica -- most forecast sea level rise comes from the thermal expansion of water in the oceans.  And of course, the melting arctic sea ice that makes the news so often contributes nothing to sea level rise (which is why your water does not overflow your glass when the ice melts).

But the story for rising sea levels is the same as with glacier retreats -- the seas have been rising for much longer than man has been burning fossil fuels in earnest, going back to about the same 1812 start point:

[Chart: long-run sea level record, rising steadily since the early 1800s]

There is some debate about manual corrections added to more recent data (that should sound familiar to those reading this whole series) but recent sea level rise seems to be no more than 3 mm per year.  At most, recent warming has added perhaps 1 mm a year to the natural trend, or about 4 inches a century.
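The "4 inches a century" figure is just a unit conversion, which can be checked in a couple of lines:

```python
# Convert the warming-attributable portion of sea level rise cited
# above (~1 mm/year) into inches per century.
MM_PER_INCH = 25.4
excess_mm_per_year = 1.0

inches_per_century = excess_mm_per_year * 100 / MM_PER_INCH
print(round(inches_per_century, 1))  # 3.9 -- about 4 inches per century
```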

Our last failure mode is again one I see much more widely than just in climate.  Whether the realm is economics or climate or human behavior, the media loves to claim that incredibly complex, multi-variable systems are in fact driven by a single variable, and -- who'd have thunk it -- that single variable happens to fit with their personal pet theory.

With all the vast complexity of the climate, are we really to believe that every unusual weather event is caused by a 0.013 percentage point change (270 ppm to 400 ppm) in the concentration of one atmospheric gas?
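The 0.013 percentage point figure is simple arithmetic on the concentration change:

```python
# CO2 concentration change cited above: 270 ppm -> 400 ppm.
ppm_before, ppm_after = 270, 400

change_ppm = ppm_after - ppm_before            # 130 ppm
change_pct_points = change_ppm / 1_000_000 * 100   # as share of the atmosphere
print(round(change_pct_points, 3))  # 0.013 percentage points
```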

Let me illustrate this in another way.  The NOAA not only publishes a temperature anomaly (which we have mostly been using in all of our charts) but they take a shot at coming up with an average temperature for the US.   The following chart uses their data for the monthly average of Tmax (the daily high at all locations), Tmin (the daily low for all locations) and Tavg (generally the average of Tmin and Tmax).

[Chart: NOAA monthly US average Tmax, Tmin, and Tavg]

Note that even the average temperatures vary across a range of 40F through the seasons and years.  If one includes the daily highs and lows, the temperatures vary over a range of nearly 70F.  And note that this is the average for the entire US over a month.  If we were to look at the range of daily temperatures across the breadth of locations, we would see numbers varying from well below zero to over 110.

The point of all this is that temperatures naturally vary a lot.  Now look at the dotted black line.  That is the long-term trend in the average, trending slightly up (since we know that average temperatures have risen over the last century).  The slope of that line, around 1F per century for the US, is virtually invisible at this scale.  It is tiny, tiny, tiny compared to the natural variation of the averages.
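The slope-versus-swing comparison can be sketched with synthetic data (illustrative numbers only, not actual NOAA data):

```python
import math
import random

random.seed(0)

# Hypothetical monthly US average temperatures (deg F): a 55F mean with
# a +/-20F seasonal swing, 2F of month-to-month noise, and a
# 1F-per-century warming trend layered on top.
TREND_F_PER_YEAR = 1.0 / 100
temps = []
for month in range(1200):  # 100 years of monthly averages
    season = 20 * math.sin(2 * math.pi * month / 12)
    trend = TREND_F_PER_YEAR * (month / 12)
    temps.append(55 + season + trend + random.gauss(0, 2))

century_trend = TREND_F_PER_YEAR * 100      # total warming over the century: 1F
seasonal_range = max(temps) - min(temps)    # swing of roughly 40F or more
print(century_trend, round(seasonal_range))
```

The trend contributes a single degree over a century, while the ordinary seasonal swing in the same series spans tens of degrees every year.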

The point of this is not that small increases in the average don't matter, but that it is irrational to blame every tail-of-the-distribution temperature event on man-made warming, since no matter how large we decide that warming has been, it is trivial compared to the natural variation we see in temperatures.

OK, I know that was long, but this section was actually pretty aggressively edited even to get it this short.  For God's sake, we didn't even mention polar bears (the animals that have already survived through several ice-free interglacial periods but will supposedly die if we melt too much ice today).  But it's time to start driving towards a conclusion, which we will do in our next chapter.

Chapter 8, summarizing the lukewarmer middle ground, is here.