Posts tagged ‘fed’

Emulating North Korea

I have little tolerance for enforced patriotism of any sort.  In fact, having loyalty oaths and singing songs and genuflecting to flags all seem more consistent with totalitarianism than with the values of liberty that patriots are nominally trying to promote.  If I were rotting in a crappy Phoenix jail for being caught with marijuana, or busted for driving while Mexican, I would be even less patriotic.

Dozens of Arizona inmates will eat nothing but bread and water for at least seven days in the latest punishment by one of America's toughest sheriffs.

Maricopa County Sheriff Joe Arpaio handed down the sentence after inmates defaced American flags hung in each jail cell. He says the men tore the flags, wrote or stepped on them and threw them in the toilet.

The flags are part of a push for patriotism in county jail cells. Arpaio has ordered that "God Bless America" and the national anthem be played daily in every jail facility.

This isn't Arpaio's first controversial move. He made headlines for keeping thousands of inmates outdoors in repurposed military tents in weather that was hotter than 117 degrees. He also made male inmates wear pink underwear.

He banned smoking, coffee and movies in all jails. And he's even put his stamp on mealtime. Inmates are fed only twice a day, and he stopped serving salt and pepper – all to save taxpayers money, he says.

Forgetting the Fed -- Why a Recovery May Actually Increase Public Debt

Note:  I am not an expert on the Fed or the operation of the money supply.  Let me know if I am missing something fundamental below.

Kevin Drum dredges up this chart from somewhere, supposedly demonstrating that only modest spending cuts are needed to achieve fiscal stability.

Likely the numbers in this chart are a total crock: spending cuts over 10 years are never as large as the government forecasts, and tax increases, particularly on the rich, seldom yield as much revenue as expected.

But leave those concerns aside.  What about the Fed?  The debt as a percent of GDP shown for 2012 in this chart is around 72%.  Though it is not labelled as such, this means that this chart is showing public, rather than total, government debt.  The difference is the amount of debt held by federal agencies.  Of late, this amount has been increasing rapidly as the Fed buys Federal debt with printed money.  Currently the total debt as a percent of GDP is something like 101%.

The Left likes to use the public debt number, both because it is lower and because it has been rising more slowly than total debt (due to the unprecedented growth of the Fed's balance sheet the last several years).  But if one insists on making 10-year forecasts of public debt rather than total debt, then one must also forecast Fed actions as part of the mix.

Specifically, the Fed almost certainly will have to start selling some of the debt on its books to the public when the economy starts to recover.  That, at least, is the theory as I understand it: when interest rates can't be lowered further, the Fed can apply further stimulus via quantitative easing, the expansion of the money supply achieved by buying US debt with printed money.  But the flip side of that theory is that when the economy starts to heat up, that debt has to be sold again, sopping up the excess money supply to avoid inflation.  In effect, this will increase the public debt relative to the total debt.

It is pretty clear that the authors of this chart have not assumed any selling of debt from the Fed balance sheet.  The Fed holds about $2 trillion more in assets than it held before the financial crisis, so selling these into a recovery would increase the public debt as a percent of GDP by roughly 12 points.  In fact, I don't know how they get the red line dropping like it does unless they assume the current QE goes on forever, i.e., that the Fed continues to sop up a half trillion dollars or so of debt every year and takes it out of public hands.
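The arithmetic behind that 12-point figure can be sketched in a few lines. The GDP and balance-sheet numbers below are the rough 2012 figures used in the text, not precise data:

```python
# Rough sketch of the debt arithmetic above, using the approximate
# 2012 figures cited in the text (dollar amounts in trillions).
gdp = 16.0                  # US GDP, ~2012 (approximate)
total_debt_pct = 101.0      # total federal debt as % of GDP
public_debt_pct = 72.0      # publicly held debt as % of GDP
fed_excess_holdings = 2.0   # Fed assets above pre-crisis levels

# If the Fed sells its excess holdings into a recovery, that debt moves
# from the intragovernmental column to the public column.
shift_pct = fed_excess_holdings / gdp * 100
public_after_unwind = public_debt_pct + shift_pct

print(f"QE unwind shifts {shift_pct:.1f} points of GDP to public debt")
print(f"Public debt rises from {public_debt_pct:.0f}% to {public_after_unwind:.1f}% of GDP")
```

Total debt is unchanged by the unwind; only the public/intragovernmental split moves, which is exactly why the two measures can diverge.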

This is incredibly unrealistic.  While a recovery will likely be the one thing that tends to slow the rise of total debt, it may well force the Fed to dump a lot of its balance sheet (and certainly end QE), leading to a rise in public debt.

Here is my prediction:  This is the last year that the Left will insist that public debt is the right number to look at (as opposed to total debt).  With a reversal in QE, as well as the reversal in Social Security cash flow, public debt will soon be rising faster than total debt, and the Left will begin to assure us that total debt rather than public debt is the right number to look at.

Sandy and Global Warming

The other day I linked my Forbes column that showed that there was no upward trend in global hurricane number and strength, the number of US hurricane strikes, or the number of October hurricanes.  Given these trends, anyone who wants to claim Sandy is proof of global warming is forced to extrapolate from a single data point.

Since I wrote that, Bob Tisdale has published an interesting article on Sandy.  The theoretical link between global warming and more and stronger Atlantic hurricanes has not been fully proven, but the theory holds that warmer waters will provide energy for more and larger storms (like Sandy).  The claim, then, is that global warming has heated up the waters through which hurricanes pass and that feed these hurricanes' strength.

Bob Tisdale took a look at the historical trends in sea surface temperatures in the area bounded by Sandy's storm track.  These are the temperature trends for the waters that fueled Sandy.  This is what he got:

If he has done the analysis right, this means there is no warming trend over the last 60+ years in the ocean waters that fed Sandy.  This means that the unusually warm seas that fed Sandy's growth were simply a random event, an outlier which appears from this chart to be unrelated to any long-term temperature trend.

Update:  I challenge you to find any article arguing that Sandy was caused by anthropogenic global warming that actually includes a long-term trend chart (other than global temperatures) in the article.  The only one I have seen is a hurricane strike chart that is cut off in the 1950's (despite data that goes back over 100 years) because this is the only cherry-picked cutoff point that delivers an upward trend.  If you find one, email me the link; I would like to see it.

Is There Not One Single Operations Engineer in the TSA?

I go nuts when I see a bad process.  It bothers me so much I had to stop going to the local bagel outlet because their process behind the counter was so frustratingly awful it made my teeth hurt  (take order here, walk all the way to other end to get bagel, walk all the way back to toaster, then cross back over to get spread, all while nobody is able to pay because the only cashier also seems to be the only one assigned to fulfilling complicated coffee orders).

Because of this, going through TSA screening makes me completely nuts.  Screening is a classic assembly line process with steps that include putting shoes in bin, putting toiletries in bin, putting laptop in bin, shoving bin through x-ray, walking through scanner, retrieving items from x-ray, putting on shoes, putting items back in luggage, stacking bins and returning them to the front.  In many airports, I have observed that the long lines for screening are due to a simple bottleneck that could easily be removed if anyone in the TSA actually cared about service performance.

For example, I was in the San Jose airport the other day.  They had a really large area in front of the scanners with really long tables leading to the x-ray.  I thought to myself that this was smart - give people plenty of time in the line to be organizing their stuff into bins so one of the key potential bottlenecks, the x-ray machine, is always fed with items and is never waiting.

But then I got to the end of the process.  The landing area for stuff out of the x-ray was incredibly short.  When just one person tries to put their shoes on while their bag is still on the line, the whole x-ray conveyor gets jammed.  In fact, when I was there, the x-ray guy had to sit and wait for long periods of time for the discharge end to clear so he could x-ray more bags.  One might have blamed this on clueless passengers who held up the line trying to put on shoes when they should have stepped out of line and found a bench, but there were just two tiny benches for five screening lines.  The only place to get your stuff organized and get dressed was at the discharge of the x-ray, guaranteeing the x-ray gets held up constantly.

I can almost picture what happened here, but since I don't fly to San Jose much I haven't observed it over time.  But I bet some well-meaning but clueless person thought he saw a bottleneck in the entry to the x-ray, shifted everything to dedicate a ton of space to the entry, and thus created an enormous new bottleneck at the back end.  This kind of thing is stupid.  We are, what, 11 years into this screening?  Can you imagine Texas Instruments tolerating such a mess on their calculator assembly line for 11 years?
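The operations point above can be sketched in a few lines: a serial line can never move faster than its slowest stage, so widening the entry while starving the discharge just relocates the jam. All the rates below are made-up illustrative numbers, not measurements of any airport:

```python
# Minimal illustration of the bottleneck argument: throughput of a
# serial process is capped by its slowest stage.  All rates are
# hypothetical (bins per minute).

def line_throughput(stage_rates):
    """A serial line moves no faster than its slowest stage."""
    return min(stage_rates)

# Before: short entry tables starve the x-ray; loading is the bottleneck.
before = {"load bins": 4, "x-ray": 10, "unload/dress": 6}
# After: huge entry area, tiny discharge area; the jam just moves.
after = {"load bins": 12, "x-ray": 10, "unload/dress": 3}

print("before:", line_throughput(before.values()), "bins/min")
print("after: ", line_throughput(after.values()), "bins/min")
```

In this toy version, "fixing" the entry bottleneck actually lowers total throughput, because the capacity was shifted to a stage that was never the constraint.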

Public vs. Private Privacy Threats

I am always fascinated by folks who fear private power but support continuing increases in public / government power.  For me there is no contest: public power is far more threatening.  This is not because I necessarily trust private corporations like Goldman Sachs or Exxon or Google more than I do public officials.  It's because I have many more avenues of redress to escape the clutches of private companies and/or to enforce accountability on them.  I trust the incentives faced by private actors and the accountability mechanisms in the marketplace far more than I trust those that apply to government.

Here is a good example.  First, Kevin Drum laments the end of privacy because Google has proposed a more intrusive privacy policy.  I am not particularly happy about the changes, but at the end of the day, I am comforted by two things.  One: I can stop using Google services.  Sure, I use them a lot now, but I don't have to.  After all, I used to be a customer or user of AOL, Compuserve, the Source, Earthlink, and Netscape, and managed to move on from those guys.  Two: at the end of the day, the worst they are trying to do to me is sell me stuff.  You mean, instead of being bombarded by irrelevant ads I will be bombarded by slightly more relevant ads?  Short of attempts at outright fraud like identity theft, the legal uses of this data are limited.

Kevin Drum, who consistently has more faith in the state than in private actors, actually gets at the real problem in passing (my emphasis added)

And yet…I'm just not there yet. It's bad enough that Google can build up a massive and—if we're honest, slightly scary—profile of my activities, but it will be a lot worse when Google and Facebook and Procter & Gamble all get together to merge these profiles into a single uber-database and then sell it off for a fee to anyone with a product to hawk. Or any government agency that thinks this kind of information might be pretty handy.

The last part is key.  Because the worst P&G will do is try to sell you some Charmin.  The government, however, can throw you in jail and take all your property.  Time and again I see people complaining about private power, but at its core their argument really depends on the power of the state to inspire fear.  Michael Moore criticizes private enterprise in Capitalism: A Love Story, but most of his vignettes actually boil down to private individuals manipulating state power.  In true free market capitalism, his negative examples couldn't occur.  Crony capitalism isn't a problem of private enterprise, it's a problem of the increasingly powerful state.  Ditto with Google: sure, I don't like having my data sold to marketers, and at some point I may leave Google over it.  But the point is that I can leave Google .... try leaving your government-enforced monopoly utility provider.  Or go find an alternative to the DMV.

A great example of this contrast comes to us from Hawaii:

There may be some trouble brewing in paradise, thanks to a seemingly draconian law currently under consideration in Hawaii's state legislature. If passed, H.B. 2288 would require all ISPs within the state to track and store information on their customers, including details on every website they visit, as well as their own names and addresses. The measure, introduced on Friday, also calls for this information to be recorded on each customer's digital file and stored for a full two years. Perhaps most troubling is the fact that the bill includes virtually no restrictions on how ISPs can use (read: "sell") this information, nor does it specify whether law enforcement authorities would need a court order to obtain a user's dossier from an ISP. And, because it applies to any firm that "provides access to the Internet," the law could conceivably be expanded to include not just service providers, but internet cafes, hotels or other businesses.

Americans fed up with Google's nosiness can simply switch email providers.  But if they live in Hawaii, they will have no escape from the government's intrusiveness.

Bad Boys, Bad Boys

If nothing else, the OWS movement is helping ordinary Americans see the abuse of power that is so endemic in many police departments.  I am tired of the quasi-cult of police ass-kicking of average citizens, as fed by reality cop shows and folks like Joe Arpaio.  As Radley Balko points out, the casual way the officer hoses down with pepper spray citizens who are just sitting on a curb is outrageous.  From past experience, my guess is that these guys were ready to go limp and be dragged off; the pepper spray was just pure torture for the entertainment of the cops.

We would not do this to a terrorist in Gitmo, so why are we doing this to American citizens? I think I get particularly angry and intolerant of this kind of crap because I used to be the kind of law and order conservative that would excuse this kind of behavior, and that embarrasses me. The saying goes that a converted Catholic is often more fervent than a born one; so too, I guess, for this civil libertarian.

Outright Theft by Public Unions

Though it's a high bar given what has been going on recently, this is the most aggravating thing I have read this week, via Glen Reynolds:

Robert and Patricia Haynes live in Michigan with their two adult children, who have cerebral palsy. The state government provides the family with insurance through Medicaid, but also treats them as caregivers. For the SEIU, this makes them public employees and thus members of the union, which receives $30 out of the family's monthly Medicaid subsidy. The Michigan Quality Community Care Council (MQC3) deducts union dues on behalf of SEIU.

Michigan Department of Community Health Director Olga Dazzo explained the process to members of her staff.  "MQC3 basically runs the program for SEIU and passes the union dues from the state to the union," she wrote in an email obtained by the Mackinac Center. Initiated in 2006 under then-Gov. Jennifer Granholm, D-Mich., the plan reportedly provides the SEIU with $6 million annually in union dues deducted from those Medicaid subsidies.

“We're not even home health care workers. We're just parents taking care of our kids,” Robert Haynes, a retired Detroit police officer, told the Mackinac Center for Public Policy. “Our daughter is 34 and our son is 30. They have cerebral palsy. They are basically like 6-month-olds in adult bodies. They need to be fed and they wear diapers. We could sure use that $30 a month that's being sent to the union.”

This is a microcosm of the typical liberal fail -- a group or agency does initial good work (private unions in the early 20th century, civil rights groups in the 60's and 70's, the EPA in the early 70's) but refuses to go away and declare victory, instead morphing into a self-sustaining parasite whose only concern is its own survival.

The Great Bailout

Peter Tchir via Zero Hedge

The AIG moment was the first time that the US threw any pretense of real capitalism out the window.  Bear Stearns at least was done by JPM with government help.  Fannie and Freddie were taken over, but they were always quasi government entities.  It was AIG that was truly special.  The government didn't even attempt to see if the banks had managed their exposures at all.  The government didn't even care if they had.  They panicked and saved the banks from their own folly - they didn't give capitalism a chance.  The US has never truly recovered from that.  The entire system looks to government support more and more.  Since AIG the Fed has been running at least one massive easing program or another constantly.  The government is lurching from spending program to spending program to keep the economy churning.

At the first signs of weakness we beg for the FED or ECB or the government to do something big and fast.  The European credit crisis seemed a final chance to put some capitalism back into capitalism.  To allow dumb decisions to pay the price for failure.  To reward the institutions that had properly navigated through the risks.  There was even a brief moment when it looked like Germany would do that - would force those who failed to pay the price and support those who had taken the best steps.  But now with Dexia bailed out and some super SIV on the way, it looks like we are once again heading down a path of not allowing failure - in fact we are once again rewarding failure and living beyond your means.  It isn't communism, but it certainly doesn't fit any classic definition of capitalism.

Time for Some Individual Action in NY

Folks in the OWS neighborhood in NYC are fed up and want the city to kick out the protesters.  While they grow old waiting for that, I would suggest taking some individual action right out of the army psy-ops manual (actually, it's also from a Sopranos episode).

  1. Find some big-ass speakers
  2. Find the biggest amp you can
  3. Place speakers in window, point out at park.
  4. Find the single most annoying recording you can, and play it at volume 11, over and over and over and over, day in and day out.  I might try "Turning Japanese" or maybe "I Want a Hippopotamus for Christmas."  Possibly the song they used to play over and over in FAO Schwarz stores, or "It's a Small World."   Or maybe something like a Joel Osteen sermon.  It almost doesn't matter once it's been repeated 12 times an hour for 3 days.

Health Care Trojan Horse for Fascism

I have been warning you: it's coming.  When the government pays the health care bills, it can then use that as an excuse to micro-regulate our every behavior.  Because it's no longer an individual choice, it affects public costs.

“Denmark finds every sort of way to increase our taxes,” said Alisa Clausen, a South Jutland resident. “Why should the government decide how much fat we eat? They also want to increase the tobacco price very significantly. In theory this is good — it makes unhealthy items expensive so that we do not consume as much or any and that way the health system doesn’t use a lot of money on patients who become sick from overuse of fat and tobacco.  However, these taxes take on a big brother feeling.  We should not be punished by taxes on items the government decides we should not use.”

As an aside, given that Scandinavians tend to have among the world's highest tolerances for taxes, when they get fed up, it must be getting bad.

Penn Jillette Awesomeness

Most of those who read the online libertarian rags have seen this, but it's awesome enough to bear repetition:

What makes me libertarian is what makes me an atheist -- I don't know. If I don't know, I don't believe. I don't know exactly how we got here, and I don't think anyone else does, either. We have some of the pieces of the puzzle and we'll get more, but I'm not going to use faith to fill in the gaps. I'm not going to believe things that TV hosts state without proof. I'll wait for real evidence and then I'll believe.

And I don't think anyone really knows how to help everyone. I don't even know what's best for me. Take my uncertainty about what's best for me and multiply that by every combination of the over 300 million people in the United States and I have no idea what the government should do.

President Obama sure looks and acts way smarter than me, but no one is 2 to the 300 millionth power times smarter than me. No one is even 2 to the 300 millionth times smarter than a squirrel. I sure don't know what to do about an AA+ rating and if we should live beyond our means and about compromise and sacrifice. I have no idea. I'm scared to death of being in debt. I was a street juggler and carny trash -- I couldn't get my debt limit raised, I couldn't even get a debt limit -- my only choice was to live within my means. That's all I understand from my experience, and that's not much.

It's amazing to me how many people think that voting to have the government give poor people money is compassion. Helping poor and suffering people is compassion. Voting for our government to use guns to give money to help poor and suffering people is immoral self-righteous bullying laziness.

People need to be fed, medicated, educated, clothed, and sheltered, and if we're compassionate we'll help them, but you get no moral credit for forcing other people to do what you think is right. There is great joy in helping people, but no joy in doing it at gunpoint.

Who is at the other end of the spectrum?  Well, how about Brad DeLong arguing for a return to technocratic rule by our betters:

America's best hope for sane technocratic governance required the elimination of the Republican Party from our political system as rapidly as possible.

Technocratic utopia is of course a mirage, a supreme act of hubris: the notion that any group of people could have the incentives or information required to manage the world top-down for us.  If I told an environmentalist that I wanted ten of the smartest biologists in the world to manage the Amazon top-down, changing the ratios of species and the courses of rivers and such in order to better optimize the rain forest, they would say I was mad.   Any such attempt would lead to disaster (just see what smart management has done for our US forests).  But the same folks will blithely advocate for top-down control of human economic activity.  The same folks who reject top-down creationism in favor of the emergent order of evolution reject the emergent order of markets and uncoerced human interaction in favor of top-down command and control.

More on technocrats here and here.

This is Absurd

It is folks like this who continue to want to score the stimulus solely based on employment created by stimulus projects, without considering the fact that someone was using the money for some productive purpose before the government took or borrowed it.

David Brin at the Daily Kos via the South Bend Seven

There is nothing on Earth like the US tax code. It is an extremely complex system that nobody understands well. But it is unique among all the complex things in the world, in that it's complexity is perfectly replicated by the MATHEMATICAL MODEL of the system. Because the mathematical model is the system.

Hence, one could put the entire US tax code into a spare computer somewhere, try a myriad inputs, outputs... and tweak every parameter to see how outputs change. There are agencies who already do this, daily, in response to congressional queries. Alterations of the model must be tested under a wide range of boundary conditions (sample taxpayers.) But if you are thorough, the results of the model will be the results of the system.

Now. I'm told (by some people who know about such things) that it should be easy enough to create a program that will take the tax code and cybernetically experiment with zeroing-out dozens, hundreds of provisions while sliding others upward and then showing, on a spreadsheet, how these simplifications would affect, say, one-hundred representative types of taxpayers.

South Bend Seven have a number of pointed comments, but I will just offer the obvious:  Only half of the tax calculation is rates and formulas.  The other half is the underlying economic activity (such as income) to which the taxes are applied.  Brin's thesis falls apart for the simple reason that economic activity, and particularly income, are not variables independent of the tax code.  In fact, economic activity can be extremely sensitive to changes in the tax code.

The examples are all around us -- the 1990 luxury tax tanked high-end boat sales.  The leveraged buyout craze of the 80's and the housing bubble of the 00's were both arguably fed in part by the tax code's preference for debt.  The entire existence of employer-paid (rather than individual-paid) health insurance is likely a result of the tax code.  And of course there are all the supply-side and incentive effects that Kos readers likely don't accept but exist nonetheless.
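The flaw in the "put the tax code in a computer" idea can be shown with a toy model. A spreadsheet that holds economic activity fixed will overstate revenue from a rate hike whenever the taxed activity shrinks in response. Every number below (the base rate, the activity level, the elasticity) is purely hypothetical:

```python
# Toy illustration of why a static model of the tax code mispredicts:
# revenue depends on economic activity, and activity responds to rates.
# All parameters are hypothetical, chosen only to show the mechanism.

def revenue(rate, base_activity, elasticity=0.0, base_rate=0.30):
    """Revenue = rate * activity, where activity shrinks as the rate
    rises above base_rate (linear response for simplicity)."""
    activity = base_activity * (1 - elasticity * (rate - base_rate))
    return rate * activity

base = 1000.0  # taxable activity at the current 30% rate, arbitrary units

static = revenue(0.40, base)                       # activity held fixed
behavioral = revenue(0.40, base, elasticity=1.5)   # activity responds

print(f"static projection:        {static:.0f}")
print(f"with behavioral response: {behavioral:.0f}")
```

The gap between the two outputs is exactly the feedback that a model of rates and formulas alone cannot capture, because income is an input the model treats as given.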

Great Moments in Alarmism

From March 21, 1996 (via Real Science)

Scientists studying Creutzfeldt-Jakob Disease in the field are still deeply divided about whether BSE can be transmitted to humans, and about the potentially terrifying consequences for the population.

"It's too late for adults, but children should not be fed beef. It is as simple as that," said Stephen Dealler, consultant medical microbiologist at Burnley General Hospital, who has studied the epidemic nature of BSE and its human form, Creutzfeldt-Jakob Disease, since 1988.

He believes that the infectious agent would incubate in children and lead to an epidemic sometime in the next decade.

"Any epidemic in humans would start about 15 years after that in cattle, and about 250,000 BSE-infected cows were eaten in 1990. There could be an epidemic of this new form in the year 2005. These 10 cases were probably infected sometime before the BSE epidemic started."

His worst case scenario, assuming a high level of infection, would be 10 million people struck down by CJD by 2010. He thought it was now "too late" to assume the most optimistic scenario of only about 100 cases.

One of the great things about the Internet is that it is going to be much easier to hold alarmists accountable for wild scare-mongering predictions that prove to be absurd.  Though, I suppose Paul Ehrlich still gets respect in some quarters despite being 0-for-every-prediction-he-has-ever-made, so maybe it's too much to hope for accountability.

Who Defused the Population Bomb?

Fred Pearce has a nice article (in Grist of all places) about how the Population Bomb essentially defused itself.

For a start, the population bomb that I remember being scared by 40 years ago as a schoolkid is being defused fast. Back then, most women round the world had five or six children. Today's women have just half as many as their mothers -- an average of 2.6. Not just in the rich world, but almost everywhere.

This is getting close to the long-term replacement level, which, allowing for girls who don't make it to adulthood, is around 2.3. Women are cutting their family sizes not because governments tell them to, but for their own good and the good of their families -- and if it helps the planet too, then so much the better....

And China. There, the communist government decides how many children couples can have. The one-child policy is brutal and repulsive. But the odd thing is that it may not make much difference any more. Chinese women round the world have gone the same way without compulsion. When Britain finally handed Hong Kong back to China in 1997, it had the lowest fertility in the world -- below one child per woman. Britain wasn't running a covert one-child policy. That was as many children as the women in Hong Kong wanted.

This is almost certainly one of those multiple-cause things, and we have always had the hypothesis that wealth and education reduced population growth.

But the author makes an interesting point: that urbanization, even in poorer countries, may be a big driver as well.  After all, in the city, food and living space for children are expensive, and there are fewer ways children can support the family (I hadn't thought of this before, but I wonder if industrial child-labor restrictions, which mainly affected cities, had an impact on birth rates by making urban children less lucrative).  In fact, urban jobs require educations, which are expensive (even if schooling is free, non-productive family members must be fed and housed for years).

Rorschach Test & Contempt of Cop

It is kind of an interesting exercise to compare the police account of this encounter with the video.   What do your eyes see?

I find it fascinating that so many commenters seem to believe that the police are entirely in the right to physically assault anyone who disses them.    One example:

For the 3 of you who commented above, I hope you never really need the cops.. You have no idea "what's called for" as you have no law enforcement training (watching "police academy" doesn't count). the Metro police go out there and do their job as best they can....

Bottom line, don't mouth off to cops or plan on carrying really good dental insurance.

Or this  (remember, all she did was use words):

She was told to leave, she left and came back and started in with the officer. Too bad for her, she asked for it.

Thanks, police, for making sure we don't ever have to encounter people in public who are not like ourselves:

Finally Metro does something right. I ride the Metro regularly and I am sick and tired of this type of behavior. As a senior citizen I get fed up by the unruly behavior of today's youth. ... As for the cop, thank you

Or this one, where it is implied that it is the state's duty to use physical violence to enforce etiquette:

What kind of home schooling did she have? Why is she acting like this? I can't have any pity for her. She needs to take her uncivilized behavior somewhere else. Show some respect please. It appears she has no respect for authority or right or wrong. I feel for her parents if they should see this. Shameful, just shameful. The cop seems to just be doing his job. All she had to do was shut her sailor mouth and act like an adult.

Those who don't show respect for the state will be tackled and taken to jail.  Metro police might as well come on over to my house and drag me away, because I have no respect for you either.

It pains me to admit that 30 years ago I was just such a "law and order" Conservative.  Bleh.

Good for Google

Hopefully this is true, but it appears that Google is fed up with Chinese hijinks and is considering either pulling out of the country or insisting on being more open and less filtered.  I have given Google a lot of grief here for enabling Chinese censorship, so kudos if they are starting to rethink their relationship with China.

These attacks and the surveillance they have uncovered -- combined with the attempts over the past year to further limit free speech on the web -- have led us to conclude that we should review the feasibility of our business operations in China. We have decided we are no longer willing to continue censoring our results on Google.cn, and so over the next few weeks we will be discussing with the Chinese government the basis on which we could operate an unfiltered search engine within the law, if at all. We recognize that this may well mean having to shut down Google.cn, and potentially our offices in China.

Charity - Not In My Backyard!

Via a reader, from the AZ Republic:

A Phoenix ordinance banning charity dining halls in residential neighborhoods withstood a challenge by a north-central Phoenix church.

Retired Arizona Supreme Court Justice Robert Corcoran, serving as a hearing officer, ruled Monday that feeding the homeless at a place of worship can be banned by city ordinance. The decision affects all Phoenix churches with underlying residential zoning.

Over the summer, city officials maintained that CrossRoads United Methodist Church, 7901 N. Central Ave., violated Phoenix zoning code by feeding the poor and homeless on its property, a use that can only occur in commercial or industrial zones.

You will be relieved to know that this has nothing to do with wealthy people fearing that their Xanax-induced equilibrium will be upset by actually seeing a poor person in their neighborhood.  We are assured of this by Paul Barnes, a "neighborhood activist" who presumably participated in the suit to stop the Church from holding pancake prayer-breakfasts:

"It's not a problem with homeless people in wealthy neighborhoods. That would be a matter of prejudice. This issue would be setting churches up to avoid zoning ordinances."

Wow, I am so relieved.  And we all know what a problem it is when churches are organized solely to evade zoning regulations.  Why, just last week the First Baptist Church and Gas Station as well as the United Methodist Church and Topless Bar opened right in my neighborhood.

You will be happy to know as well that the Constitution in no way limits the government when it wants to regulate your property:

In a 19-page opinion, [Judge] Corcoran said the city can restrict where the homeless and poor can be fed and that zoning regulations apply to everyone equally. Additionally, he said that trumping land-use regulations is not a constitutional right.

Whew - yet another assault on the rights of government bureaucrats has been bravely turned aside.

Update: More random embedding of ads by the Republic.  They are putting them between words in the paragraph now.  RRRRRR.  Hopefully it is gone now.

A Tribute to Norman Borlaug

Norman Borlaug, the founder and driving force behind the revolution in high-yield agriculture that Paul Ehrlich predicted was impossible, has died at the age of 95.  Like Radley Balko, I am struck by how uneventful his passing is likely to be in contrast to the homage paid to self-promoting seekers of power like Ted Kennedy, who never accomplished a tiny fraction of what Borlaug achieved.  Reason has a good tribute here.  Some excerpts:

In the late 1960s, most experts were speaking of imminent global famines in which billions would perish. "The battle to feed all of humanity is over," biologist Paul Ehrlich famously wrote in his 1968 bestseller The Population Bomb. "In the 1970s and 1980s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now." Ehrlich also said, "I have yet to meet anyone familiar with the situation who thinks India will be self-sufficient in food by 1971." He insisted that "India couldn't possibly feed two hundred million more people by 1980."

But Borlaug and his team were already engaged in the kind of crash program that Ehrlich declared wouldn't work. Their dwarf wheat varieties resisted a wide spectrum of plant pests and diseases and produced two to three times more grain than the traditional varieties. In 1965, they had begun a massive campaign to ship the miracle wheat to Pakistan and India and teach local farmers how to cultivate it properly. By 1968, when Ehrlich's book appeared, the U.S. Agency for International Development had already hailed Borlaug's achievement as a "Green Revolution."

In Pakistan, wheat yields rose from 4.6 million tons in 1965 to 8.4 million in 1970. In India, they rose from 12.3 million tons to 20 million. And the yields continue to increase. Last year, India harvested a record 73.5 million tons of wheat, up 11.5 percent from 1998. Since Ehrlich's dire predictions in 1968, India's population has more than doubled, its wheat production has more than tripled, and its economy has grown nine-fold. Soon after Borlaug's success with wheat, his colleagues at the Consultative Group on International Agricultural Research developed high-yield rice varieties that quickly spread the Green Revolution through most of Asia.

The contrast to Paul Ehrlich is particularly stunning.  Most folks have heard of Ehrlich and his prophecies of doom.  But Ehrlich has been wrong in his prophecies more times than anyone can count.  Borlaug fed a billion people while Ehrlich was making money and fame selling books saying that the billion couldn't be fed -- but few have even heard of Borlaug.  Today, leftists in power in the US and most European nations continue to reject Borlaug's approaches, and continue to revere Ehrlich (just this year, Obama chose a disciple of Ehrlich, John Holdren, as his Science czar).

Continuing proof that the world moves forward in spite of, rather than because of, governments.

Update: More here.

Update #2: Penn and Teller on Borlaug

Avoid Jericho, Arkansas at All Costs

Not many people have seen it, but one of my favorite movies is Interstate 60.  It has a story thread running through the movie, but what it really becomes is a series of essays on freedom and slavery.  One of the best parts is the town where everyone is a lawyer.  The only way anyone makes money is when someone breaks the law, so their laws are crafted such that it is impossible not to break the law.

The town of Jericho, Arkansas sounds very similar.  It has 174 residents, no businesses, but a police force of 6 that tries to find ways to support itself.  Apparently, everyone in town is constantly in court for traffic citations.  When one man got fed up and yelled at the police in court for their stupid speed traps, the police shot him - right in the courtroom.  In a scene right out of Interstate 60, the DA, after investigating the shooting, couldn't remember the name of the police officer who did the shooting and said no charges would be filed against the police, but that misdemeanor charges were being considered against the man shot.  Probably for littering, due to his bleeding on the floor.

Via Radley Balko (who else?)

Holy *$%&#%

This graph of the US money supply is un-freaking-believable.  Someone please tell me that this is a data error or something.  I guess this is one way to bail out borrowers -- if you create enough inflation, then the real value of principal owed drops.  Sure looks like it is time to borrow long at fixed rates.  Are real interest rates about to go negative?

money

Via Phil Miller

By the way, this really gives the lie to the whole government stimulus effort.  They may be moving large amounts of money around, but they can't create value, and in the absence of real value creation all they are doing is inflating the currency.

In Praise of Price Gouging

As I have pointed out any number of times, when supplies of something are short, you can allocate them either by price or by rationing.  Robert Rapier, via Michael Giberson, made the point that combining shortages with tough state price-gouging laws inevitably leads to rationing and long lines:

Someone asked during a panel discussion at ASPO whether we were going to have rationing by price. I answered that we are having that now. But prices aren't going up nearly as much as you would expect during these sorts of severe shortages. Why? I think it's a fear that dealers have of being prosecuted for gouging. So, they keep prices where they are, and they simply run out of fuel when the deliveries don't arrive on time. If they were allowed to raise prices sharply, people would cut back on their driving and supplies would be stretched further.

Neal Boortz made the same point yesterday, as the gas shortages in the southeast dragged out (unsurprisingly) for a second week:

nearly 200 gas stations in Atlanta are being investigated for price gouging.  Don't investigate them!  Reward them!  Price gouging is exactly what we need!  It should be encouraged, not investigated....

The real problem now is panic buying.  People will run their tanks down by about one-third and then rush off to a gas station.  Lines of cars are following gas tanker trucks around Atlanta. The supplies are coming back up, but as long as people insist on keeping every car they own filled to the top and then filling a few gas cans to boot, we're going to have these outages and these absurd lines.

So, how do you stop the panic buying?  Easy.  You let the market do what the market does best, control demand and supply through the price structure.  The demand for gas outstrips the supply right now, so allow gas stations to respond by raising the price of gas .. raise it as much as they want.  I'm serious here so stop your screaming.  The governor should hold a press conference and announce that effective immediately there is no limit on what gas stations can charge for gas.  I heard that there was some gas station in the suburbs charging $8.00 a gallon.  Great!  That's what they all should be doing.  Right now the price of gasoline in Atlanta is artificially low and being held down by government.  That's exacerbating the problem, not helping it.  Demand is not being squelched by price.

As the prices rise, the point will be reached where people will say "I'm fed up with this.  I'll ride with a friend, take the bus or just sit home before I'll pay this for a gallon of gas."  Once the price of a gallon starts to evoke that kind of reaction, we're on our way to solving the problem.  When gas costs, say, $8.00 people aren't going to fill their tanks.  They also aren't going to rush home to get their second car and make sure it is filled up either ... and you can forget them filling those portable gas cans they have in the trunk.  Some people will only be able to afford maybe five gallons!  Fine!  That leaves gas in the tanks for other motorists.  Bottom line here is that people aren't going to rush out to fill up their half-empty tanks with $8.00 gas.

Here is something else to think about regarding lines and shortages.  What is the marginal value of your time?  I think most people underestimate this in their day-to-day transactions.  Some will say it is whatever they make an hour at work, and that is OK, but I will bet you that is low for most folks.  Most folks would not choose to work one more hour a week for their average hourly rate.  Start eating into my free time and family time, and my cost goes up.  That's why overtime rates are higher.

So let's say an individual values his/her time at the margin at $25 an hour.  This means that an hour spent waiting in line or driving around town searching to fill up with 10 gallons raises the cost by $2.50 a gallon.  And this does not include the fuel or other wear on the car used in the search.  Or the cost of that sales meeting you missed because you did not have the gas to get there.  So an anti-gouging law that keeps prices temporarily down by a dollar or so a gallon may actually cost people much more through the shortages it creates.
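To make the arithmetic concrete, here is a quick back-of-the-envelope sketch in Python.  The $25/hour and 10 gallons are the figures from the paragraph above; the $3.50 posted price is a hypothetical I picked for illustration:

```python
def effective_price(posted, gallons, hours_queued, value_of_time):
    """Posted pump price plus the cost of your time, spread over the fill-up."""
    return posted + (hours_queued * value_of_time) / gallons

# An hour of hunting for gas, with time worth $25/hour, adds $2.50 to each of
# 10 gallons -- so a hypothetical $3.50 posted price really costs $6.00 a gallon.
print(effective_price(3.50, 10, 1.0, 25.0))  # -> 6.0
```

The point generalizes: any cap that holds the posted price a dollar below market while creating an hour of queueing has made gas more expensive, not less.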

Solar Concentrating Plants

For a while, I have been writing that traditional silicon/germanium based solar-electric panels are not yet economic as an electricity source.

I have hopes for other technologies eventually making direct solar conversion to electricity.  However, there seems to be some activity in solar concentrating plants, where solar energy is reflected onto tubes to boil water and drive traditional steam turbines to generate electricity.  Fortune has an article on one such plant opening recently:

The completed solar arrays will be trucked to California where Ausra is building a 177-megawatt solar power station for utility PG&E (PCG) on 640 acres of agricultural land in San Luis Obispo County. (To see a video of the robots in action, click here.) The arrays focus sunlight on water-filled tubes to create steam to drive a turbine. Ausra manufacturing exec David McKay points to where standard-issue boiler pipe will be fed into a machine and treated with a proprietary coating that transforms it into a solar receiver.

I would love for this to work, but the article goes on to say that this approach still requires federal tax subsidies to compete with other electricity sources.  I am not very familiar with the economics of such plants.  Does anyone have a link or source that delves into the economics?  I am increasingly frustrated of late with alternative energy articles that fail to give any of the relevant economic info.  For example, I read an article in the Arizona Republic (sorry, lost the link) about Arizona's first wind project, but I could not get a sense from the article of whether the power was being purchased at market rates or at some special inflated rate.

A Window into the Reality-Based Community

Kevin Drum links to a blog called Three-Toed Sloth in a post about why our climate future may be even worse than the absurdly cataclysmic forecasts we are getting today in the media.  Three-Toed Sloth advertises itself as "Slow Takes from the Canopy of the Reality-Based Community."  His post is an absolutely fabulous example of how one can write an article where nearly every line is literally true, but the conclusion can still be dead wrong because one tiny assumption at the beginning of the analysis was incorrect.  (In this case, "incorrect" may be generous, since the author seems well-versed in the analysis of chaotic systems.  A better word might be "purposely fudged to make a political point.")

He begins with this:

The climate system contains a lot of feedback loops.  This means that the ultimate response to any perturbation or forcing (say, pumping 20 million years of accumulated fossil fuels into the air) depends not just on the initial reaction, but also how much of that gets fed back into the system, which leads to more change, and so on.  Suppose, just for the sake of things being tractable, that the feedback is linear, and the fraction fed back is f.  Then the total impact of a perturbation J is

J + Jf + Jf^2 + Jf^3 + ...

The infinite series of tail-biting feedback terms is in fact a geometric series, and so can be summed up if f is less than 1:

J/(1-f)

So far, so good.  The math here is entirely correct.  He goes on to make this point, arguing that if we are uncertain about  f, in other words, if there is a distribution of possible f's, then the range of the total system gain 1/(1-f) is likely higher than our intuition might first tell us:

If we knew the value of the feedback f, we could predict the response to perturbations just by multiplying them by 1/(1-f) -- call this G for "gain".  What happens, Roe and Baker ask, if we do not know the feedback exactly?  Suppose, for example, that our measurements are corrupted by noise -- or even, with something like the climate, that f is itself stochastically fluctuating.  The distribution of values for f might be symmetric and reasonably well-peaked around a typical value, but what about the distribution for G?  Well, it's nothing of the kind.  Increasing f just a little increases G by a lot, so starting with a symmetric, not-too-spread distribution of f gives us a skewed distribution for G with a heavy right tail.

Again all true, with one small unstated proviso I will come back to.  He concludes:

In short: the fact that we will probably never be able to precisely predict the response of the climate system to large forcings is so far from being a reason for complacency it's not even funny.

Actually, I can think of two unstated facts that undermine this analysis.  The first is that most catastrophic climate forecasts you see utilize gains in the 3x-5x range, or sometimes higher (but seldom lower).  This implies they are using an f of between .67 and .80.  These are already very high numbers for any natural process.  If catastrophist climate scientists are already assuming numbers at the high end of the range, then the point about uncertainties skewing the gain disproportionately higher is moot.  In fact, we might draw the reverse conclusion: the saw cuts both ways.  His analysis also implies that small overstatements of f, when the forecasts are already skewed to the high side, will lead to very large overstatements of gain.

But here is the real elephant in the room: for the vast, vast majority of natural processes, f is less than zero.  The author has blithely accepted the currently unproven assumption that the net feedback in the climate system is positive.  He never even hints at the possibility that f might be a negative feedback rather than positive, despite the fact that almost all natural processes are dominated by negative rather than positive feedback.  Assuming without evidence that a random natural process one encounters is dominated by positive feedback is roughly equivalent to assuming the random person you just met on the street is a billionaire.  It is not totally out of the question, but it is very, very unlikely.

When one plugs a negative f, say -0.3, into the equation above, the gain actually becomes less than one, in this case about 0.77.  In a negative feedback regime, the system response is actually less than the initial perturbation, because forces exist in the system to damp the initial input.
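All of this is easy to verify numerically.  The few lines of Python below (with illustrative values of f that I chose for the example) check that the geometric series really sums to 1/(1-f), show how a symmetric error band in f produces a lopsided band in G, and confirm that a negative f damps the perturbation:

```python
def gain(f):
    """Total system gain G = 1/(1-f) for a linear feedback fraction f (|f| < 1)."""
    return 1.0 / (1.0 - f)

# The series J + Jf + Jf^2 + ... converges to J/(1-f):
J, f = 1.0, 0.5
series = sum(J * f**k for k in range(200))  # 200 terms is effectively infinite
assert abs(series - J * gain(f)) < 1e-9

# A symmetric +/-0.05 band around f = 0.70 is anything but symmetric in G:
print(gain(0.65), gain(0.70), gain(0.75))   # ~2.86, ~3.33, 4.0

# And a negative feedback, f = -0.3, damps the input rather than amplifying it:
print(gain(-0.3))                           # ~0.77
```

Note how the upside error (+0.67 in G) is already larger than the downside error (-0.48) even at this modest spread -- that is the "heavy right tail" the author describes, and it only matters if f really is large and positive to begin with.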

The author is trying to argue that uncertainty about the degree of feedback in the climate system, and therefore the sensitivity of the system to CO2 changes, does not change the likelihood of the coming "catastrophe."  Except that he fails to mention that we are so uncertain about the feedback that we don't even know its sign.  Feedback, or f, could be positive or negative as far as we know.  Values could range anywhere from -1 to 1.  We don't have good evidence as to where the exact number lies, except to observe from the relative stability of past temperatures over a long time frame that the number probably is not at the high positive end of this range.  Data from climate response over the last 120 years seems to point to a number close to zero or slightly negative, in which case the author's entire post is irrelevant.  In fact, it turns out that the climate scientists who make the news are all clustered around the least likely guesses for f, i.e., values greater than 0.6.

Incredibly, while refusing to even mention the Occam's Razor solution that f is negative, the author seriously entertains the notion that f might be one or greater.  For such values, the gain shoots to infinity and the system goes wildly unstable  (nuclear fission, for example, is an f>1 process).  In an f>1 world, lightly tapping the accelerator in our car would send us quickly racing up to the speed of light.  This is an ABSURD assumption for a system like climate that is long-term stable over tens of millions of years.  A positive feedback f>=1 would have sent us to a Venus-like heat or Mars-like frigidity eons ago.

A summary of why recent historical empirical data implies low or negative feedback is here.  You can learn more on these topics in my climate video and my climate book.  To save you the search, the section of my movie explaining feedbacks, with a nifty live demonstration from my kitchen, is in the first three and a half minutes of the clip below:

Chapter 4: The Historical Evidence for Man-made Global Warming (Skeptics Guide to Global Warming)

The table of contents for the rest of this paper, A Layman's Guide to Anthropogenic Global Warming (AGW), is here.  A free pdf of this climate skepticism paper is here, and a print version is sold at cost here.

I mentioned earlier that there is little or no empirical evidence directly linking increasing CO2 to the current temperature changes in the Earth (at least outside of the lab), and even less, if that is possible, linking man's contribution to CO2 levels to global warming.  It is important to note that this lack of empirical data is not at all fatal to the theory.  For example, there is a thriving portion of the physics community developing string theory in great detail, without any empirical evidence whatsoever that it is a correct representation of reality.  Of course, it is a bit difficult to call a theory with no empirical proof "settled" and, again using the example of string theory, no one in the physics community would seriously call string theory a settled debate, despite the fact that it has been raging at least twice as long as the AGW debate.

One problem is that AGW is a fairly difficult proposition to test.  For example, we don't have two Earths such that we could use one as the control and one as the experiment.  Beyond laboratory tests, which have only limited usefulness in explaining the enormously complex global climate, most of the attempts to develop empirical evidence have involved trying to develop and correlate historical CO2 and temperature records.  If such records could be developed, then temperatures could be tested against CO2 and other potential drivers to find correlations.  While there is always a danger of finding false causation in correlations, a strong historical temperature-CO2 correlation would certainly increase our confidence in AGW theory.

Five to seven years ago, climate scientists thought they had found two such smoking guns: one in ice core data going back 650,000 years, and one in Mann's hockey stick using temperature proxy data going back 1,000 years.  In the last several years, substantial issues have arisen with both of these analyses, though this did not stop Al Gore from using both in his 2006 film.

Remember what we said early on.  The basic "proof" of anthropogenic global warming theory outside the laboratory is that CO2 rises have at least a loose correlation with warming, and that scientists "can't think of anything else" that might be causing warming other than CO2.

The long view (650,000 years)

When I first saw it years ago, I thought one of the more compelling charts from Al Gore's PowerPoint deck, which was made into the movie An Inconvenient Truth, was the six-hundred-thousand-year close relationship between atmospheric CO2 levels and global temperature, as discovered in ice core analysis.  Here is Al Gore with one of those great Really Big Charts.

If you are connected to the internet, you can watch this segment of Gore's movie at YouTube.  I will confess that this segment is incredibly powerful -- I mean, what kind of Luddite could argue with this Really Big Chart?

Because it is hard to read in the movie, here is the data set that Mr. Gore is drawing from, taken from page 24 of the recent fourth IPCC report.

Unfortunately, things are a bit more complicated than presented by Mr. Gore and the IPCC.  In fact, Gore is really, really careful how he narrates this piece.  That is because, by the time this movie was made, scientists had been able to study the ice core data a bit more carefully.  When they first measured the data, their time resolution was pretty coarse, so the two lines looked to move together.  However, with better laboratory procedure, the ice core analysts began to find something funny.  It turns out that for any time they looked at in the ice core record, temperatures actually increased on average 800 years before CO2 started to increase.  When event B occurs after event A, it is really hard to argue that B caused A.

So what is really going on?  Well, it turns out that most of the world's CO2 is actually not in the atmosphere; it is dissolved in the oceans.  When global temperatures increase, the oceans give up some of their CO2, outgassing it into the atmosphere and increasing atmospheric concentrations.  Most climate scientists today (including AGW supporters) agree that some external force (the sun, changes in the Earth's tilt and rotation, etc.) caused an initial temperature increase at the beginning of the temperature spikes above, which was then followed by an increase in atmospheric CO2 as the oceans heated up.

What scientists don't agree on is what happens next.  Skeptics tend to argue that whatever caused the initial temperature increase drives the whole cycle.  So, for example, if the sun caused the initial temperature increase, it also drove the rest of the increase in that cycle.  Strong AGW supporters, on the other hand, argue that while the sun may have caused the initial temperature spike and outgassing of CO2 from the oceans, further temperature increases were caused by the increases in CO2.

The AGW supporters may or may not be right about this two-step approach.  However, as you can see, the 800-year lag substantially undercuts the ice core data as empirical proof that CO2 is the main driver of global temperatures, and completely disproves the hypothesis that CO2 is the only key driver of global temperatures.  We will return to this 800-year lag and these two competing explanations later when we discuss feedback loops.

The medium view (1000 years)

Until about 2000, the dominant reconstruction of the last 1000 years of global temperatures was similar to that shown in this chart from the 1990 IPCC report:

1000yearold

There are two particularly noticeable features on this chart.  The first is what is called the "Medieval Warm Period", peaking in the 13th century, and thought (at least 10 years ago) to be warmer than our climate today.  The second is the "Little Ice Age", which ended at about the beginning of the industrial revolution.  Climate scientists built this reconstruction with a series of "proxies", including tree rings and ice core samples, which (they hope) exhibit properties that are strongly correlated with historical temperatures.

However, unlike the 650,000-year reconstruction, scientists have another confirmatory source for this period: written history.  Historical records (at least in Europe) clearly show that the Middle Ages were unusually warm, with long growing seasons and generally rich harvests (someone apparently forgot to tell Medieval farmers that they should have had smaller crops in warmer weather).  In Greenland, we know that Viking farmers settled during what was a much warmer period in Greenland than we have today (thus the oddly inappropriate name for the island) and were eventually driven out by falling temperatures.  There are even clearer historical records for the Little Ice Age, including accounts of the Thames in London and the canals in Amsterdam freezing on an annual basis, something that happened seldom before or since.

Of course, these historical records are imperfect.  For example, our written history for this period only covers a small percentage of the world's land mass, and land only covers a small percentage of the world's surface.  Proxies, however, have similar problems.  For example, tree rings can only come from a few trees that cover only a small part of the Earth's surface.  After all, it is not every day you bump into a tree that is a thousand years old (and that anyone will let you cut down to look at the rings).  In addition, tree ring growth can be covariant with more than just temperature (e.g. precipitation); in fact, as we continue to study tree rings, we actually find tree ring growth diverging from values we might expect given current temperatures (more on this in a bit).

Strong AGW supporters found the 1990 IPCC temperature reconstruction shown above awkward for their cause.  First, it seemed to indicate that current higher temperatures were not unprecedented, and even coincided with times of relative prosperity.  Further, it seems to show that global temperatures fluctuate widely and frequently, thus begging the question of whether current warming is just a natural variation, an expected increase emerging from the Little Ice Age.

So along comes strong AGW proponent (and RealClimate.org founder) Michael Mann of the University of Massachusetts.  Mann electrified the climate world, and really the world as a whole, with his revised temperature reconstruction, shown below, and called "the Hockey Stick."

1000yearold

Gone was the Little Ice Age.  Gone was the Medieval Warm Period.  His new reconstruction shows a remarkably stable, slightly downward-trending temperature record that leaps upward in 1900.  Looking at this chart, who could doubt that our current global climate experience was something unusual and unprecedented?  It is easy to look at this chart and say -- wow, that must be man-made!

In fact, the hockey stick chart was used by AGW supporters in just this way.  Surely, after a period of stable temperatures, the 20th-century jump is an anomaly that seems to point its finger at man (though if one stops the chart at 1950, before the period of AGW, the chart, interestingly, is still a hockey stick, though with only natural causes).

Based on this analysis, Mann famously declared that the 1990s were the warmest decade in a millennium and that "there is a 95 to 99% certainty that 1998 was the hottest year in the last one thousand years."  (By the way, Mann now denies he ever made this claim, though you can watch him say these exact words in the CBC documentary Global Warming: Doomsday Called Off.)  If this is not hubris enough, USA Today published a graphic, based on Mann's analysis and still online as of this writing, which purports to show the world's temperature within .0001 degree for every year going back two thousand years!

To reconcile historical written records with this new view of climate history, AGW supporters argue that the Medieval Warm Period (MWP) was limited only to Europe and the North Atlantic (e.g. Greenland), and that in fact the rest of the world may not have been warmer.  Ice core analyses have in fact verified a MWP in Greenland, but show no MWP in Antarctica (though, as I will show later, Antarctica is not warming yet in the current warm period, so perhaps Antarctic ice samples are not such good evidence of global warming).  AGW supporters, then, argue that our prior belief in a MWP was based on written records that are by necessity geographically narrowly focused.  Of course, climate proxy records are not necessarily much better.  For example, from the fourth IPCC report, page 55, here are the locations of proxies used to reconstruct temperatures in AD 1000:

As seems to be usual in these reconstructions, there were a lot of arguments among scientists about the proxies Mann used and, just as important, chose not to use.  I won't get into all that except to say that many other climate archaeologists did not and do not agree with his choice of proxies and still support the existence of a Little Ice Age and a Medieval Warm Period.  There also may be systematic errors in the use of these proxies, which I will get to in a minute.

But some of Mann's worst failings were in the realm of statistical methodology.  Even as a layman, I was immediately able to see a problem with the hockey stick: it shows a severe discontinuity or inflection point at the exact same point that the data source switches between two different data sets (i.e. from proxies to direct measurement).  This is quite problematic.  Syun-Ichi Akasofu makes the observation that when you don't try to splice these two data sets together, and just look at one (in this case, proxies from Arctic ice core data as well as actual Arctic temperature measurements), the result is that the 20th-century warming in fact appears to be part of a 250-year linear trend, a natural recovery from the Little Ice Age (the scaling for the ice core data at top is a chemical composition variable thought to be proportional to temperature).

However, the real bombshell was dropped on Mann's work by a couple of Canadian scientists named Stephen McIntyre and Ross McKitrick (M&M).  M&M had to fight an uphill battle, because Mann resisted their third-party review of his analysis at every turn, and tried to deny them access to his data and methodology, an absolutely unconscionable violation of the principles of science (particularly publicly funded science).  M&M got very good at filing Freedom of Information Act requests (or the Canadian equivalent).

Eventually, M&M found massive flaws with Mann's statistical approach, flaws that have since been confirmed by many experts, such that there are few people today who treat Mann's analysis seriously (at best, his supporters defend his work with a mantra roughly akin to "fake but accurate").  I'll quote the MIT Technology Review for M&M's key finding:

But now a shock: Canadian scientists Stephen
McIntyre and Ross McKitrick have uncovered a fundamental mathematical flaw in
the computer program that was used to produce the hockey stick. ...

[Mann's] improper normalization procedure tends to
emphasize any data that do have the hockey stick shape, and to suppress all
data that do not. To demonstrate this effect, McIntyre and McKitrick created
some meaningless test data that had, on average, no trends. This method of
generating random data is called Monte Carlo analysis, after the famous casino,
and it is widely used in statistical analysis to test procedures. When
McIntyre and McKitrick fed these random data into the Mann procedure, out
popped a hockey stick shape!
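The effect M&M describe can be sketched in a few lines.  This is not Mann's actual code -- just a simplified stand-in illustrating why "short centering" matters: the full-period mean is the value that minimizes a series' sum of squared anomalies, so centering each random series on a short calibration window can only inflate the apparent variance that PCA-style procedures use to weight each series.

```python
import random

random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

def sum_sq(xs, m):
    return sum((x - m) ** 2 for x in xs)

n_series, n_years, cal_window = 50, 1000, 80
decentered_bigger = 0
for _ in range(n_series):
    # trendless "proxy": a random walk with no built-in hockey stick
    steps = [random.gauss(0, 1) for _ in range(n_years)]
    x, total = [], 0.0
    for s in steps:
        total += s
        x.append(total)
    full_mean = mean(x)               # proper, full-period centering
    cal_mean = mean(x[-cal_window:])  # short "calibration-window" centering
    # the full-period mean minimizes sum of squares, so short centering
    # can only inflate a series' apparent variance -- exactly the quantity
    # a PCA-style procedure uses to weight each series
    if sum_sq(x, cal_mean) >= sum_sq(x, full_mean):
        decentered_bigger += 1

print(decentered_bigger)  # 50 -- every random series gets inflated
```

Series that happen to drift during the calibration window get inflated the most, which is how trendless noise can end up dominating the reconstruction's first principal component.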

A more complete description of problems with the Mann hockey stick can be found at this
link.  Recently, a US Congressional Committee asked a group of
independent statisticians led by Dr. Edward Wegman, Chair of the National
Science Foundation's Statistical Sciences Committee, to evaluate the Mann
methodology.  Wegman et al. savaged the Mann methodology as well as the
peer review process within the climate community.  From their findings:

It is important to note the isolation of the
paleoclimate community; even though they rely heavily on statistical methods
they do not seem to be interacting with the statistical community.
Additionally, we judge that the sharing of research materials, data and results
was haphazardly and grudgingly done. In this case we judge that there was too
much reliance on peer review, which was not necessarily independent. Moreover,
the work has been sufficiently politicized that this community can hardly
reassess their public positions without losing credibility. Overall, our committee
believes that Dr. Mann's assessments that the decade of the 1990s was the
hottest decade of the millennium and that 1998 was the hottest year of the
millennium cannot be supported by his analysis.

In 2007, the IPCC released its new climate report, and the
hockey stick, which was the centerpiece bombshell of the 2001 report, and which
was the "consensus" reconstruction of this "settled" science, can hardly be
found.  There is nothing wrong with errors in science; in fact, science is
sometimes advanced the most when mistakes are realized.  What is worrying
is the unwillingness by the IPCC to acknowledge a mistake was made, and to try
to learn from that mistake.  Certainly the issues raised with the hockey
stick are not mentioned in the most recent IPCC report, and an opportunity to
be a bit introspective on methodology is missed.  M&M, who were ripped
to shreds by the global warming community for daring to question the hockey
stick, are never explicitly vindicated in the report.  The climate
community slunk away rather than acknowledging error.

In response to the problems with the Mann analysis, the IPCC
has worked to rebuild confidence in its original conclusion (i.e. that recent
years are the hottest in a millennium) using the same approach it often
does:  When one line on the graph does not work, use twelve: 

As you can see, most of these newer analyses actually outdo
Mann by showing current warming to be even more pronounced than in the past
(Mann is the green line near the top).  This is not an unusual phenomenon
in global warming, as new teams try to outdo each other (for fame and funding)
in the AGW sales sweepstakes.  Just as you can tell the newest climate
models by which ones forecast the most warming, one can find the most recent
historical reconstructions by which ones show the coldest past. 

Where to start?  Well, first, we have the same problem
here that we have in Mann:  Recent data from an entirely different data
set (the black line) has been grafted onto the end of proxy data.  Always
be suspicious of inflection points in graphs that occur exactly where the data
source has changed.  Without the black line from an entirely different data set grafted on, the data would not form a hockey stick, or show anything particularly anomalous about the 20th century.  Notice also a little trick, by the way -- observe how
far the "direct measurement" line has been extended.  Compare this to the
actual temperatures in the charts above.  The authors have taken the
liberty of extending the line at least 0.2 degrees past where it actually should
be to make the chart look more dramatic.

There are, however, some skeptics conclusions that can be
teased out of this data, and which the IPCC completely ignores.  For
example, as more recent studies have deepened the little ice age around
1600-1700, the concurrent temperature recovery is steeper (e.g. Hegerl 2007 and
Moberg 2005) such that without the graft of the black line, these proxies make
the 20th century look like part of the fairly linear temperature
increase since 1700 or at least 1800.

But wait: without that black line grafted on, it looks like the
proxies actually level off in the 20th century!  From
the proxy data alone, the 20th century looks nearly
flat.  This effect would have been even more dramatic if lead
author Briffa hadn't taken extraordinary liberties with the data in his
study.  Briffa (who replaced Mann as the lead author on this section
for the Fourth Report) in 2001 initially showed proxy-based temperatures falling
in the last half of the 20th century, until he dropped a bunch of
data points out by truncating the line around 1950.  Steve McIntyre has
reconstructed the original Briffa analysis below without the truncation (pink
line is measured temperatures, green line is Briffa's proxy data).  Oops.

Note that this ability to just drop out data that does not
fit is NOT a luxury studies have in the era before the temperature record
existed.  By the way, if you are wondering if I am being fair to Briffa,
here is his explanation for why he truncated:

In the absence of a substantiated
explanation for the decline, we make the assumption that it is likely to be a
response to some kind of recent anthropogenic forcing. On the basis of this
assumption, the pre-twentieth century part of the reconstructions can be
considered to be free from similar events and thus accurately represent past
temperature variability.

Did you get that?  "Likely to be a response to some
kind of recent anthropogenic forcing."  Of course, he does not know what
that forcing on his tree rings is and can't prove this statement, but he throws
the data out none-the-less.  This is the editor and lead author for the
historical section of the IPCC report, who clearly has anthropogenic effects on
the brain.  Later studies avoided Briffa's problem by cherry-picking data
sets to avoid the same result.

We'll get back to this issue of the proxies diverging from
measured temperatures in a moment.  But let's take a step back and ask,
"So should 12 studies telling the same story (at least once they are truncated
and 'corrected') make us more confident in the answer?"  It is at this
point that it is worth making a brief mention of the concept of "systematic
error."   Imagine the problem of timing a race.  If one feared
that any individual might make a mistake in timing the race, he could get say
three people to time the race simultaneously, and average the results.
Then, if in a given race, one person was a bit slow or fast on the button, his
error might be averaged out with the other two for a result hopefully closer to
the correct number.  However, let's say that all three are using the same
type of watch, and this type of watch always runs slow.  In this case, no number
of extra observers is going to make the answer any better -- all the times will
be too low.  This latter type of error is called systematic error, and is
an error that, due to some aspect of a shared approach or equipment or data
set, multiple people studying the same problem can end up with the same error.
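The race-timing analogy is easy to simulate.  A quick sketch (all numbers made up): with independent random errors, averaging three watches converges on the true time; with a shared slow-watch bias, the average inherits the bias no matter how many races we run.

```python
import random

random.seed(42)
TRUE_TIME = 60.0   # the actual race time, seconds (made-up number)
N_RACES = 10_000

def timed(bias, jitter=0.3):
    # one watch's reading: truth, plus the watch's bias, plus random error
    return TRUE_TIME + bias + random.gauss(0, jitter)

# Case 1: independent random errors only -- averaging three watches helps
avg_random = sum((timed(0) + timed(0) + timed(0)) / 3
                 for _ in range(N_RACES)) / N_RACES

# Case 2: all three watches share the same half-second-slow bias --
# no amount of averaging can recover the true time
avg_systematic = sum((timed(-0.5) + timed(-0.5) + timed(-0.5)) / 3
                     for _ in range(N_RACES)) / N_RACES

print(round(avg_random, 1))      # ~60.0
print(round(avg_systematic, 1))  # ~59.5 -- the shared bias survives
```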

There are a couple of basic approaches that all of these
studies share.  For example, they all rely heavily on the same tree ring
proxies (in fact the same fifty or sixty trees), most of which are of one species
(bristlecone pine).  Scientists look at a proxy, such as tree rings, and
measure some dimension for each year.  In this case, they look at the tree
growth.  They compile this growth over hundreds of years, and get a data
set that looks like 1999: 0.016mm, 1998: 0.018mm, etc.  But how does
that correlate to temperature?  What they do is pick a period, something like
1960-1990, look at the data, and say "we know temperatures averaged X over
this window.  Since the tree rings grew Y, we will use a scaling
factor of X/Y to convert our 1000 years of tree ring data to
temperatures."
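The X/Y conversion described above reduces to a one-line calculation.  Here is a toy sketch -- the ring widths, the 14.0 C calibration temperature, and the 0.017mm calibration growth are all made-up numbers, purely to illustrate the step:

```python
# All figures hypothetical -- for illustrating the X/Y scaling step only.
ring_widths_mm = {1998: 0.018, 1999: 0.016}   # a slice of a ~1000-year record

avg_temp_cal = 14.0    # X: assumed avg temperature over the calibration window, C
avg_ring_cal = 0.017   # Y: assumed avg ring growth over the same window, mm

scale = avg_temp_cal / avg_ring_cal   # the single X/Y scaling factor

# the same factor is then applied to every year of proxy data
reconstructed_c = {yr: w * scale for yr, w in ring_widths_mm.items()}
print({yr: round(t, 1) for yr, t in reconstructed_c.items()})
# {1998: 14.8, 1999: 13.2}
```

Note that one multiplicative factor, fitted over a couple of decades, gets applied to the entire millennium of data.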

I can think of about a million problems with this.
First and foremost, you have to assume that temperature is the ONLY driver for
the variation in tree rings.  Drought, changes in the sun, changing soil
composition or chemistry,  and even CO2 concentration substantially affect
the growth of trees, making it virtually impossible to separate out temperature
from other environmental effects in the proxy.

Second, one is forced to assume that the scaling  of
the proxy is both linear and constant.  For example, one has to assume a
change from, say, 25 to 26 degrees has the same relative effect on the proxy as
a change from 30 to 31 degrees.  And one has to assume that this scaling
is unchanged over a millennium.  And if one doesn't assume the scaling is
linear, then one has the order-of-magnitude harder problem of deriving the
long-term shape of the curve from only a decade or two of data.  For a
thousand years, one is forced to extrapolate this scaling factor from just one
or two percent of the period.
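The linearity assumption is easy to stress-test with a toy model.  Suppose (purely hypothetically -- this is not a real tree-ring response curve) the proxy's true response to temperature saturates, but we fit a linear scaling over a narrow calibration range and then extrapolate:

```python
import math

def growth(temp_c):
    # hypothetical saturating proxy response (not a real tree-ring model)
    return math.tanh(temp_c / 10.0)

# fit a linear scale over a narrow 14-15 C calibration range
slope = (growth(15.0) - growth(14.0)) / (15.0 - 14.0)

def linear_reconstruct(g):
    # invert the fitted straight line around the calibration point
    return 14.0 + (g - growth(14.0)) / slope

# inside the calibration range, the linear fit is nearly exact...
err_inside = abs(linear_reconstruct(growth(14.5)) - 14.5)
# ...but extrapolated to a much warmer past, it misses by degrees
err_outside = abs(linear_reconstruct(growth(25.0)) - 25.0)

print(err_inside < 0.05, err_outside > 1.0)  # True True
```

The fit looks fine over the two percent of the record used for calibration, while silently going wrong everywhere else -- which is precisely the extrapolation problem.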

But here is the problem, and a potential source for
systematic error affecting all of these studies:  Current proxy data is
wildly undershooting predictions of temperatures over the last 10-20
years.  In fact, as we learned above, the proxy data actually shows little
or no 20th century warming.  Scientists call this "divergence"
of the proxy data.  If Briffa hadn't artificially truncated his data
at 1950, the effect would be even more dramatic.  Below is a magnification
of the spaghetti chart from above -- remember the black line is "actual," the
other lines are the proxy studies.


In my mind, divergence is quite damning.  It implies
that scaling derived from 1960-1980 can't even hold up for the next decade,
much less going back 1000 years!  And if proxy data today can be
undershooting actual temperatures (by a wide margin) then it implies it could
certainly be undershooting reality 700 years ago.  And recognize that I am
not saying one of these studies is undershooting -- they almost ALL are
undershooting, meaning they may share the same systematic error.  (It
could also mean that measured surface temperatures are biased high, which we
will address a bit later.)

The short view (100 years)

The IPCC reports that since 1900, the world's surface has
warmed about 0.6C, a figure most folks will accept (with some provisos I'll get
to in a minute about temperature measurement biases).  From
the NOAA Global Time Series:

[Chart: NOAA global surface temperature time series]

This is actually about the same data in the Mann hockey stick chart -- it
only looks less frightening here (or more frightening in Mann) due to the
miracle of scaling.  Next, we can overlay CO2:

[Chart: historic atmospheric CO2 concentration]

This chart is a real head-scratcher for scientists trying to
prove a causal relationship between CO2 and global temperatures.  By
theory, temperature increases from CO2 should be immediate, though the oceans
provide a big thermal sink that to this day is not fully understood.
However, from 1880 to 1910, temperatures declined despite a 15ppm increase in
CO2.  Then, from 1910 to 1940 there was another 15ppm increase in CO2 and
temperatures rose about 0.3 degrees.  Then, from 1940-1979, CO2 increased
by 30 ppm while temperatures declined again.  Then, from 1980 to present,
CO2 increased by 40 ppm and temperatures rose substantially.  By grossly
dividing these 125 years into these four periods, we see two long periods
totaling 70 years where CO2 increases but temperature declines and two long
periods totaling 55 years of both CO2 and temperature increases. 

By no means does this variation disprove a causal relation
between CO2 concentrations and global temperature.  However, it also can
be said that this chart is by no means a slam dunk testament to such a
relationship.  Here is how strong AGW supporters explain this data:
they assign most, but not all, of the temperature
increase before 1950 to "natural" or non-anthropogenic causes.  The current
IPCC report in turn assigns a high probability that much or all of the warming after
1950 is due to anthropogenic sources, i.e. man-made CO2.  Which still
leaves the cooling between 1940 and 1979 to explain, which we will cover
shortly.

Take this chart from the fourth IPCC report (the blue band
is what the IPCC thinks would have happened without anthropogenic effects, the
pink band is their models' output with man's influence, and the black line is
actual temperatures, greatly smoothed).

Scientists know that "something" caused the pre-1950
warming, and that something probably was natural, but they are not sure exactly
what it was, except perhaps a recovery from the little ice age.  This is
of course really no answer at all, meaning that this is just something we don't
yet know.  Which raises the dilemma: if whatever natural effects were
driving temperatures up until 1950 cannot be explained, then how can anyone say
with confidence that this mystery effect just stops after 1950, conveniently at
the exact same time anthropogenic warming "takes over"?  As you see here,
it is assumed that without anthropogenic effects, the IPCC thinks the world
would have cooled after 1950.  Why?  They can't say.  In fact, I
will show later that this assumption is really just a necessary plug to prevent
their models from overestimating historic warming.  There is good evidence
that the sun has been increasing its output and would have warmed the world,
man or no man, after 1950. 

But for now, I leave you with the question -- if we don't
know what natural forcing caused the early century warming, how can we say
with confidence it stopped after 1950?
  (By the way, for those of you
who already know about global cooling/dimming and aerosols, I will just say for
now that these effects cannot be making the blue line go down because the IPCC
considers these anthropogenic effects, and therefore in the pink band.
For those who have no idea what I am talking about, more in a bit).

Climate scientist Syun-Ichi Akasofu of the International
Arctic Research Center at the University of Alaska Fairbanks makes
a similar point, and highlights the early 20th century
temperature rise:

Again, what drove the Arctic warming up through 1940?
And what confidence do we have that this forcing magically went away and has
nothing to do with recent temperature rises?

Sulfates, Aerosols,
and Dimming

Strong AGW advocates are not content to say that CO2 is one
factor among many driving climate change.  They want to be able to say CO2
is THE factor.  To do so with the historical record over the last 100
years means they need to explain why the world cooled rather than warmed from
1940-1979.

Strong AGW supporters would prefer to forget the global
cooling hysteria in the 1970s.  During that time, the media played up
scientific concerns that the world was actually cooling, potentially
re-entering an ice age, and that crop failures and starvation would
ensue.  (It is interesting that AGW proponents also predict agricultural
disasters due to warming.  I guess this means that we are, by great coincidence,
currently at the exact perfect world temperature for maximizing agricultural
output, since either cooling or warming would hurt production).  But even
if they want to forget the all-too-familiar hysteria, they still need to
explain the cooling.

What AGW supporters need is some kind of climate effect that
served to reduce temperatures starting in 1940 and that went away around
1980.  Such an effect may actually exist.

There is a simple experiment that meteorologists have run
for years in many places around the world.  They take a pan of water of
known volume and surface area and put it outside, and observe how long it takes
for the water to evaporate.  If one correctly adjusts the figures to
reflect changes in temperature and humidity, the resulting evaporation rate
should be related to the amount of solar irradiance reaching the pan.  In
running these experiments, there does seem to be a reduction of solar
irradiance reaching the Earth, perhaps by as much as 4% since 1950.  The
leading hypothesis is that this dimming is from combustion products including
sulfates and particulate matter, though at this point this is more of a
hypothesis than demonstrated cause and effect.  The effect is often called
"global dimming."

The aerosol hypothesis is that sulfate aerosols and black carbon are the
main cause of global dimming, as they tend to act to cool the Earth by
reflecting and scattering sunlight before it reaches the ground.  In
addition, it is hypothesized that these aerosols as well as particulates from
combustion may act to seed cloud formation in a way that makes clouds more
reflective.  The nations of the world are taking on sulfate and
particulate production, and will likely substantially reduce this production
long before CO2 production is reduced (mainly because it is possible with
current technology to burn fossil fuels with greatly reduced sulfate output,
but it is not possible to burn fossil fuels with greatly reduced CO2
output).  If so, we might actually see an upward acceleration in
temperatures if aerosols are really the cause of dimming, since their removal
would allow a sort-of warming catch-up.

Sulfates do seem to be a pretty good fit with the cooling
period, but a couple of things cause the fit to be well short of perfect.
First, according to Stern,
production of these aerosols worldwide (right) did not peak until 1990, at a
level almost 20% higher than in the late 1970s, when the global
cooling phenomenon ended. 

One can also observe that sulfate production has not fallen
that much, due to new contributions from China and India and other developing
nations (interestingly, early drafts of the fourth IPCC report hypothesized
that sulfate production may not have decreased at all from its peak, due to
uncertainties in Asian production).  Even today, sulfate levels have not
fallen much below where they were in the late 1960s, at the height of the
global cooling phenomenon, and are higher than for most of the period from 1940 to 1979
where their production is used to explain the lack of warming.

Further, because they are short-lived, these sulfate dimming effects can really only be
expected to operate in a few isolated areas around land-based industrial centers, limiting their effect on global temperatures,
since they affect only a quarter or so of the globe.  You can see this below, where high sulfate aerosol concentrations, shown in orange and red, cover only a small percentage of the globe.

[Map: global sulfate aerosol concentrations]

Given these areas, for the whole world to be cooled 1 degree C by aerosols and black carbon, the areas in orange and red would have to cool 15 or 20C, which absolutely no one has observed.  In fact, since most of these aerosols are in the northern hemisphere, one would expect that, if this cooling were a big deal, the northern hemisphere would have cooled vs. the southern; but as we will see in a minute, exactly the opposite is true -- the northern hemisphere is heating much faster than the south.  Research
has shown that dimming is three times greater in urban areas close to where the
sulfates are produced (and where most university evaporation experiments are
conducted) than in rural areas, and that in fact when you get out of the
northern latitudes where industrial society dominates, the effect may actually
reverse in the tropics.

There are, though, other potential explanations for
dimming.  For example, dimming may be an effect of global warming
itself.  As I will discuss in the section on feedback processes later,
most well-regulated natural systems have feedback mechanisms that tend to keep
trends in key variables from "running away."  In this case, warming may be
causing cloud formation due to increased evaporation from warmer oceans.

It is also not a done deal that test evaporation from pans
necessarily represents the rate of terrestrial evaporation.  In fact,
research has shown that pan evaporation can decrease because surrounding
evaporation increases, making the pan evaporation more an effect of atmospheric
water budgets and contents than irradiance.

This is a very important area for research, but as with
other areas where promoters of AGW want something to be true, beware what you
hear in the media about the science.  The IPCC's fourth report continues
to say that scientific understanding of many of these dimming issues is
"low."  Note also that global dimming does not "prove" AGW by any means,
it merely makes the temperature-CO2 correlation better in the last half of the
20th century.  All the other issues we have discussed remain.

The Troposphere
Dilemma and Urban heat islands

While global dimming may be causing us to under-estimate the
amount of global warming, other effects may be causing us to over-estimate
it.  One of the mysteries in climate science today has to do with
different rates of warming on the Earth's surface and in the troposphere (the
first 10km or so of atmosphere above the ground).  AGW theory is pretty
clear -- the additional heat that is absorbed by CO2 is added to the
troposphere, so the troposphere should experience the most warming from
greenhouse gasses.  Some but not all of this warming will transfer to the
surface, such that we should expect temperature increases from AGW to be larger
in the troposphere than at the surface.

Well, it turns out that we have two ways to measure
temperature in the troposphere.  For decades, weather balloons have been
sent aloft to take temperature readings at various heights in the
atmosphere.  Since the early 70's, we have also had satellites capable of
mapping temperatures in the troposphere.  From Spencer and Christy, who
have done the hard work stitching the satellite data into a global picture,
comes this chart of satellite-measured temperatures in the troposphere.
The top chart is Global, the middle is the Northern Hemisphere, the bottom is
the Southern Hemisphere

You will probably note a couple of interesting things.
The first is that while the Northern hemisphere has apparently warmed about a
half degree over the last 20 years, the Southern hemisphere has not warmed at
all, at least in the troposphere.  You might assume this is because the
Northern Hemisphere produces most of the man-made CO2, but scientists have
found that there is very good global mixing in the atmosphere, and CO2
concentrations are about the same wherever you measure them.  Part of the
explanation is probably due to the fact that temperatures are more stable in
the Southern hemisphere (since land heats and cools faster than ocean, and
there is much more ocean in the southern half of the globe), but the surface
temperature records do not show such a north-south differential.  At the
end of the day, nothing in AGW adequately explains this phenomenon.  (As
an aside, remember that AGW supporters write off the Medieval Warm Period
because it was merely a local phenomenon in the Northern Hemisphere not observed
in the south -- can't we apply the same logic to the late 20th
century based on this satellite data?)

An even more important problem is that the global
temperature increases shown here in the troposphere over the last several
decades have been lower than on the ground, exactly opposite of the predictions
of AGW theory.

In 2006, David Pratt
put together a combined chart of temperature anomalies, comparing satellite
measurements of the troposphere with ground temperature measurements.  He
found, as shown in the chart below, but as you can see for yourself visually in
the satellite data, that surface warming is substantially higher over the last
25 years than warming of the troposphere.  In fact, the measured anomaly
by satellite (and by balloon, as we will see in a minute) is half or less than
the measured anomaly at the surface.

There are a couple of possible explanations for this
inconsistency.  One, of course, is that there is something other than
CO2-driven AGW that is at least partially driving recent global temperature
increases.  We will cover several such possibilities in a later chapter on
alternative theories.  One theory that probably does not explain
this differential is global dimming.  If anything, global dimming should
work the other way, cooling the ground vs. the troposphere.  Also, since
CO2 works globally but SO2 dims locally, one would expect more cooling effect
in the northern vs. the southern hemisphere, while actually the opposite is
observed.

[Chart: satellite vs. surface temperature anomalies]

Another possible explanation, of course, is that one or the other
of these data sets has a measurement problem.  Take the satellite
data.  The measurement of global temperatures from space is a relatively
new art, and the scientists who compile the data set have been through a number
of iterations to their model for rolling the measurements into a reliable
global temperature (Christy just released version 6).  Changes over the
past years have actually increased some of the satellite measurements (the
difference between ground and surface used to be even greater).  However,
it is unlikely that the quality of satellite measurement is the entire reason
for the difference for the simple reason that troposphere measurement by
radiosonde weather balloons, a much older art, has reached very consistent
findings (if anything, they show even less temperature  increase since
1979).

A more likely explanation than troposphere measurement
problems is a measurement problem in the surface data.  Surface data is
measured at thousands of points, with instruments of varying types managed by
different authorities with varying standards.  For years, temperature
measurements have necessarily been located on land and usually near urban areas
in the northern hemisphere.  We have greatly increased this network over
time, but the changing mix of reporting stations adds its own complexity.

The most serious problem with land temperature data is from
urban heat islands.  Cities tend to heat their environment.  Black
asphalt absorbs heat, concrete covers vegetation, cars and power sources
produce heat.  The net effect is that a city is several degrees hotter
than its surroundings, an effect entirely different from AGW, and this effect
tends to increase over time as the city gets larger.   (Graphic
courtesy of Bruce Hall)

Climate scientists sometimes (GISS -- yes, NOAA -- no)
attempt to correct measurements in urban areas for this effect, but this can be
chancy since the correction factors need to change over time, and no one really
knows exactly how large the factors need to be.   Some argue that the
land-based temperature set is biased too high, and some of the global warming
shown is in fact a result of the UHI effect.   

Anthony Watts
has done some great work surveying the problems with long-term temperature
measurement (some of which was obtained for this paper via Steve McIntyre's Climate Audit blog).
He has been collecting pictures of California measurement sites near his home,
and trying to correlate urban building around the measurement point with past
temperature trends.  More importantly, he has created an online database
at SurfaceStations.org where
these photos are being put online for all researchers to access.


The tennis courts and nearby condos were built in 1980, just
as temperature measurement here began going up.  Here is another, in
Marysville, CA, surrounded by asphalt and right next to where cars park with
hot radiators.  Air conditioners vent hot air right near the thermometer,
and you can see reflective glass and a cell tower that reflect heat on the
unit.  Oh, and the BBQ the firemen here use 3 times a week.


So how much of this warming is
from the addition of air conditioning exhaust, asphalt paving, a nearby
building, and car radiators, and how much is due to CO2?  No one
knows.  The more amazing thing is that AGW supporters haven't even tried
to answer this question for each station, and don't even seem to care. 

As of June 28, 2007, the
SurfaceStations.org documentation effort received a setback when the NOAA, upon
learning of this effort, removed surface station location information from
their web site. The only conclusion is that the NOAA did not want the shameful
condition of some of these sites to be publicized. 

I have seen sites like RealClimate arguing in their myth
busting segments that the global temperature models are based only on rural
measurements.  First, this can't be, because most rural areas did not have
measurement in the early 20th century, and many once-rural areas are
now urban.  Also, this would leave out huge swaths of the northern
hemisphere.  And while scientists do try to do this in the US and Europe
(with questionable success, as evidenced by the pictures above of sites that
are supposedly "rural"), it is a hopeless and impossible task in the rest of
the world.  There just was not any rural temperature measurement in China
in 1910.

Intriguingly, Gavin Schmidt, a lead researcher at NASA's
GISS, wrote
Anthony Watts
that criticism of the quality of these individual temperature
station measurements was irrelevant because GISS climate data does not rely on
individual station data; it relies on grid cell data.  Just as background,
the GISS has divided the world into grid cells, like a matrix (example below).

Unless I am missing something fundamental, this is an
incredibly disingenuous answer.  OK, the GISS data and climate models use
grid cell data, but this grid cell data is derived from ground measurement
stations.  So just because there is a statistical processing step between
"station data" and "grid cell data" does not mean that at its core, all the
climate models don't rely on station data.  All of these issues would be
easier to check of course if NASA's GISS, a publicly funded research
organization, would publicly release the actual temperature data it uses and
the specific details of the algorithms it uses to generate and smooth and
correct grid cell data.  But, like most all of climate science, they
don't.  Because they don't want people poking into it and criticizing
it.  Just incredible.
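The point is easy to see with a toy gridding step.  This is not GISS's actual algorithm -- just a simple average over hypothetical readings, to show that a bias shared by the stations in a cell passes straight through to the cell value:

```python
# Hypothetical station readings, degrees C, all falling in one grid cell
stations = {"A": 14.2, "B": 14.6, "C": 13.9}

def grid_cell_value(readings):
    # toy aggregation: the cell value is just the mean of its stations
    vals = list(readings.values())
    return sum(vals) / len(vals)

clean = grid_cell_value(stations)

# now give every station the same +0.5 C urban-heat bias
biased = grid_cell_value({k: v + 0.5 for k, v in stations.items()})

print(round(biased - clean, 2))  # 0.5 -- the station bias survives gridding
```

However the real aggregation is weighted or smoothed, a bias common to the underlying stations cannot be averaged away at the grid-cell step.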

As a final note, for those who think something as seemingly
simple as consistent temperature measurement is easy, check out this theory
courtesy of Anthony Watts:

It seems that weather stations shelters known as Stevenson Screens (the
white chicken coop like boxes on stilts housing thermometers outdoors) were
originally painted with whitewash, which is a lime based paint, and reflective
of infra-red radiation, but it's no longer available, and newer paints have been
used that [have] much different IR characteristics.

Why is this important? Well, paints that appear
"white" and reflective in visible light have different properties in
infrared. Some paints can even appear nearly "black" and absorb a LOT
of infrared, and thus bias the thermometer. So the repainting of thousands of
Stevenson screens worldwide with paints of uncertain infrared characteristics
was another bias that has crept into the instrumental temperature records.

Following up on this theory, Watts ran an experiment comparing wood
that had been whitewashed with wood painted with modern white latex paint.  The
whitewashed wood was 5 degrees cooler than the wood with modern latex paint.

Using Computer Models to Explain the Past

AGW supporters often argue that because historic warming
closely matches what the current global warming models say
historic temperatures should look like, and because the models are driven by
CO2 forcings, CO2 must be causing the historic temperature increase.
We are going to spend a lot of time with models in the next chapter, but here
are a few thoughts to tide us over on this issue.

The implication here is that scientists carefully crafted
the models based on scientific theory and then ran the models, which nearly
precisely duplicated history.  Wrong.  In fact, when the models were
first built, scientists did exactly this.  And what they got looked
nothing like history.

So they tweaked and tuned, changing a constant here, adding
an effect (like sulfates) there, changing assumptions about natural forcings,
until the models matched history.  The models match history because they
were fiddled with until they matched history.  The models say CO2 caused
warming because they were built on the assumption that CO2 causes
warming.  So, unless one wants to make an incredibly circular argument,
the models are useless in determining how much CO2 affects history.  But
we'll get to a lot more on models in the next chapter.
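The circularity point above can be illustrated with a toy curve-fitting exercise. All the numbers below are made up for illustration; this is not real climate data or a real climate model. The idea is simply that two entirely different "forcings," each given tunable free parameters, can both be fit to reproduce the same historical record, so a good hindcast by itself does not identify the cause.

```python
# Toy illustration of the circularity argument: two different hypothetical
# "forcings" can each be tuned to hindcast the same history, so matching
# history does not prove which forcing drove it.  All data are invented.

obs = [0.00, 0.04, 0.09, 0.15, 0.18, 0.25, 0.29, 0.36, 0.40, 0.47]  # "observed" anomalies
co2 = [300, 305, 311, 318, 322, 330, 335, 343, 348, 356]            # rising CO2 index
sun = [1.0, 1.1, 1.2, 1.35, 1.4, 1.55, 1.6, 1.75, 1.8, 1.95]        # rising solar index

def fit_rmse(forcing, observed):
    """Tune observed = a*forcing + b by least squares; return RMS error."""
    n = len(forcing)
    mx = sum(forcing) / n
    my = sum(observed) / n
    a = sum((x - mx) * (y - my) for x, y in zip(forcing, observed)) / \
        sum((x - mx) ** 2 for x in forcing)
    b = my - a * mx
    resid = [y - (a * x + b) for x, y in zip(forcing, observed)]
    return (sum(r * r for r in resid) / n) ** 0.5

# Both tuned "models" reproduce history almost equally well
print(fit_rmse(co2, obs), fit_rmse(sun, obs))
```

Both fits come out with tiny residual error, because both forcing series trend upward along with the observations. Tuning free parameters until the model matches history tells you the model *can* match history, not that its assumed driver caused it.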

The table of contents for the rest of this paper, A Layman's Guide to Anthropogenic Global Warming (AGW), is here.  A free pdf of this climate skepticism paper is here, and a print version is sold at cost here.

The open comment thread for this paper can be found here. 

A Question for Women's Groups

I don't have any particularly intelligent analysis of the SCOTUS's upholding the constitutionality of a partial birth abortion ban, so I won't offer any.

However, I have a question for women's groups.  Groups like NOW support the federal government's constitutional right to ban breast implants, and in fact call for such a ban on the NOW web site.  Simultaneously, they oppose the federal government's constitutional right to ban partial birth abortions.

My question is:  How can you reconcile these two views?  Aren't these two procedures similar enough (both are elective medical procedures that are invasive of a woman's body) to be Constitutionally identical?  I understand that, from a social conservative's point of view, the abortion procedure might warrant more legal attention if you believe there is a second life (i.e., the fetus) involved.  But how do you justify that the feds should have more power to regulate and ban boob jobs than they have to ban one type of abortion?  And please, don't justify it because you think abortion is serious but breast implants are frivolous.  Those are legislative and political arguments about what should and should not be done with the fed's power, not Constitutional arguments about what that power actually is.

The women's groups' application of their "it's our body" and "pro-choice" positions has always struck me as incredibly selective.  It's a woman's choice to weigh the risks and benefits of an abortion, but apparently it's the government's choice to weigh the risks and benefits of breast implants.  I wrote more about this selective libertarianism when I made a plea for applying the privacy and choice logic of abortion supporters to all aspects of government regulation.  I criticized NOW for another instance of selective libertarianism associated with government and women's bodies when NOW supported having the government limit a woman's choice to use Vioxx to relieve pain.