Saturday, February 27, 2010

Dr. Richard Lindzen's Talk at Fermilab

Richard Lindzen, PhD, the Alfred P. Sloan Professor of Meteorology in the Department of Earth, Atmospheric, and Planetary Sciences at the Massachusetts Institute of Technology, was recently invited to give a talk entitled "The Peculiar Issue of Global Warming" at Fermilab on 2/10/10, which you can watch in its entirety with slides here. Dr. Lindzen calmly eviscerates the theory of catastrophic anthropogenic global warming (CAGW) and the IPCC "consensus". Highly recommended. Some of the key slides from the presentation are archived at the link below. Below are 3 slides from the presentation: the first notes that the theory of intelligent design sounds rigorous by comparison to the theory of anthropogenic global warming; the second notes that 3 pro-CAGW publications have already acknowledged that temperature data has contradicted the man-made attribution assumption (primarily CO2) inherent in the IPCC models; and the third notes that the fundamental CAGW assumption of positive water vapor feedback due to CO2 is "likely wrong".


[Three slides from the presentation]

Archive of some key slides from the talk
Related: Another talk by Dr. Lindzen at Oberlin College and at Rice University

Related: Another comparison of AGW to the theory of intelligent design

Lecture in 9 parts on YouTube:

Richard Lindzen, "The Peculiar Issue of Global Warming," Fermilab Colloquium, 2-10-2010:

1 of 9: http://www.youtube.com/watch?v=gMkyjyk-VEk

2 of 9: http://www.youtube.com/watch?v=-z5AeI2DUgM

3 of 9: http://www.youtube.com/watch?v=QQqfRy1mjfc

4 of 9: http://www.youtube.com/watch?v=7XnvYfgsSHk

5 of 9: http://www.youtube.com/watch?v=UneiApsqb2g

6 of 9: http://www.youtube.com/watch?v=_XOQhm45XJw

7 of 9: http://www.youtube.com/watch?v=qcjiLZdc4Ns

8 of 9: http://www.youtube.com/watch?v=GN3A14apN9A

9 of 9

Sea Levels: Large Variances between Tide Gauges and Satellite Altimetry

Determining changes in global sea levels is an enormously complicated undertaking, with measurement error, calibration error, seasonal adjustments, and regional differences among the most significant problems to overcome. Different types of measurement achieve different results. For instance, sea level expert Dr. Nils-Axel Mörner (see also the post just prior to this one) finds that careful analysis of historical tide gauge records (correcting for subsidence, tectonic shifts, etc.) shows no significant global sea level rise during most of the 20th century. He also finds corroboration of this in the geologic and coral reef records in the field (if sea level doesn't rise, reefs have to grow laterally rather than vertically, etc.).


How about satellite altimetry measurements of sea levels? These have their own unique set of problems, including

  • Large divergences between GPS-corrected tide gauges and satellite altimetry at the same location (see below)

  • Use of two different satellites at different times, and a switch to a back-up altimeter on TOPEX "due to degradation in the original instrument"; the back-up had different electronics, with resultant measurement divergence

  • And factors mentioned in Nerem et al:

"Satellite altimetry is somewhat unique in that many adjustments must be made to the raw range measurements to account for atmospheric delays (ionosphere, troposphere), ocean tides, variations in wave height (which can bias how the altimeter measures sea level), and a variety of other effects. In addition, the sea level measurements can be affected by the method used to process the altimeter waveforms, and by the techniques and data used to compute the orbit of the satellite. Early releases of the satellite Geophysical Data Records (GDRs) often contain errors in the raw measurements, the measurement corrections, and the orbit estimates." Nerem et al also mentions other major problems, such as drift in the TOPEX microwave radiometer and a change from the original TOPEX altimeter to the back-up altimeter in 1999 "due to degradation in the original instrument"; the back-up had "different electronics" from the original, resulting in divergent measurements which had to be "corrected".
What about the first point, that there are large variances between GPS-corrected tide gauges and satellite altimetry at the same location? Here are 2 graphs from the University of Colorado at Boulder Sea Level Change site:


[Graph: TOPEX calibration]

[Graph: Jason-1 & Jason-2 calibration]

From the site:

    The method of producing the tide gauge estimates of altimeter drift that we report here is described in detail by Mitchum (2000), and will not be discussed in full here. Briefly, the method works by creating an altimetric time series at a tide gauge location, and then differencing this time series with the tide gauge sea level time series. In this difference series, ocean signals common to both series largely cancel, leaving a time series that is dominated by the sum of the altimetric drift and the land motion at the tide gauge site. Making separate estimates of the land motion rates and combining the difference series from a large number of gauges globally results in a time series that is dominated by the altimeter drift. Since the difference series at separate tide gauge locations have been shown to be nearly statistically independent (Mitchum, 1998), the final drift series has a variance much smaller than any of the individual series that go into it. Because of the relatively large number of degrees of freedom, this method outperforms calibrations from dedicated calibration sites, although it is only a relative calibration, meaning that it cannot determine any absolute bias. It can, however, detect change in a bias, either a drift or a step change. For the tide gauge calibrations, global mean sea level during the period of TOPEX-A operation is used as zero level for TOPEX-B, POSEIDON, and Jason.
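The drift-estimation method described in the quote can be illustrated with a small numerical sketch. This is entirely synthetic data with made-up noise levels and a made-up drift, purely to show why the differencing and averaging work; it is not the actual Mitchum (2000) processing:

```python
import random
import statistics

random.seed(0)

N_GAUGES = 64    # number of gauges, as in the Colorado calibration
N_MONTHS = 120   # length of the synthetic record
DRIFT = 0.5      # assumed altimeter drift in mm/month (made up)

# Each month, the altimeter and every tide gauge see the same common
# ocean signal; differencing cancels it, leaving drift plus gauge noise.
months = list(range(N_MONTHS))
drift_series = []
for month in months:
    ocean = 20.0 * random.gauss(0, 1)  # common ocean signal (cancels)
    diffs = []
    for _ in range(N_GAUGES):
        altimeter = ocean + DRIFT * month + random.gauss(0, 5)
        gauge = ocean + random.gauss(0, 5)  # independent gauge noise
        diffs.append(altimeter - gauge)
    # Averaging nearly independent gauges shrinks the variance ~1/N
    drift_series.append(statistics.mean(diffs))

# Least-squares slope of the averaged difference series recovers the drift
mbar = statistics.mean(months)
dbar = statistics.mean(drift_series)
num = sum((m - mbar) * (d - dbar) for m, d in zip(months, drift_series))
den = sum((m - mbar) ** 2 for m in months)
slope = num / den
print(round(slope, 2))  # close to the assumed drift of 0.5 mm/month
```

Note that, as the quote says, this recovers only a relative drift (a slope), not any absolute bias: adding a constant offset to every altimeter measurement would leave `slope` unchanged.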



    Ideally, one would want to include all of the available tide gauges in the calibration. A number of gauges, however, have a significant lag in reporting of records and are not available for the Jason calibration. On the other hand, some others do not extend backward through most of the T/P mission. We have restricted the ~100 available gauges to a set of 64 near real-time stations that span the majority of both the T/P and Jason missions, and will therefore provide a relatively consistent calibration for both.


As stated above, "Since the difference series at separate tide gauge locations have been shown to be nearly statistically independent (Mitchum, 1998), the final drift series has a variance much smaller than any of the individual series that go into it. Because of the relatively large number of degrees of freedom, this method outperforms calibrations from dedicated calibration sites, although it is only a relative calibration, meaning that it cannot determine any absolute bias." Comparing the individual GPS-corrected tide gauges in the two graphs above with the satellite altimetric measurement at the corresponding location shows very large divergences, up to 25 mm at a given point in time. Yes, if you sum all the anomalies from the carefully selected subset of tide gauges compared to the satellite records, the result is statistically indistinguishable from zero, but the large variances in individual records suggest much more doubt about the accuracy of satellite altimetry and/or GPS-corrected tide gauges than is commonly held.



Considering the bias the blogosphere has identified in adjustments to the global thermometer records, similar biases might be present in the many adjustments made to the satellite altimetry data. Although these adjustments have been broadly described in Nerem et al, the raw satellite data and documentation of all the adjustments made are not available for independent analysis. Since we have seen inappropriate adjustments to the thermometer data time and again, the raw satellite altimetry data, along with full documentation of the adjustments, should be made publicly available for independent assessment and verification. It would also be useful to provide a list of the 64 tide gauges (from the 100+ available) which were selected for calibration of the satellite data, as well as the raw data and any adjustments made to those tide gauge measurements.




Related:


IPCC AR4 page on errors associated with satellite altimetry & tide gauges (note each pass of the TOPEX satellite only measures sea height to an accuracy of 80 mm at a 95% confidence level)

Background paper

Poster showing unexplained tide gauge/satellite altimetry measurement divergences

Church & White paper

Poster on the oldest tide gauges of the Southern Ocean

Basic Geology Part 3: Sea Level Rises during Interglacial Periods

Sea Level Expert: "80% of us disagree with the IPCC"

according to an interview recorded 5 days ago with Sea Level Expert Dr. Nils-Axel Mörner. Dr. Mörner is the recently-retired head of the Paleogeophysics and Geodynamics department at Stockholm University in Sweden. He is past president (1999-2003) of the INQUA Commission on Sea Level Changes and Coastal Evolution and leader of the Maldives Sea Level Project. Dr. Mörner has been studying sea level and its effects on coastal areas for some 38 years.


In the interview, Dr. Morner states that none of the 33 authors of the 2001 IPCC Chapter on sea levels was considered to be a sea level expert, that all 33 were from other disciplines and selected "due to loyalty" to the IPCC. Furthermore, Dr. Morner estimates that of the 300-400 individual scientists "in the sea-level [scientific] community", 80% of sea-level experts disagree with the IPCC conclusions regarding sea level rise.


I have emailed Dr. Mörner for his response to the critique by Nerem et al and for any comments on the post "Global Sea Level Decrease 2004-2010 Part 2", and will post his reply here.



Graphs from a chapter written by Dr. Mörner in the Encyclopedia of Coastal Science (Maurice L. Schwartz, editor), showing TOPEX satellite altimetry data with no long-term trend prior to the slew of adjustments broadly outlined by Nerem et al:



John Daly also has a graph of the same period of TOPEX satellite data, which shows a linear trend of 0.9 mm/yr.



At some point in time, this data was greatly adjusted to show a rise of 3.2 mm/yr, an increase of over three-fold, which is the only data now available from the University of Colorado sea level site. There is no publicly available record of how or why these large changes were applied to the original satellite data. Inquiring minds want to know.

Friday, February 26, 2010

Push to Oversimplify at Climate Panel

Article showing that the IPCC paleoclimate reconstructions are oversimplified and overstated, as also shown in prior Hockey Schtick posts here and here.



From the Front Page of THE WALL STREET JOURNAL today 02/26/2010


The group expressed 'regret' last month for an erroneous projection in its influential 2007 climate report that the Himalayan glaciers could melt by 2035.


In the next few days, the world's leading authority on global warming plans to roll out a strategy to tackle a tough problem: restoring its own bruised reputation.



A months-long crisis at the Intergovernmental Panel on Climate Change has upended the world's perception of global warming, after hacked emails and other disclosures revealed deep divisions among scientists working with the United Nations-sponsored group. That has raised questions about the panel's objectivity in assessing one of today's most hotly debated scientific fields.



The problem stems from the IPCC's thorny mission: Take sophisticated and sometimes inconclusive science, and boil it down to usable advice for lawmakers. To meet that goal, scientists working with the IPCC say they sometimes faced institutional bias toward oversimplification, a Wall Street Journal examination shows. Read more at article link above.



Michael Mann is angry about this article

Thursday, February 25, 2010

Baby steps to a Mea Culpa?



[Trenberth sort-of goes public with his climategate email in graph at the header]



Scientists examine causes for lull in warming 

 
* Exact causes unknown for lack of warming from 1999-2008 
* The underlying reason for cold winter not known 
* Climate science in focus after email scandal, errors 


By Gerard Wynn and Alister Doyle

LONDON/OSLO, Feb 25 (Reuters) - Climate scientists must do more to work out how exceptionally cold winters or a dip in world temperatures fit their theories of global warming, if they are to persuade an increasingly sceptical public.

At stake is public belief that greenhouse gas emissions are warming the planet, and political momentum to act as governments struggle to agree a climate treaty which could direct trillions of dollars into renewable energy, away from fossil fuels.

Public conviction of global warming's risks may have been undermined by an error in a U.N. panel report exaggerating the pace of melt of Himalayan glaciers and by the disclosure of hacked emails revealing scientists sniping at sceptics, who leapt on these as evidence of data fixing.

Scientists said they must explain better how a freezing winter this year in parts of the northern hemisphere and a break in a rising trend in global temperatures since 1998 can happen when heat-trapping gases are pouring into the atmosphere.

"There is a lack of consensus," said Kevin Trenberth, head of the Climate Analysis Section at the U.S. National Center for Atmospheric Research, on why global temperatures have not matched a peak set in 1998, or in 2005 according to one U.S. analysis.

Part of the explanation could be a failure to account for rapid warming in parts of the Arctic, where sea ice had melted, and where there were fewer monitoring stations, he said.

"I think we need better analysis of what's going on on a routine basis so that everyone, politicians and the general public, are informed about our current understanding of what is happening, more statements in a much quicker fashion instead of waiting for another six years for the next IPCC report."

The latest, fourth Intergovernmental Panel on Climate Change (IPCC) report was published in 2007 and the next is due in 2014.
The proportion of British adults who had no doubt climate change was happening had dropped in January to 31 percent from 44 percent in January 2009, an Ipsos MORI poll showed this week.



The Reference Frame analysis of this news item
HOTTEST DECADE ON RECORD

The decade 2000-2009 was the hottest since 1850 as a result of warming through the 1980s and 1990s which has since peaked, says the World Meteorological Organisation.

British Hadley Centre scientists said last year that there was no warming from 1999-2008, after allowing for extreme, natural weather patterns. Temperatures should have risen by a widely estimated 0.2 degrees Centigrade, given a build up of manmade greenhouse gases.

"Solar might be one part of it," said the Hadley's Jeff Knight, adding that changes in the way data was gathered could be a factor, as well as shifts in the heat stored by oceans. The sun goes through phases in activity, and since 2001 has been in a downturn meaning it may have heated the earth a little less, scientists say.

"We've not put our finger precisely on what has changed," Knight said. "(But) If you add all these things together ... there's nothing really there to challenge the idea that there's going to be large warming in the 21st century." [I titled this post beginning to the Mea Culpa?]

Melting Arctic ice was evidence for continuing change, regardless of observed temperatures, said Stein Sandven, head of the Nansen Environmental and Remote Sensing Center in Norway. "The long-term change for the Arctic sea ice has been very consistent. It shows a decline over these (past) three decades especially in the summer. In the past 3-4 years Arctic sea ice has been below the average for the last 30 years."

Rajendra Pachauri, chair of the IPCC, told Reuters that the IPCC stood by its 2007 findings that it is more than 90 percent certain that human activities are the main cause of global warming in the past 50 years. "I think the findings are overall very robust. We've made one stupid error on the Himalayan glaciers. I think that there is otherwise so much solid science." The IPCC wrongly predicted that Himalayan glaciers could vanish by 2035.



NATURAL CAUSES?

One long-running doubter of the threat of climate change, Richard Lindzen, meteorologist at the Massachusetts Institute of Technology, said a lull in warming was unsurprising, given an earlier "obsessing about tenths of a degree" in the 1980s and early 1990s. The world warmed 0.7-0.8 degrees Celsius over the last century.

Lindzen expected analysis to show in a few years' time that recent warming had natural causes. "It just fluctuates. I think the best explanation is the ocean. The timescale for ocean circulations can be decades." He dismissed recent ice melt over a short, 30-year record.

Pachauri said that scientists had to unpick manmade global warming from natural influences -- such as the sun and cyclical weather patterns -- also dubbed "natural variability".

"Natural variability is not magic, there is movement of energy around the climate system and we should be able to track it," said Trenberth.

Trenberth attributed the cold winter to an extraordinary weather pattern not seen since 1977 which had curbed prevailing westerly winds across the northern hemisphere, and said that the underlying cause was "one we don't have answers to."

Wednesday, February 24, 2010

Another Analysis Confirms Greenhouse Effect of CO2 already Saturated

An unpublished paper, well worth a read, is available today at climaterealists.com from mechanical engineer and heat transfer expert Dan Pangburn:


"[This paper] is a comprehensive discussion of the science relating to the Global Warming issue and includes a fairly simple model (on page 15) that accurately predicts all average global temperatures since 1895, including the recent decline.



I observed the many conflicting assertions regarding the existence and cause of Global Warming, particularly as to whether it was significantly contributed to by human activity.

This led to substantial curiosity as to the truth. As a result I have conducted research on the issue for thousands of hours for over three years and have determined that the belief that human activity has had a significant influence on global climate is a mistake.



Greenhouse Analogy



This may be how the mistake began. Incorrect conclusions may have been drawn from various observations and discoveries. Some of the discoveries and developments are..."


One of the conclusions pertinent to recent discussions and posts here is the corroboration using different methods that:
"at the present CO2 level, atmospheric carbon dioxide increase has no significant influence on [average global temperatures]"

Tuesday, February 23, 2010

Your Own IPCC Climate Computer

Hey kids, now you too can do your own climate modeling using the IPCC's own complex & sophisticated climate computer (97% of it). Just enter the starting and ending CO2 levels in parts per million, and it will calculate how much global warming will occur! Here


Warning: do not use without parental supervision. The climate computer uses only 97% of the IPCC model of total positive radiative forcing, since only 97% of it comes from CO2: temperature anomaly = 4.7*ln(ending CO2/starting CO2). The model only applies to the 20th and 21st centuries, since temperature anomalies bear no relation to the Medieval Warm Period (with much lower CO2 levels), nor to the geologic record (with CO2 levels in the mid-thousands of parts per million throughout ice ages). The "official" temperature anomaly calculated by this computer is not guaranteed to match 97% of the IPCC model, nor reality.
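For readers who want to run the arithmetic themselves, here is a minimal sketch of the post's formula in Python (the function name is mine; the 4.7 coefficient and the logarithmic form are the ones stated above):

```python
from math import log

def climate_computer(start_ppm, end_ppm):
    """The post's formula: temperature anomaly (C) = 4.7 * ln(end/start)."""
    return 4.7 * log(end_ppm / start_ppm)

# Example: a doubling of CO2 (e.g. 280 -> 560 ppm) gives 4.7 * ln(2)
print(round(climate_computer(280, 560), 2))  # 3.26
```

Because the formula is logarithmic, each successive increment of CO2 yields less warming than the last, which is the "saturation" point made in the post above.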

Great Reply to a Fan of AGW

From The Viscount Monckton of Brenchley
  • I try to answer as many enquiries as I can from people who want to discuss “global warming”. I wrote this letter in reply to a “global warming” fanatic who, it is not unfair to say, had never actually thought about the superstition to which he subscribes. Perhaps this letter will make him think a little more and believe a little less.

Dear Enquirer, – Thank you for taking the trouble to write to me. If I may, I shall highlight various passages from your letter in bold face, and then respond to them seriatim in Roman face.


“I am not a climate scientist, and so I can only go by the overwhelming consensus amongst scientists that man-made climate change is occurring and that it poses a grave threat to humanity.”


First, science is not – repeat not – done by consensus. Aristotle, in codifying the dozen worst fallacies to which mankind is prone, described this one as the “head-count fallacy”, or, as the mediaeval schoolmen called it, the argumentum ad populum. Merely because many people say they believe a thing to be true, they do not necessarily believe it to be true and, even if they do, it need not necessarily be true. Abu Ali Ibn al-Haytham, the astronomer, mathematician and philosopher of science in 11th-century Iraq who is credited as the father of the scientific method, said this –
“The seeker after truth does not put his faith in any mere consensus, however broad and however venerable. Instead, he subjects what he has learned from it to scrutiny using his hard-won scientific knowledge, and he verifies for himself whether it is true. The road to the truth is long and hard, but that is the road we must follow.”



More recently, T.H. Huxley, in the famous debate in which he defeated Bishop Soapy Sam Wilberforce in Oxford on the question of evolution, put it this way –
“The improver of natural knowledge absolutely refuses to acknowledge authority, as such. For him, scepticism is the very highest of duties: blind faith the one unpardonable sin.”
Secondly, the “consensus” you speak of does not in fact exist. Schulte (2008) reported that, of 539 scientific papers dated January 2004-February 2007 that contained the search phrase “global climate change”, not one provided any evidence that any anthropogenic influence on any part of the climate would prove in any degree catastrophic. That, if you do science by consensus, is the consensus.
Thirdly, during the pre-Cambrian era CO2 concentration was 300,000 parts per million, or 30% of the atmosphere, 773 times the 388 parts per million (<0.04%) in today’s atmosphere. Yet at that time glaciers came and went, twice, at the Equator and at sea level. The appearance of glaciers in this way could not have happened if CO2 had the exaggerated warming effect, derived by modelling rather than by measurement, that the IPCC imagines.
Fourthly, although the IPCC says more than half the 0.5 C° “global warming” since 1950 was manmade, in fact four-fifths of it, or 0.4 C°, is known by measurement to have been caused by a naturally-occurring reduction in cloud cover from 1983-2001 (Pinker et al., 2005).
Since the warming actually observed from 1983-2001 was also 0.4 C°, during that period of more than 18 years there appears to have been no contribution whatsoever from CO2 or from any other greenhouse gas, even though the IPCC’s central estimate is that the increase from 342 to 370 ppmv CO2 over the period ought to have caused a warming of 4.7 ln(370/342) = 0.4 C°. So there should have been a warming of 0.8 C° over the period, but there was not.
From observations such as these, it is not possible to place any faith in the IPCC’s values for climate sensitivity, which have always been highly speculative. In fact, global temperature since 1980 has been rising at half the rate originally predicted by the IPCC in its First Assessment Report in 1990, and, since 1995, has not risen in any statistically-significant sense at all.
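The arithmetic two paragraphs above can be checked directly. This sketch uses the letter's own figures (the 4.7 coefficient, the 342 to 370 ppmv CO2 rise, and the claimed 0.4 C° cloud-cover contribution), not independent estimates:

```python
from math import log

# IPCC-implied CO2 warming over 1983-2001, per the letter's formula
co2_warming = 4.7 * log(370 / 342)
print(round(co2_warming, 2))        # ~0.37, the "0.4 C" in the text

# Add the claimed 0.4 C cloud-cover effect to get the expected total
total_expected = co2_warming + 0.4
print(round(total_expected, 1))     # ~0.8, vs. the 0.4 C actually observed
```

This reproduces the letter's claim that CO2 plus cloud cover should have produced roughly twice the warming actually observed over the period.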


“Certainly if you told me that there was a mere 1 in 100 chance that the plane my daughter was about to take would crash, I would not let her take it, because the potential downside would be so disastrous.”



Here, with respect, you have misunderstood what is misleadingly called the “precautionary principle”, though it is not a principle at all. For it is essential – but all too often forgotten – to ensure that the precautions you adopt do not do more harm than whatever it is you are taking precautions against. To take your example, if your daughter was on a volcanic island that was erupting, and if the scientists told you that if she remained on the island she would certainly die, and that the only way to escape was on an aircraft with a 1:100 chance of crashing, you would surely let her take the plane.
Mutatis mutandis, one should look at the precautions that are being taken in the name of Saving The Planet from the non-problem that is “global warming”. The biofuel scam, for instance, has taken so much farmland out of growing food for people that need it that world food prices doubled in just two years, causing mass starvation, food riots and death in a dozen major regions of the world. These food riots went almost entirely unreported in the mainstream news media. In Haiti, long before the recent tragic earthquake, the poor had been reduced to eating mud pies made with real mud, and then the doubling of world food prices meant they could not even afford the mud pies. So they died, as millions of others pointlessly died, because self-indulgent environmentalists in the West were not willing to do the science more carefully, and were not willing to apply the precautionary principle to the precautions that they had themselves so enthusiastically but foolishly and cruelly advocated.


“I am writing to find out why the proposed solutions to the problem of climate change make you so angry.”


These “solutions” that you speak of, which would make not the slightest difference to the “problem” of “global warming” even if there were one, are killing my fellow-citizens of this planet by the million. Of course I am angry.


“Will the production of oil and gas not surely peak and decline, if not imminently then in 20 years’ time?”


A third of a century ago, the Club of Rome decided that by now there would be no oil or gas reserves left anywhere in the world. Yet, despite massive increases in consumption of oil and gas worldwide, proven reserves today are larger than they were 30 years ago. As the reserves become scarcer and more expensive to extract, the price will rise. As the price rises, so alternative sources of energy will become less unviable economically. There is no role for governments in trying to pick future winners in energy supply: the free market will do it better and more economically, and without the need for over-taxation or over-regulation on the part of the State.


“Are the prices of fossil fuels and in fact nearly all commodities not subject to increasing volatility year by year?”


No. In the 1970s the volatility in the oil price was many times greater than it is today. And any attempt to ban the use of fossil fuels would merely add greatly and unnecessarily to the price of all commodities. Yet it would make not the slightest difference to the climate, even if you believe the IPCC’s sevenfold exaggeration of the true (and negligible) warming effect of CO2.

Shutting down the entire carbon economy of the world, which would amount to much the same thing as shutting down the entire world economy, with an unimaginable increase in the number of deaths already being caused worldwide by misguided policies in this field, would forestall just 1 C° of “global warming” every 41 years. The mathematics are not difficult: it takes roughly 15 billion tons of CO2 emissions to raise the atmospheric concentration by one part per million by volume, and we emit 30 billion tons a year at present worldwide, equivalent to an increase of 2 ppmv per year in atmospheric CO2 concentration. Shutting down the economy would stabilize CO2 concentration at today’s levels, preventing this 2 ppmv/year rise. So, given the 388 ppmv CO2 in the atmosphere at present, each year without any CO2 emissions at all would forestall 4.7 ln(390/388) C° of warming, or 1/41 C°.

Now you will begin to see what I mean by counting the cost of the precautions. Shutting down the entire world’s carbon economy would kill billions, and cost trillions, and yet our instruments would hardly be able to measure the difference it made to the climate. And this, you will recall, is on the basis that the IPCC has not exaggerated the warming effect of CO2 as prodigiously as numerous peer-reviewed papers have demonstrated that it has.
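The "mathematics are not difficult" claim above can likewise be verified with the letter's own numbers (15 Gt of CO2 per ppmv, 30 Gt emitted per year, 388 ppmv today, and the 4.7 coefficient; none of these figures are mine):

```python
from math import log

EMISSIONS_GT_PER_YEAR = 30.0  # worldwide CO2 emissions, Gt/year (letter's figure)
GT_PER_PPMV = 15.0            # Gt of CO2 per ppmv of atmospheric concentration
CURRENT_PPMV = 388.0          # atmospheric CO2 at the time of writing

# Annual concentration rise implied by the emissions figures
rise_per_year = EMISSIONS_GT_PER_YEAR / GT_PER_PPMV
print(rise_per_year)  # 2.0 ppmv/year

# Warming forestalled by one year of zero emissions, per the letter's formula
forestalled = 4.7 * log((CURRENT_PPMV + rise_per_year) / CURRENT_PPMV)
print(round(1 / forestalled))  # ~41 years per 1 C of forestalled warming
```

The result matches the letter's "1 C° every 41 years": one year with no emissions at all forestalls only about 0.024 C° under these assumptions.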


“Are western trade deficits not reaching crippling levels, at least in part because of our addiction to and utterly profligate use of imported fossil fuels?”


No. Trade deficits tend to be a feature of Socialist administrations. During the Thatcher era, for instance, UK trade ran for much of the time at a surplus, particularly when errors in the official figures were accounted for (one could demonstrate these errors by adding up all of the published trade deficits and surpluses in the world, whereupon it appeared that there was a massive trade deficit with outer space). It is the right of individuals to decide whether they want to pay the price of imported oil, and they do. I bet you have at least one car, for instance. If you give it up, like me, and use a motor-cycle, you will do much to reduce both the trade deficit and the congestion on the roads.


“Are those imports not vastly enriching some of the worst regimes on the planet?” 


Yes. So give up your car. Give up electricity. Your letter was written electronically, for it has justified type, which a manual typewriter cannot achieve. Go back to the Stone Age, but remember not to light a carbon-emitting fire in your cave. Even if we all went back to the Stone Age (and that would involve the deaths of billions of our fellow-citizens), remember that we should forestall no more than 1 C° of “global warming” in 41 years – or, if I am right in suspecting that the IPCC has exaggerated CO2’s warming effect sevenfold, in almost 300 years.


“I cannot see how becoming much more efficient in our use of energy and other resources can be anything but a good thing.”


That, of course, is already happening, and by leaps and bounds. As energy prices rise, people learn to use energy more efficiently, to insulate their houses, to use their cars less often or use motor-bikes instead, and so forth. But, once again, there is no role for government here. The overhead cost of any government-driven activity doubles not only the cost of that activity but also the resource consumption associated with that activity. Milton Friedman won the Nobel Prize for Economics for explaining exactly why that multiple is so.
The fastest way to reduce resource consumption, including energy consumption, is to make government smaller. Yet I have never heard any environmentalist advocate that, for – as my good friend Eric Ellington, a founder of Greenpeace who died this week, told me on many occasions – Greenpeace and many other environmental groups founded by people who, like Eric, genuinely cared for the environment were rapidly taken over by totalitarians with not the slightest genuine interest in the environment, because they saw the environmental movement as a Trojan Horse that could be used to collapse the economies of the West from within.
I am against that Marxist/eco-Fascistic entryism, not least because the totalitarian countries have done far more damage to the environment than the nations of the free West. By all means use energy more efficiently, but don’t involve government in the process, or any benefits will be very heavily outweighed by the overhead cost of the extra bureaucrats and their lavishly-heated offices. “Keep it simple, stupid,” as the saying goes, and let the free market do what it does best – allocate resources in a far cheaper and more efficient manner than any totalitarian, however pious his intention.


“Similarly the development of alternatives to fossil fuels, such as anaerobic digesters that use food and farm waste to generate heat and electricity, or solar thermal panels on our rooftops to heat our bathwater.”


Anaerobic digesters are all very well on country estates and in large town houses with their own land, but they are impractical in most cities, and their cost often outweighs any environmental benefits. As for solar panels, the CO2 emitted in their manufacture easily exceeds any CO2 that will be saved in their relatively short lifetime. Maybe that will change one day, but for now that is the case, and has long been so. CO2 emissions don’t worry me, but, if they worry you, then forget solar panels.


“Is the decline in manufacturing in the West not a trend that can be reversed by the rise of a huge industrial-efficiency, clean-technology and renewable-energy industry?”


No. The decline of manufacturing in the West is chiefly attributable to the excessive cost of government and the intrusiveness of regulation. The last steelworks on Teesside has just closed because we have a no-doubt-piously-intended EU emissions-trading scheme which prices UK-made steel right out of world markets. If and when there is a market for the types of industry you mention, even then the cost of manufactures in those markets will almost certainly be cheaper almost anywhere else in the world than in the benighted, overtaxed, over-governed, over-regulated, over-pious EU. It is no accident that the last major manufacturer of wind-turbine blades in Britain has recently closed, precisely because the EU is no longer open for manufacturing business – thanks to carbon trading, which is a rigged market that favours those who rig it (governments and absolute bankers) at the expense of everyone and everything else, including the jobs of our workers.
“What about the aim of slowing and ultimately halting the clearance of the tropical rainforests on which surely all life depends? Is this not worth a global pact now, whether we believe in man-made climate change or not?”


A shame that you overstate your case here, because I agree that deforestation is undesirable and that reafforestation should be encouraged. I also agree this is one area where government, and even inter-governmental co-operation, has a role. But it is silly to say that all life depends on the tropical rain-forests. It doesn’t, though we should all be poorer for their passing.


“If I were to discover man-made climate change to be nothing more than an elaborate hoax created by the world’s scientific establishment, I would not reveal it, because all of the changes that its existence will bring about are changes which we absolutely must be making right away, irrespective of the climate.”


Since one of the “changes” that the climate scam has caused is the deaths of millions by starvation, I cannot and will not agree. Also, the lies and exaggerations peddled with such enthusiasm by the environmentalist movement will scarcely win it any friends when – as is already happening with great rapidity – the world wakes up to the fact that it has been lied to. In the end, it is always better to adhere absolutely to the truth, because no one heeds a proven liar, as the environmentalist movement will rapidly learn to its cost.
It is also questionable whether the “changes” you mention would be a good idea in their own right. There is a direct correlation between CO2 emissions per head and life expectancy, and a direct anti-correlation between CO2 emissions per head and child mortality, for instance. Also, CO2 is plant food: if we were able to double its current concentration, the yield of many staple food crops would increase by up to 40%. So curbing CO2 emissions, the principal objective of the environmentalist movement at present, would do great harm. In particular, it would prevent the cheapest and surest way of lifting people out of poverty: the burning of fossil fuels to generate electricity. Populations that become free of poverty stabilize themselves, while countries compelled to remain poor continue to suffer an excessively rapid birth-rate. The perverse but inevitable effect of any success that the environmentalist movement may have in trying to limit emissions of CO2 will be to increase world poverty, to increase world population, and ultimately to increase the world’s carbon footprint. I don’t mind about the carbon footprint, which would be harmless and beneficial, but increasing the world’s population by forcibly keeping it poor is surely cruel madness.


“I have yet to see you touch on the now virtually undisputed fact of ocean acidification by rising levels of carbon dioxide in the atmosphere, the consequences of which are predicted to be catastrophic.”


Ah, ocean “acidification”, the fall-back position of those who have realized that there has been no statistically-significant “global warming” for 15 years, and a rapid global cooling trend for nine of those 15 years. Once again, you overstate your case by talking of the “virtually undisputed fact” of ocean “acidification”. In fact, as you will find from the little book on the subject by my distinguished colleague Dr. Craig Idso of www.CO2science.org, the overwhelming opinion of scientists in the literature is that ocean “acidification” is a chemical impossibility.
So let us end this letter with a little science. First, with this as with all scientific subjects, we need a sense of proportion. So let us get a few things clear straight away. The acid-base balance of water is measured on a logarithmic scale, the pH, which is the negative logarithm of the concentration of hydrogen ions in the water. The pH of seawater is 7.9-8.2; the pH of neutral or pure water is 7.0; and the pH of rainwater is 5.4. Any value greater than 7.0 is alkaline; any value less than 7.0 is acid. So seawater is pronouncedly alkaline, and rainwater is pronouncedly acid.
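The logarithmic-scale point is easy to check numerically. A minimal sketch (the pH values are those quoted above; taking 8.1 as the mid-range for seawater is my choice):

```python
def h_ion_conc(pH):
    """Hydrogen-ion concentration (mol/L) from pH: [H+] = 10 ** -pH."""
    return 10 ** -pH

seawater = h_ion_conc(8.1)   # mid-range of the quoted 7.9-8.2
neutral  = h_ion_conc(7.0)   # pure water
rain     = h_ion_conc(5.4)   # rainwater

# Each whole pH unit is a tenfold change in hydrogen-ion concentration:
print(round(neutral / seawater, 1))   # 12.6: seawater has ~13x fewer H+ ions than pure water
print(round(rain / neutral, 1))       # 39.8: rainwater has ~40x more H+ ions than pure water
```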
Next, we need to gain some idea of the amount of CO2 in seawater. Seawater at the surface is 1100 times denser than the atmosphere, and it contains 70 times as much CO2 as the atmosphere. From this it is easy to see that the amount of CO2 in seawater is very, very small, for it is minuscule in the atmosphere. Now, suppose that we were to double the partial pressure of CO2 in the atmosphere. In accordance with Henry’s Law, about 30% of the extra CO2 in the atmosphere would end up in the oceans. Accordingly, the amount of CO2 in the oceans would increase by 1/233, or less than half of one per cent. Is that enough to “acidify” the oceans? Obviously not. At most, it would move the pH by an immeasurably small amount towards neutralization.
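The 1/233 figure follows directly from the two numbers just quoted. A quick sketch of the arithmetic, in units of "one atmosphere's worth of CO2":

```python
# The letter's back-of-envelope: the oceans hold ~70 atmospheres' worth of
# CO2, and doubling atmospheric CO2 sends ~30% of the extra into the sea.
ocean_inventory = 70.0   # ocean CO2, in units of one atmosphere's CO2
extra_absorbed  = 0.30   # fraction of one doubled atmosphere taken up

fractional_increase = extra_absorbed / ocean_inventory
print(round(fractional_increase, 5))      # 0.00429, i.e. under half of one per cent
print(round(1 / fractional_increase))     # 233, the "1/233" in the text
```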
In fact, however, it would not even do that. Why? Because CO2 is only the seventh-most-prevalent of the substances dissolved in the oceans that can alter their acid-base balance. And CO2 has a special role as the buffer that preserves homoeostasis in the acid-base balance of the oceans. In the pre-Cambrian era, for instance, the CO2 in the oceans reacted with the superabundant calcium and magnesium ions in seawater to precipitate out dolomitic rock. As the partial pressure of CO2 fell, the CO2 reacted with calcium ions to form limestone or chalk (CaCO3), which contains 44% CO2.
In the Cambrian era, when CO2 concentration was 7500 parts per million, or around 18 times today’s, the first calcite corals achieved symbiosis with algae, then the only plant life on Earth. In the Jurassic era, when CO2 concentration was around 6500 parts per million, the first aragonite corals came into existence. For most of the past 750 million years, CO2 concentration in the oceans has been at least 1000 parts per million, compared with <400 today. Why were the oceans never acid throughout this time, notwithstanding the high partial pressures of CO2? Because the oceans run over rocks, and rocks are pronouncedly alkaline. As long as there are rocks beneath and around the oceans, the oceans cannot and will not acidify.
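The "44% CO2" figure for limestone can be verified from standard atomic masses; a quick check:

```python
# Mass fraction of CO2 locked in calcium carbonate (CaCO3 -> CaO + CO2).
C, O, Ca = 12.011, 15.999, 40.078    # standard atomic masses, g/mol

m_co2   = C + 2 * O                  # molar mass of CO2, ~44.01
m_caco3 = Ca + C + 3 * O             # molar mass of CaCO3, ~100.09

print(round(m_co2 / m_caco3, 2))     # 0.44, the "44% CO2" in the text
```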
One final point about so-called “acidification”. Believe it or not, calcium carbonate shells and corals dissolve 15 times more readily in the strongly-alkaline seawater of today than they would in neutral water of pH 7.0. If it were possible to neutralize the oceans somewhat, shells would be less at risk of dissolving, not more at risk. I do not know how much chemistry you have, but I can show you the chemical equations underlying this topic, if you like. There was not and is not any sound scientific basis for believing that a little extra CO2 in the atmosphere will have any appreciable effect on the oceans.
Let me conclude by observing that it is not appropriate to try to politicize science. The environmentalist movement, by seeking to push the science beyond reason or reality in its attempt to frighten schoolchildren and the feeble-minded, will find that as the truth emerges the world will turn its back on those who have baselessly cried “Wolf!” many times too often. It will then be seen that nothing has done more harm to the cause of true environmental concern than the attempted capture of the environmental movement by people who care nothing for the environment and everything for the narrow, poisonous, politicized faction with which they unwisely identify themselves. Great is truth, as the Book of Life says, and mighty above all things.
Yours sincerely,
VISCOUNT MONCKTON OF BRENCHLEY

Nature Rules Climate: 2009 Paper & 8 Other Reasons

It would be nice if the climatologists would talk to the geologists and geophysicists more often, although sadly the emails show little if any attempt to do so. They might learn a lot about the real physical evidence on what has been changing climate since the beginning of time. Instead, they presume that because CO2 rose in the 20th century from 0.0300% to 0.0388% of the atmosphere [note man-made contributions are only 3-4% of that...so man-made CO2 rose in the 20th century from 0.0012% to 0.001552% of the atmosphere], it must be the "missing link" needed in the models to explain climate change, and then arbitrarily assign the CO2 "missing link" 97% of the total positive radiative forcing in the computer models. Never mind that:





  1. the greenhouse theory of positive feedback of radiative forcing of water vapor due to CO2 violates the 2nd law of thermodynamics

  2. the tropical tropospheric hot spot predicted by this theory never developed

  3. the water vapor positive feedback of radiative forcing due to CO2 has been shown to be incorrect and actually negative based on the actual data

  4. the models are vastly off track with the satellite data, which shows sensitivity to CO2 far less than predicted [Lindzen's new pre-publication paper shows a sensitivity of 0.3-1.2 degrees C for a doubling of CO2 concentration (which will take 234 years at the current rate), much less than was assumed, & here]

  5. the actual increase in CO2 in the 20th century is highly in doubt and may be much less (and even if it is correct, the time to double CO2 concentrations at the current rate is 234 years).

  6. in 5 of the 6 major ice ages CO2 levels were higher than the present, up to 20+ times higher, yet did not warm the planet. Also note, CO2 LAGS temperature in ice core data by ~800 years. CO2 lags temperature changes primarily due to solubility in the oceans.

  7. probable explanation of #6 is that the greenhouse effect of CO2 is already effectively saturated at the present levels. (and here)

  8. according to more than 800 scientists' papers, the Medieval Warming Period was globally as hot or hotter than the present, entirely due to natural processes - i.e. why should we presume this time it's any different?

  9. their models of the earth's energy balance don't take into account ocean oscillations, or the fact that the oceans hold 98% of the earth's heat (there's more heat in the top 2.5m of the ocean than in the entire atmosphere, and the oceans cover ~70% of the earth's surface).
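Two bits of arithmetic from the passage above can be reproduced in a few lines. Note the assumptions: the 3-4% man-made fraction is the author's figure, and the annual growth rate below is backed out by me so that the "234 years to double" number is consistent; the text does not state the rate it actually used.

```python
# CO2 as a share of the atmosphere over the 20th century, and the slice of
# it the author attributes to human emissions (upper end of his 3-4%).
co2_1900 = 0.0300      # % of atmosphere at start of century (300 ppm)
co2_2000 = 0.0388      # % of atmosphere at end of century (388 ppm)
human_share = 0.04     # author's assumed man-made fraction

print(round(co2_1900 * human_share, 6))   # 0.0012   (% of atmosphere)
print(round(co2_2000 * human_share, 6))   # 0.001552 (% of atmosphere)

# "234 years to double at the current rate": doubling from 388 ppm means
# adding another 388 ppm; at ~1.66 ppm/yr of linear growth (my assumption,
# chosen to match the quoted figure) that takes:
rate_ppm_per_year = 1.66
print(round(388.0 / rate_ppm_per_year))   # 234
```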

The oceans, ignored by the IPCC models, bring me to this 2009 paper published in the prestigious Journal of Geophysical Research:



Full Press Release and Abstract to Study:



Nature not man responsible for recent global warming


Three Australasian researchers have shown that natural forces are the dominant influence on climate, in a study just published in the highly-regarded Journal of Geophysical Research. According to this study little or none of the late 20th century global warming and cooling can be attributed to human activity.


The research, by Chris de Freitas, a climate scientist at the University of Auckland in New Zealand, John McLean (Melbourne) and Bob Carter (James Cook University), finds that the El Niño-Southern Oscillation (ENSO) is a key indicator of global atmospheric temperatures seven months later. As an additional influence, intermittent volcanic activity injects cooling aerosols into the atmosphere and produces significant cooling.


"The surge in global temperatures since 1977 can be attributed to a 1976 climate shift in the Pacific Ocean that made warming El Niño conditions more likely than they were over the previous 30 years and cooling La Niña conditions less likely" says corresponding author de Freitas.


"We have shown that internal global climate-system variability accounts for at least 80% of the observed global climate variation over the past half-century. It may even be more if the period of influence of major volcanoes can be more clearly identified and the corresponding data excluded from the analysis.”
Climate researchers have long been aware that ENSO events influence global temperature, for example causing a high temperature spike in 1998 and a subsequent fall as conditions moved to La Niña. It is also well known that volcanic activity has a cooling influence, as is well documented by the effects of the 1991 Mount Pinatubo volcanic eruption.


The new paper draws these two strands of climate control together and shows, by demonstrating a strong relationship between the Southern Oscillation and lower-atmospheric temperature, that ENSO has been a major temperature influence since continuous measurement of lower-atmospheric temperature first began in 1958.
According to the three researchers, ENSO-related warming during El Niño conditions is caused by a stronger Hadley Cell circulation moving warm tropical air into the mid-latitudes. During La Niña conditions the Pacific Ocean is cooler and the Walker circulation, west to east in the upper atmosphere along the equator, dominates.


"When climate models failed to retrospectively produce the temperatures since 1950 the modellers added some estimated influences of carbon dioxide to make up the shortfall," says McLean.


“The IPCC acknowledges in its 4th Assessment Report that ENSO conditions cannot be predicted more than about 12 months ahead, so the outputs of climate models that could not predict ENSO conditions were being compared to temperatures during a period that was dominated by those influences. It's no wonder that model outputs have been so inaccurate, and it is clear that future modelling must incorporate the ENSO effect if it is to be meaningful."
Bob Carter, one of four scientists who have recently questioned the justification for the proposed Australian emissions trading scheme, says that this paper has significant consequences for public climate policy.


"The close relationship between ENSO and global temperature, as described in the paper, leaves little room for any warming driven by human carbon dioxide emissions. The available data indicate that future global temperatures will continue to change primarily in response to ENSO cycling, volcanic activity and solar changes.”


“Our paper confirms what many scientists already know: which is that no scientific justification exists for emissions regulation, and that, irrespective of the severity of the cuts proposed, ETS (emission trading scheme) will exert no measurable effect on future climate.”
--
McLean, J. D., C. R. de Freitas, and R. M. Carter (2009), Influence of the Southern Oscillation on tropospheric temperature, Journal of Geophysical Research, 114, D14104, doi:10.1029/2008JD011637.
This figure from the McLean et al (2009) research shows that mean monthly global temperature (MSU GTTA) corresponds in general terms with the Southern Oscillation Index (SOI) of seven months earlier. The SOI is a rough indicator of general atmospheric circulation and thus global climate change. The possible influence of the Rabaul volcanic eruption is shown.


Excerpted Abstract of the Paper appearing in the Journal of Geophysical Research:
Time series for the Southern Oscillation Index (SOI) and global tropospheric temperature anomalies (GTTA) are compared for the 1958−2008 period. GTTA are represented by data from satellite microwave sensing units (MSU) for the period 1980–2008 and from radiosondes (RATPAC) for 1958–2008. After the removal from the data set of short periods of temperature perturbation that relate to near-equator volcanic eruption, we use derivatives to document the presence of a 5- to 7-month delayed close relationship between SOI and GTTA. Change in SOI accounts for 72% of the variance in GTTA for the 29-year-long MSU record and 68% of the variance in GTTA for the longer 50-year RATPAC record. Because El Niño−Southern Oscillation is known to exercise a particularly strong influence in the tropics, we also compared the SOI with tropical temperature anomalies between 20°S and 20°N. The results showed that SOI accounted for 81% of the variance in tropospheric temperature anomalies in the tropics. Overall the results suggest that the Southern Oscillation exercises a consistently dominant influence on mean global temperature, with a maximum effect in the tropics, except for periods when equatorial volcanism causes ad hoc cooling. That mean global tropospheric temperature has for the last 50 years fallen and risen in close accord with the SOI of 5–7 months earlier shows the potential of natural forcing mechanisms to account for most of the temperature variation.
Received 16 December 2008; accepted 14 May 2009; published 23 July 2009. [End Abstract Excerpt]
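For readers curious how a "variance explained at a lag" number like the paper's 72% is computed, here is a sketch on synthetic data. The series and coefficients below are invented for illustration only; reproducing the actual figures would require the real SOI and MSU/RATPAC records.

```python
import numpy as np

# Synthetic monthly series standing in for SOI and global tropospheric
# temperature anomalies (GTTA), built so temperature follows SOI 7 months
# later plus noise, then analysed the same way: lag, correlate, square r.
rng = np.random.default_rng(0)
n, lag = 600, 7                       # 50 years of months; 7-month lead
soi = rng.standard_normal(n)

gtta = np.empty(n)
gtta[lag:] = -0.6 * soi[:-lag] + 0.3 * rng.standard_normal(n - lag)
gtta[:lag] = rng.standard_normal(lag)   # first months: no lagged driver yet

# Align SOI at month t with temperature at month t+lag, then square Pearson r
# to get the "variance explained" statistic.
r = np.corrcoef(soi[:-lag], gtta[lag:])[0, 1]
print(f"variance explained at {lag}-month lag: {r**2:.0%}")
```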


Technical Note from co-authors of study - July 29, 2009
Not surprisingly, a storm has broken out over research saying human activities are not the main factor behind climate change. In an attempt to denigrate the work, claims have been made that the research fails to effectively detect trends in mean global temperature (MGT). This is misleading and causes confusion, especially among those people who have not read the paper.
The paper by McLean et al does not analyse trends in MGT; rather, it examines the extent to which ENSO accounts for variation in MGT. The research concludes that MGT has for the last 50 years fallen and risen in close accord with the SOI of 5-7 months earlier and shows the potential of natural mechanisms to account for most of the temperature variation.
It is evident in this paper that ENSO (ocean-atmosphere heat exchange) is the primary driver of MGT (i.e. El Niños cause global warming and La Niñas cause global cooling). All other mechanisms are small in comparison. The reason may be due to Hadley circulation which is itself linked to changes in sea surface temperature (ocean heat supply) and the Walker Circulation, that is, ENSO. Hadley circulation is the main mechanism for moving the surplus of energy at near the equator to high latitudes and plays a key role in the general circulation of the atmosphere. Changes in Hadley circulation affects convection and thus atmospheric moisture content and cloud cover which may in turn affect net solar heating as well as the transfer of heat from Earth to space.
Those who complain that using derivatives (differences) for correlation removes a linear trend miss the point. McLean et al use this method to construct Figures 5 and 6. It should be noted that detrended data was used purely to establish the time lag between the Southern Oscillation Index (SOI) and MGT in Figures 5 and 6. This time lag was then used in Figure 7 to show the close correlation between trends in temperature and changes in the Southern Oscillation Index seven months previously.

Figure 7 presents the data in its original form; namely, data that is not detrended, but with the time shift in SOI obtained from the detrended data. If an underlying trend existed, it would have shown up in Figure 7. One would see the temperature line rising away from the SOI line if, for example, rising atmospheric carbon dioxide concentrations had a significant influence. There is little or no sign of this.
The results in Figure 7 clearly show that the SOI related variability in MGT is the major contribution to any trends that might exist, although the McLean et al study did not look for this. The key conclusion of the paper, therefore, is that MGT is determined in most part by atmospheric processes related to the Southern Oscillation.
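The methodological point about derivatives can be illustrated in a few lines: first-differencing a series turns a linear trend into a constant, which is why differenced data can establish a lag but says nothing about trends. The data below is synthetic and purely illustrative:

```python
import numpy as np

# Ten years of monthly values: a linear warming trend plus a seasonal wiggle.
# (121 points so the differenced series spans whole 12-month cycles.)
t = np.arange(121, dtype=float)
series = 0.02 * t + np.sin(2 * np.pi * t / 12.0)

diffs = np.diff(series)   # the "derivative" used to find the lag

# Differencing converts the 0.02/month trend into a constant offset: the
# trend survives only as the mean of the differences, not as a slope, so
# correlating differenced series cannot detect (or hide) a trend.
print(round(diffs.mean(), 3))   # 0.02, the old slope, now just a constant
```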
For more on trends, recent work by Compo and Sardeshmukh (Climate Dynamics, 32:333-342, 2009) is illuminating. The abstract includes the statement: “Evidence is presented that the recent worldwide land warming has occurred largely in response to a worldwide warming of the oceans rather than as a direct response to increasing greenhouse gases (GHGs) over land.”




Please don't tell me anymore that "the debate is settled" because there is "overwhelming evidence" of anthropogenic climate change.


Related: Hockey Schtick post on a climate model that combines ocean oscillations with a "sunspot integral" and correlates with temperature at R^2=0.96. Note that the correlation of CO2 to temperature is R^2=0.44.


Monday, February 22, 2010

"Cliff Notes" on the last 2 Million Years of Climate Change

Sunday, February 21, 2010

The IPCC's "Unequivocal Evidence" & clever tricks

I've done my best to photoshop-out the IPCC's version of "Mike's Nature Trick" from the Northern Hemisphere IPCC AR4 Paleoclimate reconstruction to simulate how it should have appeared, i.e. without the big black thick hockey stick line stuck on showing the faulty thermometer record (the "real temps"): "I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) and from 1961 for Keith’s to hide the decline." - Phil Jones email discussing his paper in 1999.

It is scientifically improper to graft a thermometer record onto a paleoclimate proxy for temperature (e.g. tree rings). Trees are not thermometers and are not calibrated to thermometers, and many variables besides temperature affect tree rings, such as precipitation, location, CO2 (plant food) levels, cloudiness, soil quality, solar activity and weather patterns, with no reliable means of sorting them out. Even if grafting were proper, how could one possibly decide where to place the thermometer record on the Y axis relative to the tree rings? It could just as well start at a -1 degree anomaly; nobody knows. Paleoclimate proxies can only be used to compare relative temperatures, for instance the Medieval Warming Period (MWP) against the 20th century; by the IPCC's own reconstruction there is no statistically significant difference between the MWP and the 1940s, as shown in the photoshopped graph. The tree-ring proxies started to take a dive around 1960, from which they never recovered. To deal with this so-called "divergence problem", the tree-ring data was either truncated at 1960 in 3 of the proxies, or left in and shown in the graph taking a dive after ~1950. They then did their best to further hide the decline by putting the thick black Trick line over it, making it look as though the proxies showed an upturn in the late 70's just like the faulty thermometer record:
So this is why Mike's Nature Trick was just a "clever way to solve a problem" to quote Michael Mann, explaining away this deception as typical 'scientist talk'. No mention that the problem to solve was that the paleoclimate data showed the opposite of what was needed for the "unequivocal evidence" of unprecedented and man-made global warming. If the trees weren't good thermometers after 1960, then why were they good thermometers at any other point in the reconstruction? The IPCC AR4 continues on in this fine tradition, not allowing anyone to look at their graphs without the trick applied.
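To make the objection concrete, here is a toy version of the splice. All numbers are made up for illustration: a proxy that declines after 1960 is truncated there and the instrumental record grafted on, so the combined curve rises even though the proxy fell.

```python
import numpy as np

years = np.arange(1900, 2000)
# Invented series: proxy tracks zero, then declines after 1960 (the
# "divergence"); the instrumental record rises over the same period.
proxy        = np.where(years < 1960, 0.0, -0.01 * (years - 1960))
instrumental = np.where(years < 1960, 0.0,  0.02 * (years - 1960))

# The splice: keep the proxy up to 1960, then substitute the instrumental
# record, so the decline in the proxy never appears in the plotted curve.
spliced = np.where(years < 1960, proxy, instrumental)

print(proxy[-1].round(2), spliced[-1].round(2))   # -0.39 vs 0.78 by 1999
```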


more on the "divergence problem":



Craig Loehle in Climatic Change, Volume 94, Numbers 3-4 / June, 2009



Abstract  Tree rings provide a primary data source for reconstructing past climates, particularly over the past 1,000 years. However, divergence has been observed in twentieth century reconstructions. Divergence occurs when trees show a positive response to warming in the calibration period but a lesser or even negative response in recent decades. The mathematical implications of divergence for reconstructing climate are explored in this study. Divergence results either because of some unique environmental factor in recent decades, because trees reach an asymptotic maximum growth rate at some temperature, or because higher temperatures reduce tree growth. If trees show a nonlinear growth response, the result is to potentially truncate any historical temperatures higher than those in the calibration period, as well as to reduce the mean and range of reconstructed values compared to actual. This produces the divergence effect. This creates a cold bias in the reconstructed record and makes it impossible to make any statements about how warm recent decades are compared to historical periods. Some suggestions are made to overcome these problems.



and from one of the submissions to the UK Parliamentary Committee on climategate:



The Attempt to Present a "Nice Tidy Story" of Unprecedented 20th Century Warmth

2. The CRU emails, however, reveal that the authors of this material did not present a neutral view of the science. In particular, they downplayed the considerable uncertainty inherent in trying to approximate temperatures from proxy data over a 1000-year period, they suppressed contrary information, and they suppressed dissenting views in ways that made even their own colleagues uncomfortable. Thus, in one representative email written during the preparation of the TAR, Keith Briffa stated that "I know there is pressure to present a nice tidy story as regards 'apparent unprecedented warming in a thousand years or more in the proxy data' but in reality the situation is not quite so simple."[1] He went on to say that "I believe that the recent warmth was probably matched about 1000 years ago."[2] Similarly, another key researcher, Ed Cook, in a lengthy email bristling at the effort to eliminate the MWP, wrote that "I do find the dismissal of the Medieval Warm Period as a meaningful global event to be grossly premature and probably wrong."[3]

3. These concerns, however, were brushed aside in the final TAR. The TAR's version of the temperature record of the last 1000 years was based on the now infamous "hockey stick" study of Mann et al., a study that purported to show 1000 years of slightly declining global temperatures followed by a sharp increase in the 20th century. The hockey stick paper concluded that the 1990s were the warmest decade and 1998 was the warmest year in a millennium. The hockey stick graph was the single most important piece of information in the TAR. It was Figure 1 of the Summary For Policymakers of the TAR appearing on page 3, and it was widely relied on by advocates.[4]

4. Despite its prominence in the TAR, the hockey stick has now largely been discredited, with both the National Research Council ("NRC")[5] and the independent Wegman Report[6] rejecting confidence in the conclusion that the 1990s were the warmest decade and 1998 was the warmest year in a millennium. Although the hockey stick paper was cited in AR4, its significance was downplayed, and EPA did not cite the paper in the Endangerment Finding or TSD.

5. However, the same people who gave that paper such prominence in the TAR - despite the misgivings expressed internally within the group - continued to dominate paleoclimate research and were again the leading authors of the AR4 paleoclimate material. Indeed, perhaps stung by criticisms of the hockey stick and by the appearance of so-called "skeptics" who questioned the central conclusions of the TAR, the drafting of at least the paleoclimate chapter of AR4 became more of a political than a scientific process.[7]

6. Thus, the two coordinating lead authors of Chapter 6 of AR4, Jonathan Overpeck of the University of Arizona and Eystein Jansen of the University of Bergen in Norway, openly coached contributors to produce materials that would serve a public policy agenda. As just a few examples, the CRU emails show that Overpeck instructed his colleagues to make sure that text was "FOCUSED on only that science which is policy relevant" and that would support pre-conceived summary bullet points.[8] The pair also advised authors to include graphics that would be "compelling" and that the "sign of ultimate success" of a graphic would be that it was so compelling that it would be selected for use in the policymaker's summary.[9] They told authors to "pls DO please try hard to follow up on my advice" to only refer to the MWP and the Holocene Thermal Maximum in a "dismissive" way.[10] They expressed satisfaction with a graphic that described the MWP as heterogeneous - meaning that warming was not uniform on a planetary scale - not because it was accurate but because it read "much like a big hammer," driving home the point they wished to make.[11] Moreover, although the hockey stick could no longer be relied on as a principal source of authority, authors were instructed that "[w]e're hoping you guys can generate something compelling enough" for the summary material for policymakers, "something that will replace the hockey-stick with something even more compelling."[12] Yet new research that reexamined the data on which the IPCC relied has challenged the IPCC's dismissal of the MWP as non-heterogeneous, concluding that the IPCC's conclusion in this regard was, at least, "premature" and based on limited data.[13]

7. The examples of this type of behavior abound.

The "Trick" to "Hide the Warming"

8. Much attention has been placed on Jones' now-famous email in which he stated that "I've just completed Mike's Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) amd from 1961 for Keith's to hide the decline."[14] The trick he and Mann performed was to hide a decline in temperatures appearing in tree ring data in the latter part of the 20th century. Unless this trick were used, their multi-century proxy temperature reconstructions would show an embarrassing decline in temperatures at the end of the reconstruction, a decline that was not paralleled in the record of directly measured temperatures, which showed an increase. To hide the decline in the proxy data, Mann and then Jones grafted on actual temperature data to the end of their proxy reconstructions rather than using the same proxy data as had been used throughout the reconstruction.

9. This trick makes the graphic presentations of the proxy reconstructions misleading, since the effect is to make it seem as if the proxy data shows rising 20th century warming when it doesn't. But the real deception in the trick was in hiding what became known as the "divergence" problem. The accuracy of tree ring data as proxies for temperatures can only be confirmed by comparing the proxy temperatures yielded by the tree rings with temperatures directly measured during the period when direct temperature measurements could be made. If the proxy data are contradicted by actual data, as they are for a significant period of the time when direct temperature measurements exist, the accuracy of the proxy data over the entire period of the proxy reconstruction is called into question. Thus, the divergence problem undermined faith in the ability of the proxy reconstructions to provide conclusive or even meaningful information about paleoclimate temperature conditions, even as the IPCC was relying on these reconstructions to conclude that temperatures in the 20th century had reached unprecedented levels in the last 1000 years. As one email candidly said, "[t]he issue of why we dont show the proxy data for the last few decades (they dont show continued warming) but assume that they are valid for early warm periods needs to be explained."[15] These concerns, however, were given short shrift. Although divergence was discussed in AR4, the conclusion was reached that the results of the proxy temperature reconstructions remained valid and showed that 20th century warmth was likely unprecedented in 1000 years. If divergence was not a significant issue, however, one wonders why it was necessary to perform "tricks" to hide the problem.[16]

10. More importantly, after AR4 was issued, at least three studies have been published reanalyzing the data used in the proxy reconstructions cited in AR4, including two by authors whose reconstructions were used in AR4. These studies concluded that, in fact, the divergence problem makes the reconstructions unreliable.[17] According to one study, the divergence problem "serve(s) to impede a robust comparison of recent warming during the anthropogenic period with past natural climate episodes such as the Medieval Warm Period or MWP."[18] Another study found that the divergence problem makes it "impossible to make any statements about how warm recent decades are compared to historical periods."[19] Another concluded that the divergence problem "is of importance, as it limits the suitability of tree-ring data to reconstruct long-term climate fluctuations, particularly during periods that might have been as warm or even warmer than the late twentieth century."[20]

11. It would seem, therefore, that the IPCC should have been more cautious in dismissing the divergence problem. It would also seem that the IPCC may have understood that there was something to hide after all.