Sunday, June 26, 2011

When climate science was a science instead of a political grandstand

One year before James Hansen told Congress in 1988 that he was 99% certain man-made global warming was occurring, a paper published in Nature indicated that global warming would produce a negative feedback from increased cloud cover, cooling the Earth by more than the alleged warming effect of CO2.

Buying carbon offsets to assuage your green guilt? Study says don't bother

Study: trees not cure for global warming



BY MARGARET MUNRO, POSTMEDIA NEWS  JUNE 18, 2011



Planting trees may help appease travellers' guilt about pumping carbon into the atmosphere.



But new research suggests it will do little to cool the planet, especially when trees are planted in Canada and other northern countries, says climatologist Alvaro Montenegro, at St. Francis Xavier University in Nova Scotia.



"There is no magic bullet" for global warming, says Montenegro, "and trees are certainly not going to be providing it."



He assessed the impact of replanting forests on crop and marginal lands with Environment Canada researcher Vivek Arora. Their study, published Sunday in Nature Geoscience, concludes "afforestation is not a substitute for reduced greenhouse-gas emissions."



The United Nations, environmental groups and carbon-offset companies have invested heavily in the idea that planting trees will help slow climate change and global warming. International authorities have long described "afforestation" as a key climate-change mitigation strategy.



But the study says the benefits of tree planting are "marginal" when it comes to stopping the planet from overheating.



Trees do suck carbon [dioxide] out of the air, but the study highlights that their dark leaves and needles also decrease the amount of solar radiation that gets reflected by the landscape, which has a warming effect.



Cropland - especially snow-covered cropland - has a cooling effect because it reflects a lot more solar energy than forests, the scientists say. This so-called "albedo effect" is important and needs to be incorporated into assessments of tree planting programs and projects, the researchers say.



Montenegro and Arora stress that planting forests has many benefits - trees provide habitat for wildlife and prevent soil erosion. And planting forests does help reduce atmospheric levels of carbon dioxide because carbon is locked into wood as trees grow.



But planting trees will have only a modest effect on the global temperature, according to their study, which used a sophisticated climate modelling system developed by Environment Canada. [see Top 10 Reasons Why Climate Model Predictions are False]

CO2 levels have risen at the same rate for the past 18,000 years

While climate alarmists claim CO2 levels have risen at an unprecedented rate since the industrial revolution, extrapolation from ice core data shows the rate of rise has been steady since the peak of the last ice age about 18,000 years ago. This shows that man-made CO2 (~3% of total emissions) has had little effect on atmospheric levels of CO2, which instead are driven by outgassing from the oceans during interglacial periods; interglacial periods are driven by solar insolation, not CO2.




Atmospheric CO2 from 3 ice-core studies shown on vertical axis, thousands of years ago shown on horizontal axis. Linear extrapolation from the peak of the last ice age ~18,000 years ago shows that the rate of rise in CO2 has not changed over the past 18,000 years. Notations in red added. Graph source.

Saturday, June 25, 2011

The IPCC and Greenpeace: Renewable Outrage

The Economist  6/17/11



THE release of the full text of the Intergovernmental Panel on Climate Change’s Special Report on Renewable Energy this week has led to a new set of questions about the panel’s attitudes, probity and reliability: is it simply a sounding board for green activists? The answer is no—but that doesn’t mean it’s without serious problems. For what’s worst about the affair, and for comments by IPCC chair Rajendra Pachauri, scroll down to the lower bits of the post.



When the summary of the report was released last month (IPCC summaries, agreed line by line by governments at often quite fractious plenary meetings, come out before the report they are summarising, in part because the report may need a little tweaking to reflect the plenary’s summary judgements) it came with a press release proclaiming that the world could get 80% of its energy from renewables by 2050 if it just had the right policies and paid the right amount. This figure was subsequently trumpeted by those parts of the world’s press paying attention, which tended to be the parts that have readers keen on more environmental action.



The full report shows where the number came from, and that’s why its publication sparked a fuss. One of the report’s 11 chapters is an analysis of 164 previously published scenarios looking at the energy mix over the next four decades under various assumptions. The scenario which had the highest penetration of renewables put the total at 77% by 2050. The research involved was done by the German space-research institute, which has long worked on energy analysis, too; its experts were commissioned to do the work by Greenpeace, and a Greenpeace staff member with an engineering background, Sven Teske, was the scenario’s lead author when it was published in a couple of different forms in peer-reviewed journals. It has also been published, in bigger, glossier format, by Greenpeace itself under the grating and uncharacteristically fence-sitting title Energy [R]evolution.



Mr Teske was also one of the authors of the chapter of the IPCC report that looked at those 164 scenarios, and that chose Energy [R]evolution as one of four scenarios to explore in more detail. That, say critics, looks like a fix. And one with big consequences. That one scenario’s claim that the world could get call-it-80% of its energy from renewables managed, thanks to the press release, to shape perceptions of the report when it was originally released, making it look like a piece of renewables boosterism. Worse: who wrote the foreword to Greenpeace’s glossy publication of its scenario? Rajendra Pachauri, the chair of the IPCC. (Disclosure: at the request of IPCC authors, this avatar of Babbage chaired a debate on the summary of the special report when it was launched in May, and his brother is a “co-ordinating lead author” on the panel’s forthcoming “fifth assessment report”, though not in an area associated with renewable energy.)



Steve McIntyre, who runs a blog on which he tries to hold climate science to higher standards than he sees it holding itself, picked up all these IPCC/Greenpeace connections and posted on them angrily, calling for all involved to be sacked. “As a citizen,” he says, “I would like to know how much weight we can put on renewables as a big-footprint solution. Prior to the IPCC report, I was aware that Greenpeace—and WWF—had promoted high renewable scenarios. However, before placing any weight on them, the realism of these scenarios needs to be closely examined. IPCC has a mandate to provide hard information but did no critical evaluation of the Greenpeace scenario."



His desire for solid, honest answers is plainly one to be shared. But the authors of the IPCC chapter involved declined to evaluate the scenarios they looked at in terms of whether they thought they were plausible, let alone likely. Ottmar Edenhofer, a German economist who was one of those in overall charge of the report, gives the impression that he would have welcomed a more critical approach from his colleagues; but there is no mechanism by which the people in charge can force an author team to do more, or other, than it wants to. (The same goes for authors on the team, Mr Teske says; he was one of twelve authors on the relevant chapter, and over 120 authors overall, and had no peculiar Greenpeace lantern with which to bend them all to his will.)



read remainder at economist.com

The Facts About Fracking

The real risks of the shale gas revolution, and how to manage them



WSJ.com Review & Outlook 6/25/11



The U.S. is in the midst of an energy revolution, and we don't mean solar panels or wind turbines. A new gusher of natural gas from shale has the potential to transform U.S. energy production—that is, unless politicians, greens and the industry mess it up.



Only a decade ago Texas oil engineers hit upon the idea of combining two established technologies to release natural gas trapped in shale formations. Horizontal drilling—in which wells turn sideways after a certain depth—opens up big new production areas. Producers then use a 60-year-old technique called hydraulic fracturing—in which water, sand and chemicals are injected into the well at high pressure—to loosen the shale and release gas (and increasingly, oil).



***

The resulting boom is transforming America's energy landscape. As recently as 2000, shale gas was 1% of America's gas supplies; today it is 25%. Prior to the shale breakthrough, U.S. natural gas reserves were in decline, prices exceeded $15 per million British thermal units, and investors were building ports to import liquefied natural gas. Today, proven reserves are the highest since 1971, prices have fallen close to $4 and ports are being retrofitted for LNG exports.



The shale boom is also reviving economically suffering parts of the country, while offering a new incentive for manufacturers to stay in the U.S. Pennsylvania's Department of Labor and Industry estimates fracking in the Marcellus shale formation, which stretches from upstate New York through West Virginia, has created 72,000 jobs in the Keystone State between the fourth quarter of 2009 and the first quarter of 2011.



The Bakken formation, along the Montana-North Dakota border, is thought to hold four billion barrels of oil (the biggest proven estimate outside Alaska), and the drilling boom helps explain North Dakota's unemployment rate of 3.2%, the nation's lowest.



All of this growth has inevitably attracted critics, notably environmentalists and their allies. They've launched a media and political assault on hydraulic fracturing, and their claims are raising public anxiety. So it's a useful moment to separate truth from fiction in the main allegations against the shale revolution.



• Fracking contaminates drinking water. One claim is that fracking creates cracks in rock formations that allow chemicals to leach into sources of fresh water. The problem with this argument is that the average shale formation is thousands of feet underground, while the average drinking well or aquifer is a few hundred feet deep. Separating the two is solid rock. This geological reality explains why EPA administrator Lisa Jackson, a determined enemy of fossil fuels, recently told Congress that there have been no "proven cases where the fracking process itself has affected water."







A second charge, based on a Duke University study, claims that fracking has polluted drinking water with methane gas. Methane is naturally occurring and isn't by itself harmful in drinking water, though it can explode at high concentrations. Duke authors Rob Jackson and Avner Vengosh have written that their research shows "the average methane concentration to be 17 times higher in water wells located within a kilometer of active drilling sites."



They failed to note that researchers sampled a mere 68 wells across Pennsylvania and New York—where more than 20,000 water wells are drilled annually. They had no baseline data and thus no way of knowing if methane concentrations were high prior to drilling. They also acknowledged that methane was detected in 85% of the wells they tested, regardless of drilling operations, and that they'd found no trace of fracking fluids in any wells.



The Duke study did spotlight a long-known and more legitimate concern: the possibility of leaky well casings at the top of a drilling site, from which methane might migrate to water supplies. As the BP Gulf of Mexico spill attests, proper well construction and maintenance are major issues in any type of drilling, and they ought to be the focus of industry standards and attention. But the risks are not unique to fracking, which has provided no unusual evidence of contamination.



• Fracking releases toxic or radioactive chemicals. The reality is that 99.5% of the fluid injected to fracture the rock is water and sand. The chemicals range from the benign, such as citric acid (found in soda pop), to benzene. States like Wyoming and Pennsylvania require companies to publicly disclose their chemicals, Texas recently passed a similar law, and other states will follow.



Drillers must dispose of fracking fluids, and environmentalists charge that disposal sites also endanger drinking water, or that drillers deliberately discharge radioactive wastewater into streams. The latter accusation inspired the EPA to require that Pennsylvania test for radioactivity. States already have strict rules designed to keep waste water from groundwater, including liners in waste pits, and drillers are subject to stiff penalties for violations. Pennsylvania's tests showed radioactivity at or below normal levels.



• Fracking causes cancer. In Dish, Texas, Mayor Calvin Tillman caused a furor this year by announcing that he was quitting to move his sons away from "toxic" gases—such as cancer-causing benzene—from the town's 60 gas wells. State health officials investigated and determined that toxin levels in the majority of Dish residents were "similar to those measured in the general U.S. population." Residents with higher levels of benzene in their blood were smokers. (Cigarette smoke contains benzene.)



• Fracking causes earthquakes. It is possible that the deep underground injection of fracking fluids might cause seismic activity. But the same can be said of geothermal energy exploration, or projects to sequester carbon dioxide underground. Given the ubiquity of fracking without seismic impact, the risks would seem to be remote.



• Pollution from trucks. Drillers use trucks to haul sand, cement and fluids, and those certainly increase traffic congestion and pollution. We think the trade-off between these effects and economic development is for states and localities to judge, keeping in mind that externalities decrease as drillers become more efficient.



• Shale exploration is unregulated. Environmentalists claim fracking was "exempted" in 2005 from the federal Safe Drinking Water Act, thanks to industry lobbying. In truth, all U.S. companies must abide by federal water laws, and what the greens are really saying is that fracking should be singled out for special and unprecedented EPA oversight.



Most drilling operations—including fracking—have long been regulated by the states. Operators need permits to drill and are subject to inspections and reporting requirements. Many resource-rich states like Texas have detailed fracking rules, while states newer to drilling are developing these regulations.



As a regulatory model, consider Pennsylvania. Recently departed Governor Ed Rendell is a Democrat, and as the shale boom progressed he worked with industry and regulators to develop a flexible regulatory environment that could keep pace with a rapidly growing industry. As questions arose about well casings, for instance, Pennsylvania imposed new casing and performance requirements. The state has also increased fees for processing shale permits, which has allowed it to hire more inspectors and permitting staff.



New York, by contrast, has missed the shale play by imposing a moratorium on fracking. The new state Attorney General, Eric Schneiderman, recently sued the federal government to require an extensive environmental review of the entire Delaware River Basin. Meanwhile, the EPA is elbowing its way into the fracking debate, studying the impact on drinking water, animals and "environmental justice."



***

Amid this political scrutiny, the industry will have to take great care in its drilling while making its public case more effectively. In this age of saturation media, a single serious example of water contamination could lead to a political panic that would jeopardize tens of billions of dollars of investment. The industry needs to establish best practices and blow the whistle on drillers that dodge the rules.



The question for the rest of us is whether we are serious about domestic energy production. All forms of energy have risks and environmental costs, not least wind (noise and dead birds and bats) and solar (vast expanses of land). Yet renewables are nowhere close to supplying enough energy, even with large subsidies, to maintain America's standard of living. The shale gas and oil boom is the result of U.S. business innovation and risk-taking. If we let the fear of undocumented pollution kill this boom, we will deserve our fate as a second-class industrial power.

Friday, June 24, 2011

Jay Leno mocks Al Gore's extremist views that nobody listens to anymore

From the Tonight Show with Jay Leno monologue 6/23/11, second & third jokes:



Today President Obama released 30 million barrels of oil from the Strategic Petroleum Reserve...he said it was in response to what he called a real emergency - his poll numbers.



Even Al Gore is attacking President Obama...Gore said Obama has failed to stand up for bold action and has made little progress on global warming...and then the girl said, "Sir, if you could just pay for your ice cream - we've got other customers waiting."



(applause & laughter)







Related: The Boy Who Cried Wolf

The White House Oil Epiphany

Obama has epiphany that the skyrocketing energy prices he called for previously are not good for re-election...



WSJ.com Review & Outlook 6/24/11



It wasn't long ago that the Obama Administration was trying to drive up the price of fossil fuels to reduce carbon emissions, promote "green jobs" and save the planet from global warming. Gasoline at $3.50 or $4 a gallon has ended that. And yesterday the White House went so far as to join a global effort to release 60 million barrels from oil stockpiles to further reduce prices.



The U.S. will release one million barrels a day for 30 days from the Strategic Petroleum Reserve—the nation's 727 million barrel oil stockpile located in salt domes in Texas and Louisiana. The spot price of oil dropped about $5 a barrel on the news, and if that decrease holds it could be the equivalent of a 10 cent a gallon reduction in gas prices.
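The figures above are easy to sanity-check: a U.S. barrel of crude is 42 gallons, so a $5-per-barrel drop, if passed through fully to the pump, works out to roughly the cited 10-cent figure. A minimal sketch of the arithmetic (full pass-through is an assumption, and an upper bound):

```python
GALLONS_PER_BARREL = 42  # standard US barrel of crude

# US share of the coordinated release: 1 million barrels/day for 30 days
us_release_bbl = 1_000_000 * 30
print(us_release_bbl)                 # 30,000,000 of the 60M-barrel global total

# A $5/bbl price drop spread over 42 gallons, assuming full pass-through
per_gallon = 5.0 / GALLONS_PER_BARREL
print(round(per_gallon, 3))           # ~$0.12/gal, in line with the
                                      # "10 cent a gallon" estimate cited
```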



The White House says it is taking this action because of "supply disruptions" in Libya and other countries which pose a threat to global economic recovery. But the Libyan conflict is now four months old, so Mr. Obama's falling approval ratings no doubt also provided motivation.



The SPR was created in 1975 to cushion the impact of major supply disruptions. George W. Bush drew on the reserves after Hurricane Katrina when domestic oil supplies from the Gulf of Mexico were curtailed. As a pure business decision, selling oil from the SPR when the price is high, and then replenishing the oil when the price falls, isn't a bad idea. But the effect on gas prices is temporary, as global supply and demand adjust.



One irony is that a million barrels a day is about how much oil experts believe we could be producing from the vast oil fields in Alaska's wildlife reserve. President Obama has said that tapping Alaska wouldn't affect oil prices but now says a temporary spurt will do so. How about opening up Alaska, and dropping the de facto Gulf moratorium too?

Wednesday, June 22, 2011

Study shows modern oceans are more alkaline than past 250 million years

While eco-alarmists would have you believe the oceans have "acidified" to dangerous pH levels, a paper published in Nature finds that the modern ocean pH of about 8.1-8.2 is actually the most alkaline the oceans have been at any point in the past 250 million years. During this time corals, phytoplankton, and indeed most of the ocean biomass evolved. The paper shows a mean pH of about 7.7 over the past 250 million years, whereas the alarmist and frequently incorrect IPCC predicts ocean pH will drop by ~0.2 units to 7.88 under a "business as usual" scenario by 2100.




Mean ocean surface pH shown in 2nd graph. Modern ocean pH of 8.1-8.2 is shown at left side of graph, right side of graph is 250 million years ago. pH above 7.0 is alkaline. Top graph shows diversity of various species of phytoplankton. Lower graph shows little change in calcification over the entire period.
The fact is modern sea life copes perfectly well with pH levels that vary by 0.4 units over a period of less than one year, as shown in this graph from the Monterey Bay Aquarium of incoming seawater:
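Because pH is a logarithmic scale (pH = −log10 of the hydrogen-ion concentration), the differences discussed above can be restated as concentration ratios. A small sketch, using the pH values cited in the post (note that every value here remains on the alkaline side of 7.0):

```python
def h_ion(pH):
    """Hydrogen-ion concentration (mol/L) implied by a pH value."""
    return 10 ** (-pH)

# Modern surface ocean (~8.1) vs the ~7.7 long-term mean cited above:
print(round(h_ion(7.7) / h_ion(8.1), 2))   # ~2.51x more hydrogen ions at pH 7.7
# Modern ocean vs the IPCC business-as-usual projection of 7.88 for 2100:
print(round(h_ion(7.88) / h_ion(8.1), 2))  # ~1.66x
```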

Tuesday, June 21, 2011

Incredible! Mann uses upside down data again!

One day after its release, Michael Mann's latest paper on sea levels has been shown to use upside-down proxy data, despite the exact same error having been pointed out by Steve McIntyre three years ago! Mann has never acknowledged this grave error and, incredibly, continues to use the same trick.



From ClimateAudit.org:





Upside Down Mann Lives on in Kemp et al 2011



contributor AMac:


Yesterday, Kemp et al. 2011 was published in PNAS, relating sea-level variation to climate over the past 1,600 years (UPenn press release). Among the authors is Prof. Mann. (Kemp11 is downloadable from WUWT.) Figs. 2A and 4A are “Composite EIV global land plus ocean global temperature reconstruction, smoothed with a 30-year LOESS low-pass filter”. This is one of the multiproxy reconstructions in Mann et al. (2008, PNAS). The unsmoothed tracing appears as the black line labelled “Composite (with uncertainties)” in panel F of Fig. S6 of the “Supporting Information” supplement to Mann08 (downloadable from pnas.org).


This is one of the Mann08 reconstructions that made use of the four (actually three) uncalibratable Tiljander data series.


As scientist/blogger Gavin Schmidt has indicated, the early years of the EIV Global reconstruction rely heavily on Tiljander to pass its “validation” test: “…it’s worth pointing out that validation for the no-dendro/no-Tilj is quite sensitive to the required significance, for EIV NH Land+Ocean it goes back to 1500 for 95%, but 1300 for 94% and 1100 AD for 90%” (link). Also see RealClimate here (Gavin’s responses to comments 525, 529, and 531).


The dependence of the first two-thirds of the EIV recon on the inclusion of Tiljander’s data series isn’t mentioned in the text of Kemp11. Nor is it discussed in the SI, although it is an obvious and trivial explanation for the pre-1100 divergence noted in the SI’s Figures S3, S4, and S5.


Peer review appears to have been missing in action on this glaring shortcoming in Kemp11's methodology.
More than anything, I am surprised by this zombie-like re-appearance of the Tiljander data series — nearly three years after the eruption of the controversy over their misuse as temperature proxies!

Top 10 Reasons Why Climate Model Predictions are False

The IPCC predictions of catastrophic global warming climate change are entirely based upon computer models programmed on the basis of unverified, and in most cases false, premises. Unlike any other area of science, climate computer model results are considered gospel without verification by empirical data or proper consideration of the huge uncertainties and limitations of modeling a chaotic system in which almost all of the variables are poorly understood. Climate science has become perverted to the point that models are considered to supplant empirical data:

“People underestimate the power of models. Observational evidence is not very useful,” says John Mitchell, principal research scientist at the UK Met Office, adding, “Our approach is not entirely empirical,”
somehow forgetting the scientific method, as succinctly stated by physicist Richard Feynman:

It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.
The recent empirical observations showing that the Sun is entering an exceptionally low period of activity were immediately 'countered' by the climate alarmist community trumpeting a computer model to supposedly prove negligible effect of a Grand Solar Minimum upon climate. This prompts a list of the top 10 reasons why climate model predictions are false:



1. Even the IPCC admits the climate models have not been verified by empirical observations to assess confidence. The fine print of the IPCC 2007 Report contains this admission:

Assessments of our relative confidence in climate projections from different models should ideally be based on a comprehensive set of observational tests that would allow us to quantify model errors in simulating a wide variety of climate statistics, including simulations of the mean climate and variability and of particular climate processes. 
In the final paragraph of this critical section of the AR4 WG1 Chapter 8 page 52 the IPCC states that 

a number of diagnostic tests [of the models] have been proposed, but few of them have been applied to the models currently in use. 
In fact, the models have performed poorly in comparison to observations, with global temperatures failing to even remain above the lower bound predicted by the IPCC, despite the steady rise in CO2 levels:

2. Furthermore, the IPCC even admits it "isn't clear which [diagnostic] tests are critical" to verify and assess confidence in the models. The 2007 Report, Chapter 8, page 52, states that the diagnostic tests to assess confidence in feedbacks simulated by different models have "yet to be developed." In other words, the IPCC can't begin to make any assessment whatsoever of confidence in the models at the heart of the IPCC "consensus" on anthropogenic global warming. If the IPCC is unable to verify and determine confidence in the models, no other publication in climate science can rightfully claim that the models have been verified, or determine confidence limits on their results.





3. Of 16 climate forcings identified by the IPCC, only 2 are stated by the IPCC to have a "high level" of understanding (CO2 and other greenhouse gases). Most of the other forcings have a "low level" of understanding, with a few stated to be "low to medium." It is impossible to create a model with any validity without a high level of understanding of the effect of each of the input variables. The variables also interact in a chaotic manner, which by definition cannot be modeled. 





4. The 2 forcings claimed by the IPCC to have a "high level" of understanding (man-made CO2 and other greenhouse gases plus unproven positive feedbacks) are in fact not well understood, with empirical satellite data showing the sensitivity to doubled CO2 with feedbacks is only about 0.7C  (Lindzen & Choi 2009, 2011 and others), a factor of 4 less than assumed by IPCC climate models. 
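The "factor of 4" comparison can be checked against the standard simplified CO2 forcing expression, ΔF = 5.35 ln(C/C0) W/m² (Myhre et al. 1998). The sketch below takes the 0.7 °C figure cited above and the IPCC's ~3 °C central estimate as inputs from the surrounding text, and converts each into an implied sensitivity parameter λ = ΔT/ΔF:

```python
import math

def co2_forcing(c_ratio):
    """Simplified CO2 radiative forcing (W/m^2), Myhre et al. 1998."""
    return 5.35 * math.log(c_ratio)

dF_2x = co2_forcing(2.0)          # forcing from a doubling of CO2
print(round(dF_2x, 2))            # ~3.71 W/m^2

# Implied sensitivity parameter lambda = dT / dF (K per W/m^2):
print(round(0.7 / dF_2x, 2))      # ~0.19 using the 0.7 C satellite-based figure
print(round(3.0 / dF_2x, 2))      # ~0.81 using the IPCC's ~3 C central value
```

The ratio of the two λ values is about 4.3, consistent with the "factor of 4" stated above.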





5. The climate models falsely assume infrared "back-radiation" from greenhouse gases can heat the oceans (71% of the Earth's surface area). In fact, IR wavelengths penetrate only the top few microns (millionths of a meter) of the ocean, where essentially all of the absorbed energy is consumed by the phase change of evaporation (which actually cools the sea surface), leaving none to heat the ocean bulk. This fact alone completely invalidates the assumed radiative forcing from greenhouse gases incorporated in the models.




Long Wave Infrared from greenhouse gases has a wavelength of ~8-14 microns. Penetration depth into water shown on right scale.
6. In contrast to IR "back-radiation," visible and especially UV radiation from the Sun is capable of penetrating the oceans to a depth of several meters to heat the oceans. Solar UV activity varies by up to 10% over solar cycles, unlike the total solar irradiance (TSI), which only varies by 0.1%. The IPCC climate models only consider changes in TSI and ignore the large changes in solar UV which heat the oceans. Solar UV also affects ozone levels, which in turn have large poorly understood effects on climate.
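The penetration-depth contrast in points 5 and 6 follows from Beer-Lambert attenuation, I(z) = I0·exp(−αz), where the "penetration depth" 1/α is where intensity falls to 1/e. The absorption coefficients below are approximate literature values for pure water, used only to illustrate the orders of magnitude involved:

```python
# Approximate absorption coefficients of pure water (per metre);
# rough literature values, for illustration only.
alpha = {
    "10 um (thermal IR)": 1e5,      # absorbed within ~10 microns
    "450 nm (blue/visible)": 0.02,  # penetrates tens of metres
}

for band, a in alpha.items():
    depth_m = 1.0 / a               # 1/e penetration depth in metres
    print(band, depth_m)
```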





7.  Clouds are one of the most important yet most poorly understood variables, with the IPCC not even certain whether clouds have a net warming or cooling effect. The empirical data show cloud albedo declined over the past few decades, accounting for at least 3 times as much warming as greenhouse gases. Whether the cloud changes are due to the cosmic ray theory of Svensmark et al. or not, this remains a huge unexplained factor not incorporated in the models. As pointed out by Dr. Roy Spencer, a mere 1-2% change in global cloud cover alone can account for either global warming or cooling. The changes in cloud cover secondarily related to solar activity noted by Svensmark et al. have an amplitude of about 4%:
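Dr. Spencer's 1-2% figure can be illustrated with a back-of-envelope radiative estimate. The clear-sky and overcast reflectivities below are illustrative round numbers (not measurements), chosen only to show why a percent-level cloud change matters:

```python
S = 1361.0                      # solar constant, W/m^2
avg_insolation = S / 4          # ~340 W/m^2 averaged over the sphere

# Illustrative assumption: clouds raise local reflectivity from
# roughly 0.15 (clear sky) to roughly 0.40 (overcast).
albedo_contrast = 0.40 - 0.15

# A 1% absolute change in global cloud cover then shifts planetary albedo by:
d_albedo = 0.01 * albedo_contrast
d_forcing = d_albedo * avg_insolation
print(round(d_forcing, 2))      # ~0.85 W/m^2 per 1% change in cloud cover
```

On these assumed numbers, a 1% cloud-cover change produces a forcing of the same order as the greenhouse-gas forcings the models attribute warming to, which is the point being made above.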





8. Ocean oscillations, which can have a periodicity of up to 60 years (e.g. the Pacific Decadal Oscillation), and huge effects upon worldwide climate, are not incorporated in the climate models. Ocean oscillations alone could account for the warming of the latter 20th century that the IPCC chooses to ascribe to man-made CO2, while claiming there is no other explanation.





9. As well stated by solar physicist Dr. Nicola Scafetta,

...the traditional climate models also fail to properly reconstruct the correct amplitudes of the climate oscillations that have clear solar/astronomical signature...Given the above, there is little hope that the traditional climate models correctly interpret climate change and nothing concerning the real climate can be inferred from them because from a false premise everything can be concluded.
10. The latest climate models continue to greatly exaggerate sensitivity to CO2 by 67%. Despite admitting this, the model authors were unwilling or unable to tweak the models to match observed temperatures, allowing the exaggerated effects of CO2 to remain in the world's most commonly used climate model. How hard could it have been to correct the sensitivity to CO2, given that the supposedly sophisticated models can be replicated with a small handful of arbitrary and artificially linear forcing factors on a laptop PC?

Monday, June 20, 2011

The Climate Tort is Finished

WSJ.com  Review & Outlook 6/21/11



Yesterday's other important Supreme Court decision came in a case that joined the green lobby and the trial bar, if that isn't redundant. The Court unanimously struck down one of the legal left's most destructive theories, and not a moment too soon.



In American Electric Power v. Connecticut, eight states and various other environmental activists sued a group of utilities, claiming that their carbon emissions were a "nuisance" under federal common law and that therefore the courts should set U.S. global warming policy. Yet this is a fundamentally political question, one the Constitution reserves to Congress and the executive, as Justice Ruth Bader Ginsburg wrote for the 8-0 majority.



The Court "remains mindful that it does not have creative power akin to that vested in Congress," Justice Ginsburg observed, in an all-too-rare vindication of legal restraint. "It is altogether fitting that Congress designated an expert agency, here, EPA, as best suited to serve as primary regulator of greenhouse gas emissions. The expert agency is surely better equipped to do the job than individual district judges issuing ad hoc, case-by-case injunctions. Federal judges lack the scientific, economic, and technological resources an agency can utilize in coping with issues of this order."



We'd go further and point out that Congress never granted the Environmental Protection Agency the power to regulate CO2. The EPA has merely asserted that power with an assist from the pure policy invention of the Court itself in the 5-4 Mass. v. EPA ruling of 2007. Still, the fact that every Justice rejected the new climate tort theory, and that the opinion was delivered by the most liberal Justice, shows how abusive it really was.



The Court dismissed the case under the "political question doctrine," but we wish it had resolved the technical issue of Article III standing, which determines when a plaintiff has a right to sue. The Justices were split four to four, and thus did not rule; Justice Sonia Sotomayor recused herself because she heard the case on the Second Circuit. Yet standing is one of the few restraints on the power of the federal courts, and the litigants didn't have it by a mile here.



Under the traditional legal reading of standing, plaintiffs have to show that the defendants caused their injuries and that the courts can meaningfully redress those injuries. But climate change is a world-wide phenomenon to which the group of utilities barely contributed even under the most aggressive global warmist theories. And even if the courts shut down those plants tomorrow, it would have no effect whatsoever on atmospheric CO2 concentrations.



The climate tort is nonetheless finished, and the Court's decision should make it impossible to advance the same claims in state courts. Anyone who cares about the economy and the Constitutional balance of power can breathe a little easier.

Sunday, June 19, 2011

Solar Physicist Dr. C. de Jager predicts Grand Solar Minimum will last until 2100

Dr. Cornelis de Jager is a renowned Netherlands solar physicist, past General Secretary of the International Astronomical Union, and author of several peer-reviewed studies examining the solar influence upon climate. In response to the recent press release of three US studies indicating the Sun is entering a period of exceptionally low activity, Dr. de Jager references his publications of 2010 and earlier indicating that this Grand Solar Minimum will be similar to the Maunder Minimum, which caused the Little Ice Age, and his prediction that this "deep minimum" will last until approximately the year 2100.

"The new episode is a deep minimum. It will look similar to the Maunder Minimum, which lasted from 1620 to 1720...This new Grand Minimum will last until approximately 2100."
A lecture by Dr. de Jager at UCAR shows that solar activity during the 20th century was at the highest levels of the past 900 years,

and shows that solar UV activity (bottom graph below) was at the highest levels of the past 400 years in the latter portion of the 20th century. (UV is the most energetic portion of the solar spectrum and varies much more than the Total Solar Irradiance (TSI); the IPCC and computer models consider only changes in TSI, ignoring the much more significant changes in UV.)

The lecture also shows the amplification of solar variation via the cosmic ray theory of Svensmark et al,

leading to two possible mechanisms accounting for amplified solar effects upon the climate, neither of which is considered by the IPCC.



Recommended: Dr. de Jager's peer-reviewed paper Solar Activity and Its Influence on the Climate

Thursday, June 16, 2011

Oh...the Irony: Environmental concerns derail billions of dollars in solar energy projects

Spot The Tortoise?

Todd Woody, 06.08.11, 06:00 PM EDT 

Forbes Magazine dated June 27, 2011

More than $10 billion in solar projects are riding on the shell of an iconic desert reptile.




BrightSource broke ground on this 370MW solar plant. Cost: $2.2 billion. Tortoises: 700.



Last October BrightSource Energy began construction on the first large-scale solar thermal power plant to be built in the U.S. in two decades. After an arduous three-year environmental review, a $1.6 billion federal loan guarantee and more than a half-billion dollars in investment from the likes of Google (GOOG), Morgan Stanley (MS) and NRG Energy (NRG), Interior Secretary Ken Salazar and then California Governor Arnold Schwarzenegger appeared at a sunny groundbreaking ceremony in Nipton, Calif., in the Mojave Desert. The 370-megawatt Ivanpah Solar Electric Generating System, they proclaimed, heralded a clean, green energy future.
But as the dignitaries speechified, biologists were discovering the creosote-bush-studded landscape was crawling with some uninvited guests: desert tortoises. Years of surveys had estimated that, at most, 32 of the iconic, imperiled animals called the 5.6-square-mile site home. But as giant road graders moved in, biologists had already found nearly that many tortoises just in the project's first, 914-acre phase.
"The big mystery question is, why are there more animals than expected?," said Mercy Vaughn, a respected desert tortoise biologist who's leading the company's roundup and relocation of the long-lived reptiles, as she stood outside a tortoise holding pen in October.
Today those pens have expanded to hold even more tortoises. Federal officials in April ordered construction temporarily halted on part of the project until a new environmental review could be conducted. The reason: Government biologists now predict that between 86 and 162 adult tortoises and 608 juveniles roam the site, some 40 miles southwest of Las Vegas. Biologists with the U.S. Bureau of Land Management, which leases the land to BrightSource, concluded that the project would "harass" 2,325 mostly juvenile tortoises living within a 2-kilometer radius outside the site in the Ivanpah Valley, where another company, First Solar (FSLR), intends to construct two huge generating stations.
Wildlife has emerged as the wild card in plans to build more than a dozen multibillion-dollar solar projects in the desert Southwest. Earlier this year German developer Solar Millennium's U.S. venture abandoned a 250-megawatt solar project after 16 months of environmental review because of concerns over its impact on the Mohave ground squirrel. The renewed scrutiny of other big solar projects raises the stakes for the Obama Administration, which has offered more than $8 billion in loan guarantees for solar construction, and for developers and investors making bets on Big Solar.
read remainder at Forbes.com

New paper shows no increase in precipitation over past 105 years, counter to global warming theory

One of the central tenets of global warming theory is that warming of the atmosphere results in increased water vapor and thus precipitation, leading to alarmist predictions of increased flooding. A paper published online yesterday in the Journal of Geophysical Research counters this notion, showing that winter precipitation along the central Pacific coast has not increased over the past 105 years. Rather, the paper finds a cyclical pattern of unknown origin, which shows no correlation to CO2 levels whatsoever.




Red horizontal line added to show zero anomaly level


The central Pacific Coast of the United States is one of the few regions in North America where precipitation exhibited a high proportion of variance at decadal time scales (10 to 20 years) during the last century. We use a network of tree ring-width records to estimate the behavior of the observed decadal pattern in regional winter precipitation during the last three and a half centuries. The pattern was most vigorous during the mid and late 20th century. Between A.D. 1650 and 1930, proxy estimates show a limited number of events separated by longer intervals of relatively low variance. The multicentennial perspective offered by tree rings indicates the energetic decadal pattern in winter precipitation is a relatively recent feature. Until a physical mechanism can be identified that explains the presence of this decadal rhythm, as well as its inconsistency during the period of record, we cannot rule out the possibility that this behavior may cease as abruptly as it began.

Dr. Nicola Scafetta: "there is little hope that...climate models correctly interpret climate change"

Dr. Nicola Scafetta, author of several peer-reviewed papers explaining the effects of solar activity on Earth's climate, comments on the paper the warmist media is referencing to attempt to dismiss the significance of the Sun on climate. Dr. Scafetta notes that the computer models of this paper and indeed most of climate science are programmed on the basis of false premises, stating, "there is little hope that the traditional climate models correctly interpret climate change and nothing concerning the real climate can be inferred from them because from a false premise everything can be concluded."





Nicola Scafetta
Dr. Curry has referenced a work by Feulner G., Rahmstorf S. (2010), that uses a traditional climate model to evaluate the effect of the sun on the climate in the eventuality that a new longer solar minimum would occur. The conclusion is that the Sun would do little in any case.
The problem is whether the traditional climate model is correctly interpreting climate change. The only way to do that is to evaluate whether the climate model properly reconstructs the solar signature observed in the climate.
As I have extensively shown in my papers, and as even proponents of AGW have shown (see for example Crowley, Science 2000), the traditional climate models produce a signature quite similar to the hockey stick graph by Mann, which not only disagrees with history but has also been seriously called into question by several studies.
Moreover, the traditional climate models also fail to properly reconstruct the correct amplitudes of the climate oscillations that have clear solar/astronomical signature.
Given the above, there is little hope that the traditional climate models correctly interpret climate change and nothing concerning the real climate can be inferred from them because from a false premise everything can be concluded.
In fact, the traditional climate models do not model several mechanisms that may contribute to a significant amplification of the solar impact on climate beginning from a cloud modulation from the cosmic rays which is solar induced.
Because of the gaps in the current physics of climate change, the only way to correctly interpret climate is by phenomenological modeling that points to the direct simulation of the temperature patterns, as I have proposed.
Once this is done, it is found that the solar impact on climate is severely underestimated by the traditional models by a large factor, while the anthropogenic component has been overestimated by at least 2-3 times. That is, while the IPCC claims with the traditional models (which do not reconstruct the climate cycles) that more than 90% of the warming since 1850 is anthropogenic, the reality is very likely that no more than 30% of the warming is anthropogenic, and this anthropogenic warming may not be GHG [greenhouse gas] induced because it may also be UHI [urban heat island] induced, at least in part.
Thus, if the Sun enters a new prolonged period of minima, it is very unlikely that the global temperature will go up as predicted by the traditional climate models. It will go down as predicted by the models I have proposed in my papers (see my web page). For example, “N. Scafetta, Empirical evidence for a celestial origin of the climate oscillations and its implications, Journal of Atmospheric and Solar-Terrestrial Physics 72, 951–970 (2010),” and “Climate Change and Its Causes, A Discussion About Some Key Issues.”
(Also there is a new paper under press on these issues)

Wednesday, June 15, 2011

Climate Scientist Pielke Sr.: The oceans have not warmed for 7 1/2 years, defying James Hansen's predictions

2011 Update Of The Comparison Of Upper Ocean Heat Content Changes With The GISS Model Predictions

...
Since at least 2003, there has been no significant heating of the upper ocean.



“I do agree with you that several years of zero or little radiative imbalance poses some very difficult questions for the modeling community. But I do not think it is grounds for outright rejection of all model results.”
Joules resulting from a positive radiative imbalance must continue to be accumulated in order for global warming to occur. In the last 7 1/2 years there has been an absence of this heating. An important research question is how many more years of this lack of agreement with the GISS model (and other model) predictions must occur before there is wide recognition that the IPCC models have failed as skillful predictions of the effect of the radiative forcing of anthropogenic inputs of greenhouse gases and aerosols.
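As a back-of-envelope check of how a radiative imbalance translates into accumulated Joules (the surface area and year length below are standard round numbers, and the ~0.4 W/m² figure is an assumed value inferred for illustration, not a number from Pielke's post):

```python
# Convert a constant global radiative imbalance (W/m^2) into Joules per year.
EARTH_SURFACE_M2 = 5.1e14    # approximate total surface area of the Earth
SECONDS_PER_YEAR = 3.156e7

def joules_per_year(imbalance_w_m2):
    """Heat accumulated globally in one year at a constant radiative imbalance."""
    return imbalance_w_m2 * EARTH_SURFACE_M2 * SECONDS_PER_YEAR

# An imbalance of roughly 0.42 W/m^2 yields ~0.67e22 J/yr, matching the
# per-year increment in the GISS-model prediction quoted below.
print(joules_per_year(0.42))
```

The point of the conversion is that a persistent imbalance of even a few tenths of a W/m² must show up as ocean heat accumulation of this magnitude; its absence is what Pielke is highlighting.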
...


OBSERVED BEST ESTIMATE OF ACCUMULATION OF JOULES:

2003 ~0 Joules
2004 ~0 Joules
2005 ~0 Joules
2006 ~0 Joules
2007 ~0 Joules
2008 ~0 Joules
2009 ~0 Joules
2010 ~0 Joules
2011 ~0 Joules through May 2011

[JAMES] HANSEN PREDICTION OF THE ACCUMULATION OF JOULES:

2003 ~0.67 × 10^22 Joules
2004 ~1.34 × 10^22 Joules
2005 ~2.01 × 10^22 Joules
2006 ~2.68 × 10^22 Joules
2007 ~3.35 × 10^22 Joules
2008 ~4.02 × 10^22 Joules
2009 ~4.69 × 10^22 Joules
2010 ~5.36 × 10^22 Joules
2011 ~6.03 × 10^22 Joules
2012 ~6.70 × 10^22 Joules



[Hansen's prediction through 2012 is only off by ~6.7 × 10^22 Joules — 67,000,000,000,000,000,000,000 Joules]
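The running gap between the two series above can be tallied in a few lines. The 0.67 × 10^22 J per-year increment is taken from the predicted table; treating the observations as exactly zero follows the post's own rounding.

```python
# Tally predicted vs. observed heat accumulation from the tables above.
INCREMENT_J = 0.67e22    # per-year increment in the quoted GISS prediction

predicted = {year: INCREMENT_J * (year - 2002) for year in range(2003, 2012)}
observed = {year: 0.0 for year in range(2003, 2012)}   # ~0 J each year per the post

gap_2011 = predicted[2011] - observed[2011]
print(f"Predicted minus observed through 2011: {gap_2011:.2e} J")
```

By 2011 the predicted cumulative total reaches ~6.03 × 10^22 J, so with observed accumulation near zero, the gap is the entire predicted amount.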