Saturday, April 30, 2011

WSJ: Ethanol policies assure food prices will stay high for a long time

It's Getting Harder to Bring Home the Bacon 

WSJ.com 4/30/11



CEO of the world's largest pork producer explains why food prices are rising and why they are likely to stay high for a long time 





Excerpt: Some "60 to 70% of the cost of raising a hog is tied up in the grains," Mr. Pope explains. "The major ingredient is corn, and the secondary ingredient is soybean meal." Over the last several years, "the cost of corn has gone from a base of $2.40 a bushel to today at $7.40 a bushel, nearly triple what it was just a few years ago." Which means every product that uses corn has risen, too—including everything from "cereal to soft drinks" and more.
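A back-of-envelope sketch shows why tripled corn roughly doubles the cost of raising a hog. The 65% grain share below is simply the midpoint of Pope's "60 to 70%" range, an assumption for illustration only:

```python
# Illustrative arithmetic only, using the figures quoted in the article.
# The 65% grain-cost share is the midpoint of Pope's "60 to 70%" range.

def hog_cost_multiplier(grain_share, corn_price_old, corn_price_new):
    """Rough multiplier on the cost of raising a hog if the grain-linked
    share of cost scales with the corn price and the rest stays flat."""
    corn_ratio = corn_price_new / corn_price_old
    return (1 - grain_share) + grain_share * corn_ratio

multiplier = hog_cost_multiplier(0.65, 2.40, 7.40)
print(round(multiplier, 2))  # roughly 2.35: hog production costs more than double
```

Even before passing anything along to retail, the producer's cost base has more than doubled, which is consistent with Pope's warning that food prices will stay high.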



What triggered the upswing? In part: ethanol. President George W. Bush "came forward with—what do you call?—the edict that we were going to mandate 36 billion gallons of alternative fuels" by 2022, of which corn-based ethanol is "a substantial part." Companies that blend ethanol into fuel get a $5 billion annual tax credit, and there's a tariff to keep foreign producers out of the U.S. market. Now 40% of the corn crop is "directed to ethanol, which equals the amount that's going into livestock food," Mr. Pope calculates.



Related: Ethanol on the run - Ethanol is "fiscally indefensible" and "environmentally unwise"

Survival of the Fattest 

EPA: Ethanol harms air quality but we’re mandating it anyway 

Other posts on ethanol 

Friday, April 29, 2011

New material posted on the NIPCC website

Extreme Precipitation Events in China’s Zhujiang River Basin (26 Apr 2011)

Over the period 1961-2007, when climate alarmists claim the earth warmed at a rate and to a level unprecedented over the past millennium or more, the IPCC's precipitation projections for this part of the world have still not come to pass ... Read More


Improving the Quantification of Oceanic DMS and DMSP (26 Apr 2011)



Obtaining accurate budget measurements “is the first step toward gaining a better understanding of key issues related to the DMS ocean-air interaction and the effect of phytoplankton DMS production on climate change” ... Read More




Rapid Adaptation to Potential Effects of Climatic Change Via Natural Selection (26 Apr 2011)



One can learn a lot from a mosquito fish ... Read More




B. nana Plants in the Arctic Tundra (26 Apr 2011)



Results of a new study indicate that “warming profoundly alters nutrient cycling in tundra, and may facilitate the expansion of B. nana through the formation of mycorrhizal networks of larger size.” ... Read More


A 265-Year Reconstruction of Lake Erie Water Level (26 Apr 2011)

The highest lake levels in the reconstruction are found over the past few decades ... Read More





Using Statistical Models to Understand Earth’s Climate: The Intertropical Convergence Zone (26 Apr 2011)



When computer models are mentioned in the context of studying weather and climate, most people think of General Circulation Models being used to predict weather or project climate into the future. Modeling can also be used to examine the lifecycle of various phenomena in our present climate. In the study of Bain et al. (2011), a statistical model is used in order to identify an important feature in the Earth’s atmosphere, a phenomenon that is thought to be relatively well-known ... Read More




The Climatic Impacts of Precipitating Ice and Snow (27 Apr 2011)



Exclusion of precipitating ice from the climate models used in AR4 “can result in underestimates of the reflective shortwave flux at the top of the atmosphere (TOA) and overestimates of the down-welling surface shortwave and emitted TOA longwave flux,” and the magnitude of these potential errors “is on the order of the radiative heating changes associated with a doubling of atmospheric carbon dioxide” ... Read More





Elevated CO2 Mitigates Negative Drought Effects on Barley Plants (27 Apr 2011)



Leaf water potential in plants subjected to drought, but grown at elevated CO2, was less negative than in their ambient CO2 grown counterparts ... Read More




Greenhouse Production of Cucumbers (27 Apr 2011)



Total season-long yield of a cucumber crop was increased by 19% by an extra 100 ppm of CO2 supplied to it during daylight hours, while overall water use efficiency of the CO2-enriched plants based on the amount of water supplied to them was about 40% higher ... Read More
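A quick ratio calculation (ours, not the study's) shows what those two numbers imply about the water actually applied, taking water use efficiency to mean yield per unit of water supplied:

```python
# Checking what the two quoted numbers imply about water applied
# (simple ratio arithmetic; WUE here means yield per unit of water supplied).

def implied_water_ratio(yield_gain, wue_gain):
    """If yield rose by `yield_gain` and water-use efficiency by `wue_gain`,
    the water supplied changed by the ratio of the two."""
    return (1 + yield_gain) / (1 + wue_gain)

ratio = implied_water_ratio(0.19, 0.40)
print(round(ratio, 2))  # ~0.85: the CO2-enriched crop needed ~15% less water
```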




Two-and-a-Half Millennia of European Climate Variability and Societal Responses (27 Apr 2011)



A new analysis of the subject yields some interesting observations; but how these observations are interpreted is the most crucial thing of all ... Read More


Effects of Elevated CO2 on Crop Water Relations (27 Apr 2011)

Real-world crops in Europe have been shown to use water more efficiently in high-CO2 air, thus making the liquid treasure last longer and possibly countermanding the current need to irrigate in many areas ... Read More



Changes in Hot Days and Heat Waves in China: 1961-2007 (27 Apr 2011)

The frequency of HDs and HWs in China has large spatial as well as temporal variability, due possibly to changes in regional atmospheric flow patterns and also to changes in local weather patterns such as cloud cover and rain. Future changes in HDs and HWs in China and elsewhere will most certainly depend upon many local and regional features (cloud cover, rain/no rain) and atmospheric flow patterns and NOT on ‘human-activity induced’ warming alone ... Read More

Tuesday, April 26, 2011

New paper shows how natural ocean oscillations control climate

According to climate scientist Dr. Roger Pielke Sr., "A very important new paper has been accepted for publication in Climate Dynamics," titled Atlantic Multidecadal Oscillation And Northern Hemisphere’s Climate Variability. The paper shows how the climate of the Northern Hemisphere can be explained by a combination of the natural ocean cycles called the Atlantic Multidecadal Oscillation (AMO) and the Pacific Decadal Oscillation (PDO), without incorporating greenhouse gases. The graph below, from a poster associated with the paper, shows how the Northern Hemisphere (NH) surface temperature "can be nearly perfectly represented as a weighted sum of the AMO and PDO" natural ocean oscillations. IPCC models do not incorporate ocean oscillations and are purposely programmed to instead attempt to "prove" CO2 controls climate rather than natural factors such as ocean oscillations and solar variability.
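The paper's "weighted sum" is just an ordinary least-squares fit of the hemispheric temperature onto the two ocean indices. A minimal sketch of the idea, using synthetic stand-in series rather than the actual AMO/PDO data:

```python
import numpy as np

# Sketch of the paper's idea: fit NH temperature as a weighted sum of the
# AMO and PDO indices via ordinary least squares. The series below are
# synthetic stand-ins, NOT the actual indices used in the paper.
rng = np.random.default_rng(0)
n = 120  # e.g. 120 years of annual values
amo = np.sin(np.linspace(0, 4 * np.pi, n))            # fake multidecadal oscillation
pdo = np.cos(np.linspace(0, 8 * np.pi, n))            # fake shorter oscillation
nht = 0.6 * amo + 0.3 * pdo + rng.normal(0, 0.05, n)  # fake NH temperature

# Design matrix with an intercept; solve for the weights.
X = np.column_stack([np.ones(n), amo, pdo])
coef, *_ = np.linalg.lstsq(X, nht, rcond=None)
fitted = X @ coef
r2 = 1 - np.sum((nht - fitted) ** 2) / np.sum((nht - nht.mean()) ** 2)
print(coef[1:], r2)  # weights near the true (0.6, 0.3), R^2 close to 1
```

When the fit is this good on the real series, as the poster shows, little residual variance is left over for other forcings to explain, which is the paper's central point.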




NHT (blue line) is Northern Hemisphere Temperature and tracks "nearly perfectly" a weighted sum of the AMO and PDO ocean oscillations (red dashed line)



Poster associated with the paper, click for pdf file
Related: more on the natural ~ 60 year climate cycle

A very simple climate model in the post: Climate Modeling: Ocean Oscillations + Solar Activity R²=.96 also tracks global temperature nearly perfectly

Monday, April 25, 2011

Not So Fast: State of Washington Proposes $100 Annual Fee for Gas Tax Dodging EV Owners

Washington is looking to recoup lost revenue from EV drivers






Nissan Leaf
DailyTech 4/25/11: Owners of electric vehicles like the Nissan Leaf (100-mile driving range) and the Tesla Roadster (211-mile driving range) have the advantage of traveling on America's roads without having to spend a penny on gasoline. And even though the Chevrolet Volt uses a gasoline engine when its battery pack is exhausted, some drivers have managed to average 1,000 miles between gas stops.



The State of Washington, however, isn't too keen on EV drivers skirting the state's gas tax, which helps to maintain the roads that EV drivers travel on every day. According to the Associated Press, Washington has a $5 billion deficit, and hitting the pockets of EV owners is just one way to help close the gap.



Washington's gas excise tax is one of the highest in the nation at 49.4 cents per gallon [PDF] -- 31 cents of the total is from the state, while the federal tax is 18.4 cents. Assuming that the average driver covers about 12,000 miles per year, a Nissan Leaf driver (EPA rated 99 mpg) would only be skipping out on $38 of the state's portion of the gasoline excise tax. For a Chevrolet Volt driver (EPA rated 93 mpg on battery power), the tax revenue lost by the state would amount to $40.
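The article's figures check out on the back of an envelope (state portion only, using the 31-cents-per-gallon state share quoted above):

```python
# Back-of-envelope check of the article's figures (state portion only).
# 0.31 is the state share of the gas tax quoted in the article, in $/gallon.

def state_tax_foregone(miles_per_year, mpg_equiv, state_tax_per_gal=0.31):
    """Annual state gas tax an EV driver avoids, given an mpg-equivalent rating."""
    gallons = miles_per_year / mpg_equiv
    return gallons * state_tax_per_gal

leaf = state_tax_foregone(12_000, 99)   # Nissan Leaf, EPA rated 99 mpg
volt = state_tax_foregone(12_000, 93)   # Chevy Volt, EPA rated 93 mpg on battery
print(round(leaf), round(volt))  # 38 and 40, matching the article
```

Either way, the foregone tax is well under half of the proposed $100 annual fee.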



Washington's proposed EV fee, however, would amount to $100 per year.



"Electric vehicles put just as much wear and tear on our roads as gas vehicles,” explained the bill's sponsor, Democratic state Sen. Mary Margaret Haugen. "This simply ensures that they contribute their fair share to the upkeep of our roads."



"So the question is how do you account for those trends and begin to capture revenue that reflects the actual usage of the road?" said Republican state senator Dan Swecker. "Our state doesn't change very fast. But we thought the $100 fee was a place to start, so let's start there."



Not surprisingly, EV owners aren't exactly thrilled with this proposed legislation. "The Legislature saw electric vehicles are coming and thought, why not just put a fee on them," quipped Dean West, a Nissan Leaf driver.

Sunday, April 24, 2011

"We must end our addiction to economic growth"

but "where are the jobs?"





From the recent Energy Action Coalition’s Power Shift 2011 conference held in Washington, DC, a video of brainwashed 'climate justice' youth activists spouting clueless Hansen-esque nonsense. 





Remember the scary "single most-important finding in climate science last year": "a sharp decline" in phytoplankton? Never mind

The MSM was awash with alarmist reports last year that global warming had caused "a sharp drop" in ocean phytoplankton since the 1950s, with one newspaper stating that "the single most-important finding in climate science last year was a 40 percent decline in the ocean's phytoplankton caused by global warming." A new paper published in Nature finds that multiple data sets instead show an increase in ocean phytoplankton over the past eight decades. Don't hold your breath for any retractions of the alarmist claims in the MSM, nor for any stories reporting the good news.



Is there a decline in marine phytoplankton?



Nature 472, E6–E7 (14 April 2011)

Arising from: D. G. Boyce, M. R. Lewis & B. Worm, Nature 466, 591–596 (2010)

Phytoplankton account for approximately 50% of global primary production, form the trophic base of nearly all marine ecosystems, are fundamental in trophic energy transfer and have key roles in climate regulation, carbon sequestration and oxygen production. Boyce et al. compiled a chlorophyll index by combining in situ chlorophyll and Secchi disk depth measurements that spanned a more than 100-year time period and showed a decrease in marine phytoplankton biomass of approximately 1% of the global median per year over the past century. Eight decades of data on phytoplankton biomass collected in the North Atlantic by the Continuous Plankton Recorder (CPR) survey, however, show an increase in an index of chlorophyll (Phytoplankton Colour Index) in both the Northeast and Northwest Atlantic basins (Fig. 1), and other long-term time series, including the Hawaii Ocean Time-series (HOT), the Bermuda Atlantic Time Series (BATS) and the California Cooperative Oceanic Fisheries Investigations (CalCOFI), also indicate increased phytoplankton biomass over the last 20–50 years. These findings, which were not discussed by Boyce et al., are not in accordance with their conclusions and illustrate the importance of using consistent observations when estimating long-term trends.

Saturday, April 23, 2011

NASA keeps mum on data that could disprove anthropogenic global warming theory

The theory of anthropogenic global warming is based upon the notion that increases in the minor greenhouse gas CO2 result in increases of the major greenhouse gas water vapor, thereby supposedly increasing global warming to alarming levels of 2–5°C per doubling of CO2 levels. Without this assumed and unproven positive feedback from water vapor, there is no basis for alarm. While the IPCC confidently stated in their 2007 report,

“The average atmospheric water vapour content has increased since at least the 1980s over land and ocean as well as in the upper troposphere. The increase is broadly consistent with the extra water vapour that warmer air can hold.”
a 2005 paper based on the NASA water vapor data set [called NVAP] showed that water vapor levels had instead declined (with 95% confidence) between 1988-1999. The paper states,

“By examining the 12 year record [1988-1999], a decrease of TPW [total precipitable water vapor] at a rate of -0.29 mm / decade is observed. This relationship is significant at the 95 % but not at the 99 % level [since when do climate scientists insist on a 99% confidence level?]. A downward trend would be intriguing since there should be a positive slope if a global warming signal was present."
If the trend in water vapor is negative instead of positive, there is no positive feedback from water vapor and the theory of catastrophic anthropogenic global warming would be falsified. Climate scientist Dr. Roger Pielke Sr. notes that the NASA  findings "conflict with the conclusion of the 2007 IPCC report." NASA has not released an update of this extremely important NVAP water vapor data for the past 10 years and does not plan to release the data from 2001 through 2010 or the "reanalyzed" 1988-2001 data until "sometime in 2012 or 2013."  However, an online NASA NVAP annual report dated 3/15/11 shows the telling "PRELIMINARY RESULT, NOT FOR DISTRIBUTION" of a continued decline in atmospheric water vapor:
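The trend test quoted from the 2005 paper is a standard least-squares regression of total precipitable water against time, with a t-test on the slope. A minimal sketch of that computation, on synthetic data chosen only to mimic the quoted -0.29 mm/decade figure (these are NOT the NVAP values):

```python
import numpy as np
from scipy import stats

# Sketch of the trend test described in the quoted paper: OLS slope of total
# precipitable water (TPW) vs. time, with a significance test on the slope.
# The data below are synthetic stand-ins, NOT the actual NVAP record.
rng = np.random.default_rng(42)
years = np.arange(1988, 2000)  # the 12-year window examined in the paper
tpw = 24.0 - 0.029 * (years - years[0]) + rng.normal(0, 0.05, years.size)

slope, intercept, r, p_value, stderr = stats.linregress(years, tpw)
print(round(slope * 10, 2), p_value)  # slope in mm/decade; compare p to 0.05 and 0.01
```

Whether the p-value clears 0.05 but not 0.01, as in the paper, depends entirely on the scatter in the real data; the point is only that the sign of the slope is what matters for the water-vapor feedback argument.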




Lower series is over land & ocean, upper series is ocean only
While most NASA data is made available on the internet within a few months of collection and analysis, for some reason NASA NVAP water vapor data -which could either support or undermine the theory of catastrophic anthropogenic global warming- is not going to be officially released for up to 12 years since collection. Is it too much to ask that NASA finishes its analysis and releases this data before the world spends trillions on a potentially non-existent problem?






From a 2003 NASA NVAP poster showing water vapor anomalies over ocean through 2000




From the same 2003 poster
Global precipitation (an indicator of water vapor) is also not increasing as predicted by AGW theory


Related: Paper: Water vapor feedback is negative, not positive as assumed by IPCC alarmists



http://hockeyschtick.blogspot.com/2011/02/teleconference-will-attempt-to-explain.html

Why I Still Support Nuclear Power, Even After Fukushima

By WILLIAM TUCKER  WSJ.com  4/23/11



It's not easy being a supporter of nuclear energy these days. The events in Japan have confirmed many of the critics' worst predictions. We are way past Three Mile Island. It is not quite Chernobyl, but the possibilities of widespread radioactive contamination remain real.



Still, other energy technologies are not without risk. In 1944 a natural gas explosion in Cleveland leveled an entire neighborhood and killed 130 people. Yet we still pipe gas right into our homes. Coal mining killed 100,000 workers in the 20th century, and still kills an average of six a day in China, but we haven't given up coal. A hydroelectric dam collapsed in Japan during the earthquake, wiping away 1,800 homes and killing an undetermined number of people, yet nobody has paid much attention.



But talk about the risks of other energy sources really doesn't cut to the issue. The obvious question people are asking is, "Why do we have to mess with this nuclear stuff in the first place? Why do we have to risk these horrible accidents when other better technologies are available?" The answer is that there are no better alternatives available. If we are going to maintain our standard of living—or anything approximating it—without overwhelming the earth with pollution, we are going to have to master nuclear technology.



Consider: Uranium fuel rods sit in a reactor core for five years. During that time six ounces of their weight—six ounces!—will be completely transformed into energy. But the energy produced by that transformation will be enough to power a city the size of San Francisco for five years.



A coal plant must be fed by a 100-car freight train arriving every 30 hours. A nuclear reactor is refueled by a fleet of six trucks arriving once every two years. There are 283 coal mines in West Virginia and 449 in Kentucky. There are only 45 uranium mines in the entire world. Russia is offering to supply uranium to most of the developing world with the output from one mine. That is why the environmental impact of nuclear is infinitely smaller.



What about natural gas? Huge reservoirs of shale gas have been unlocked by hydrofracking. But "fracking" has been able to proceed so rapidly only because it has been exempted from federal regulations governing air and water pollution. Now that concern has arisen about damaged aquifers, natural gas production may slow as well.



So what about hydro, wind and solar? These energy sources will not bring about utopia. The only reason we don't object to the environmental effects of these renewables is because we haven't yet encountered them. 





The amount of energy that can be derived from harnessing wind or water is about 15 orders of magnitude less than what can be derived from uranium. Thus a hydroelectric dam such as Hoover must back up a 250-square-mile reservoir (Lake Mead) in order to generate the same electricity produced by a reactor on one square mile. 





Windmills require even more space, since air is less dense than water. Replacing just one of the two 1,000-megawatt reactors at Indian Point in Westchester County, N.Y., would require lining the Hudson River from New York to Albany with 45-story windmills one-quarter mile apart—and then they would generate electricity only about one-third of the time, when the wind is blowing. 





Solar collectors must be built to the same scale. It would take 20 square miles of highly polished mirrors or photovoltaic cells to equal the output of one nuclear reactor—and then only when the sun shines. Such facilities may one day provide supplementary power or peaking output during hot summer afternoons, but they will never be able to supply the uninterrupted flow of electricity required by an industrial society.





It will be impossible to meet the consumer demands of a contemporary society without a reliable source of energy like nuclear. Other countries have already acknowledged this. There are 65 reactors under construction around the world (far safer and more advanced than the 30-year-old technology at Fukushima Daiichi), but none in the U.S.



The Russians' sale of uranium to the world comes with an offer to take back the "nuclear waste" and reprocess it into more fuel, at a profit. The Chinese have commercialized their first Integral Fast Breeder, a reactor that can burn any kind of "waste" and promises unlimited quantities of cheap energy.



We have become the world's predominant industrial power because our forebears were willing to take the risks and make the sacrifices necessary to develop new technologies—the steam engine, coal mining, electricity, automobiles, airplanes, electronics, space travel. If we are not willing to take this next set of risks, others will. Then the torch will be passed to another generation that is not our own and our children and grandchildren will live with the consequences.



Mr. Tucker is author of "Terrestrial Energy: How Nuclear Power Will Lead the Green Revolution and End America's Energy Odyssey" (Bartleby Press, 2010).

Friday, April 22, 2011

Unsettled Science: Effects of clouds on climate still unknown

A press release today illustrates how climate science continues to struggle to understand the net effect of clouds upon global climate. IPCC climate models assume a slight net warming effect from clouds [personal communication with an atmospheric scientist/cloud specialist at JPL], yet the press release today states, "most clouds have a net cooling effect." Furthermore, the poorly-understood natural fluctuations in cloud cover could alone account for global warming or global cooling, as illustrated by Dr. Roy Spencer in his new book,

"The most obvious way for warming to be caused naturally is for small, natural fluctuations in the circulation patterns of the atmosphere and ocean to result in a 1% or 2% decrease in global cloud cover. Clouds are the Earth’s sunshade, and if cloud cover changes for any reason, you have global warming — or global cooling."
Until cloud effects are much better understood (as well as a host of other factors such as ocean oscillations), computer climate models will remain computer fantasy games.



Effect of Cloud-Scattered Sunlight on Earth's Energy Balance Depends on Wavelength of Light





Press Release 4/22/2011 2:25 PM EDT  Source: Pacific Northwest National Laboratory



Accounting for wavelength effects will likely improve climate models



RICHLAND, Wash. -- Atmospheric scientists trying to pin down how clouds curb the amount of sunlight available to warm the earth have found that it depends on the wavelength of sunlight being measured. This unexpected result will help researchers improve how they portray clouds in climate models.



Additionally, the researchers found that sunlight scattered by clouds — the reason why beachgoers can get sunburned on overcast days — is an important component of cloud contributions to the earth's energy balance. Capturing such contributions will increase the accuracy of climate models, the team from the Department of Energy's Pacific Northwest National Laboratory reported in Geophysical Research Letters earlier this month.



"The amount of the sun's energy that reaches the earth's surface is the main driver of the earth's temperature. Clouds are one of the least understood aspects of climate change. They can block the sun, but light can also bounce off one cloud into another cloud's shadow and increase the solar energy hitting earth," said PNNL atmospheric scientist Evgueni Kassianov.



Clouds both cool down and warm up the earth's surface. They cool the earth by reflecting some sunlight up into outer space, and they warm it by bouncing some sunlight down to the surface. Overall, most clouds have a net cooling effect, but atmospheric scientists need to accurately measure when they cool and warm to produce better climate models that incorporate clouds faithfully.



But it's a hard number to get. Fair-weather clouds are big puffy white objects that bounce a lot of light around. They can make the sky around them look brighter when they're there, but they float about and reform constantly. Cloud droplets and aerosol particles in the sky — tiny bits of dirt and water in the air that cause haziness — scatter light in three dimensions, even into cloud shadows.



To determine the net cloud effect, researchers need two numbers. First they need to measure the total amount of sunlight in a cloudy sky. Then they need to determine how bright that sky would be without the clouds, imagining that same sky to be blue and cloudless, when aerosols are in charge of a sky's brightness. The difference between those numbers is the net cloud effect.
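The bookkeeping described above reduces to a single subtraction. A minimal sketch with made-up irradiance values (the sign convention, positive for surface brightening, is our assumption for illustration):

```python
# Minimal sketch of the two-number bookkeeping described above:
# net cloud effect = measured cloudy-sky irradiance minus the clear-sky
# irradiance calculated from aerosol properties. Values are made up.

def net_cloud_effect(measured_cloudy, calculated_clear):
    """Positive -> clouds brighten the surface (warming);
    negative -> clouds dim it (cooling). Units: W/m^2."""
    return measured_cloudy - calculated_clear

# A shaded moment: clouds block the direct beam -> cooling.
print(net_cloud_effect(420.0, 600.0))  # -180.0
# A cloud-gap moment: light scattered off cloud sides adds to the direct
# beam -> brightening, like the enhancement reported later in the release.
print(net_cloud_effect(780.0, 600.0))  # 180.0
```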



Researchers have traditionally estimated the net cloud effect by measuring a broad spectrum of sunlight that makes it to the earth's surface, from ultraviolet to infrared. But clouds are white — that's because the large water droplets within them scatter light of all colors almost equally in the visible spectrum, the part of the electromagnetic spectrum that includes the colors of the rainbow.



On the other hand, aerosols — both within clouds and in the open sky — bounce different-colored light unequally. Broadband measurements that fail to distinguish color differences might be covering up important details, the researchers thought.



Instead of taking one broadband measurement that covers everything from ultraviolet to infrared, Kassianov and crew wanted to determine how individual wavelengths contribute to the net cloud effect. To do so, the team used an instrument that can measure brightness at four different wavelengths of color — violet, green, orange, red — and two of infrared.



In addition, this instrument, a spectral radiometer at DOE's Atmospheric Radiation Measurement Climate Research Facility located on the southern Great Plains in Oklahoma, allowed the team to calculate what the brightness would be if the day sported a cloudless, blue sky. The spectral measurements taken by the radiometer can be converted into the amount and properties of aerosols. Then aerosol properties can be used to calculate clear blue sky brightness.



Clouds Gone Wild



Comparing measured values for cloudy sky to the calculated values for clear sky, the researchers found that, on average, puffy fair-weather clouds cool down the earth's surface by several percent on a summer day. Although clouds cool overall, two components that the researchers looked at — from direct and scattered sunlight — had opposite effects.



The direct component accounts for the shade provided by clouds and cools the earth. The second component accounts for the sunlight scattered between and under clouds, which makes the sky brighter, warming the earth.



"The sunlight scattered by clouds can heat the surface," said Kassianov. "We all know that we can still get sunburned on cloudy days. This explains why."



In the Oklahoma summer, the scattered-light effect measured by the researchers could be quite large. For example, if a cloud passed over the instrument, the measured cloudy sky brightness exceeded calculated clear sky value by up to 30 percent. Kassianov attributes that large difference to scattered sunlight being "caught on tape" by the radiometer.



"Sunlight scattered by three-dimensional, irregular clouds is responsible for the observed large difference. The one-dimensional cloud simulations currently used in large-scale climate models don't capture this diffuse light," said Kassianov.



Aerosols' Day in the Sky



The team also found that the effect changed depending on the measured visible-spectrum wavelength, and whether the light was direct or scattered.



With direct light, the cooling caused by clouds was weakest on the violet end of the spectrum and strongest at infrared. With scattered light, warming caused by clouds was also weakest at violet and the strongest at infrared. Overall, the least cooling and warming occurred at violet, and the most cooling and warming occurred at infrared.



Because large droplets in clouds scatter sunlight almost uniformly across the spectrum, the clouds themselves can't be the reason why different wavelengths contribute differently to the net cloud effect. Compared to cloud droplets, aerosols are more than 100 times smaller and scatter wavelengths differently. These results suggest that aerosols — which not only cause haziness but contribute to cloud formation as well — are responsible for the wavelength differences, something researchers need to be aware of as they study clouds in the sky.



"If you want to study how aerosols and clouds interact," said Kassianov, "you need to look in the region of the spectrum where aerosol effects are significant. If you want to fish, you go where the fish are biting."



Reference: Kassianov E., Barnard J., Berg L.K., Long C.N., and C. Flynn, Shortwave Spectral Radiative Forcing of Cumulus Clouds from Surface Observations, Geophys Res Lett, April 2, 2011, DOI 10.1029/2010GL046282 (http://www.agu.org/pubs/crossref/2011/2010GL046282.shtml).

New material posted on the NIPCC website

The Importance of the Oceans and Topography in Climate Simulations (19 Apr 2011)

Wilson et al. (2010) use a coupled atmosphere-ocean general circulation model to examine the impact of mountains and the oceans in simulating the regular occurrence of our planet’s cyclonic storm systems. They find that in order to achieve more faithful model simulations of climate, a dynamic ocean and the proper representation of topography are crucial ... Read More


Growth Response of Radish to Super Atmospheric CO2 Enrichment (19 Apr 2011)

How high can the plant’s growth rate possibly rise? ... Read More


The Climatic History of the European North Atlantic Seaboard (19 Apr 2011)

“Since tree-limits in Scandinavia or elsewhere in the world have not reestablished at their Medieval levels, it is still possible that today’s climate, despite centennial net warming, is within its natural limits” ... Read More


Central Pacific El Niño Events (19 Apr 2011)

The authors of this study conclude that “we cannot exclude the possibility that an increasing of occurrence frequency of CP El Niño during recent decades in the observation could be a part of natural variability in the tropical climate system” ... Read More


A New-and-Improved 457-Year History of ENSO Variability (19 Apr 2011)

Contrary to the projections of many climate models that have been made over a period of many years, Braganza et al. were unable to discern any unusual behavior in ENSO activity during the transition from what was the coldest period of the current interglacial to just before what the world’s climate alarmists claim was the warmest period of the past two millennia, which finding raises further questions about the validity of the model projections ... Read More


Nitrous Oxide (N2O) Fluxes from Temperate Grasslands in a Warmer, Wetter and CO2-Enriched World (19 Apr 2011)

How do the three oft-predicted environmental changes impact natural emissions of the powerful greenhouse gas? ... Read More


Regional Climate Change: How Well Do the IPCC Models Really Perform? (20 Apr 2011)

It is well known that climate models have difficulty in not only projecting future climate, but in replicating the past climate. In addition to inadequate model physics or a lack of data, there is a naturally inherent uncertainty in models that fluid dynamicists generically call chaos. One way to overcome these problems is to run the models many times and generate a range of possibilities called scenarios. Such a strategy would be useful if models could replicate climate adequately. Anagnostopoulos et al. (2010) demonstrate using a statistical approach that dynamic modeling alone should not be used to project future climate ... Read More


Catastrophic Superstorms of the French Mediterranean Coast (20 Apr 2011)

The two periods of most frequent superstorm strikes in the Aigues-Mortes Gulf coincide with two of the coldest periods in Europe during the late Holocene ... Read More


Aquatic Herbivores in a CO2-Enriched World of the Future (20 Apr 2011)

Will they fare as well as they do today? ... Read More


The Impact of Warming on Fungal Epidemics in Lakes (20 Apr 2011)

The authors’ findings present “a scenario that runs counter to the general expectation of a ‘warmer hence sicker world’” ... Read More


A Twentieth-Century Rainfall History of India (20 Apr 2011)

Contrary to the implications of global climate models employed by the IPCC, the global warming of the past century has not led to any significant concomitant change in the mean annual rainfall of all of India or that of any of its four sub-regions ... Read More


How a Long-Term CO2-Induced Increase in Forest Productivity is Maintained on a Nitrogen-Impoverished Soil (20 Apr 2011)

The key to the phenomenon may reside in the type of fungi colonizing the trees’ roots ... Read More

Wednesday, April 20, 2011

New paper affirms Sun was particularly active in late 20th century

A paper published today in The Journal of Geophysical Research: Space Physics confirms prior studies showing that solar activity was particularly active in the late 20th century. Another recent and related paper by the same lead author Mike Lockwood states,

Analysis of cosmogenic isotope abundances in terrestrial reservoirs, after removal of complicating factors, such as the variability of the shielding afforded by the geomagnetic field, reveal the effect of the Sun in reducing the fluxes of galactic cosmic rays (GCRs) reaching the Earth. Because this solar shielding is known to vary with the strength and structure of the heliospheric magnetic field, both of which are modulated on both decadal and centennial timescales by solar activity, cosmogenic isotopes give us an unique insight into solar variability on millennial timescales. Such analyses all indicate that the Sun has been unusually active over recent decades (Solanki et al. 2004; Vonmoos et al. 2006; Muscheler et al. 2007; Steinhilber et al. 2008). Solanki et al. (2004) used the 14C isotope abundance found in tree trunk and concluded that the Sun has been more active recently than at any time in the previous 8000 years and that it was as active as in recent decades for only 10% of the past 11000 years...
The first part of the Lockwood et al. excerpt above refers to the cosmic-ray theory of Svensmark et al., which explains how small changes in solar activity can have amplified effects on climate via cloud formation. The new Lockwood et al. paper finds that solar activity during the Maunder minimum (corresponding to the Little Ice Age) was exceptionally low relative to prior estimates, and that solar activity then increased markedly, reaching the current grand solar maximum around 1986. There is an associated lag of approximately 10 years between solar activity and global temperature, noted here, which corresponds closely to the 1997-1998 peak in global temperature (along with the effects of the record 1997-1998 El Nino).



JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 116, A04109, 12 PP., 2011





Centennial changes in the heliospheric magnetic field and open solar flux: The consensus view from geomagnetic data and cosmogenic isotopes and its implications



M. Lockwood, Space Environment Physics Group, Department of Meteorology, University of Reading, Reading, UK, Space Science and Technology Department, Rutherford Appleton Laboratory, Chilton, UK



M. J. Owens, Space Environment Physics Group, Department of Meteorology, University of Reading, Reading, UK



Abstract: Svalgaard and Cliver (2010) recently reported a consensus between the various reconstructions of the heliospheric field over recent centuries. This is a significant development because, individually, each has uncertainties introduced by instrument calibration drifts, limited numbers of observatories, and the strength of the correlations employed. However, taken collectively, a consistent picture is emerging. We here show that this consensus extends to more data sets and methods than reported by Svalgaard and Cliver, including that used by Lockwood et al. (1999), when their algorithm is used to predict the heliospheric field rather than the open solar flux. One area where there is still some debate relates to the existence and meaning of a floor value to the heliospheric field. From cosmogenic isotope abundances, Steinhilber et al. (2010) have recently deduced that the near-Earth IMF at the end of the Maunder minimum was 1.80 ± 0.59 nT, which is considerably lower than the revised floor of 4 nT proposed by Svalgaard and Cliver. We here combine cosmogenic and geomagnetic reconstructions and modern observations (with allowance for the effect of solar wind speed and structure on the near-Earth data) to derive an estimate for the open solar flux of (0.48 ± 0.29) × 10^14 Wb at the end of the Maunder minimum. By way of comparison, the largest and smallest annual means recorded by instruments in space between 1965 and 2010 are 5.75 × 10^14 Wb and 1.37 × 10^14 Wb, respectively, set in 1982 and 2009, and the maximum of the 11 year running means was 4.38 × 10^14 Wb in 1986. Hence the average open solar flux during the Maunder minimum is found to have been 11% of its peak value during the recent grand solar maximum.
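The abstract's closing ratio is easy to verify with the paper's own numbers (a trivial arithmetic check; flux values are in units of 10^14 Wb, as quoted):

```python
# Checking the abstract's closing ratio: Maunder-minimum open solar flux
# versus the peak 11-year running mean during the recent grand maximum.
maunder_min_flux = 0.48      # end of Maunder minimum (cosmogenic estimate)
grand_max_flux = 4.38        # peak 11-year running mean, reached in 1986
ratio = maunder_min_flux / grand_max_flux
print(f"{ratio:.0%}")        # → 11%
```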



Related: The Ever Changing Sun

Gallup Poll: Fewer Americans, Europeans View Global Warming as a Threat

by Anita Pugliese and Julie Ray



WASHINGTON, D.C. -- Gallup surveys in 111 countries in 2010 find Americans and Europeans feeling substantially less threatened by climate change than they did a few years ago, while more Latin Americans and sub-Saharan Africans see themselves at risk.



The 42% of adults worldwide who see global warming as a threat to themselves and their families in 2010 hasn't budged in the last few years, but increases and declines evident in some regions reflect the divisions on climate change between the developed and developing world.



Majorities in developed countries that are key participants in the global climate debate continue to view global warming as a serious threat, but their concern is more subdued than it was in 2007-2008. In the U.S., a slim majority (53%) currently see it as a serious personal threat, down from 63% in previous years.



Concern about global warming has also declined across western, southern, and eastern Europe, and in several cases, even more precipitously than in the U.S. In France, for example, the percentage saying global warming is a serious threat fell from 75% in 2007-2008 to 59% in 2010. In the United Kingdom, ground zero for the climate data-fixing scandal known as Climategate in 2009, the percentage dropped from 69% to 57% in the same period.



World residents' declining concern about climate change may reflect increasing skepticism about global warming after Climategate and the lack of progress toward global climate policy. The drops also may reflect the poor economic times, during which Gallup research generally finds environmental issues become less important.



More Latin Americans, Sub-Saharan Africans See Danger



Latin Americans, who already were among the most aware of climate change and the most likely to view global warming as a personal threat, became even more aware and more concerned in 2010. Seventy-seven percent of Latin Americans claim to know at least something about climate change, and nearly as many see it as a personal threat (73%).



These relatively high figures among Latin Americans may be partly attributable to the bad rainy seasons and flooding that leaders in the region such as Venezuelan President Hugo Chavez have linked to global warming. Countries that were hit particularly hard by floods, such as Ecuador and Venezuela, saw residents' likelihood to view global warming as a threat surge in 2010.



In sub-Saharan Africa, where populations are likely to be vulnerable to the effects of climate change, awareness is still among the lowest in the world, but was up in 2010. Nearly half of the adult population in the region (46%) say they are aware of climate change, up from 38% in 2007-2008. Correspondingly, the percentage who perceive climate change as a serious threat increased slightly.



Implications



The feuding between rich and poor nations at climate talks in Bangkok in April demonstrates the obstacles that remain before the world can agree on a climate policy. Gallup's data show that fewer Americans and Europeans, whose nations are central players in these talks, feel threatened by global warming today than they did in recent years. However, majorities in many of these countries still see climate change as a serious threat, which means the issue remains personally important to them.



For full country results, see page 2.



Visit Real Clear World's Top 5s feature to learn more about the countries concerned about global warming.



For complete data sets or custom research from the more than 150 countries Gallup continually surveys, please contact SocialandEconomicAnalysis@gallup.com or call 202.715.3030.



Survey Methods



Results are based on face-to-face and telephone interviews conducted in 2010 with approximately 1,000 adults, aged 15 and older, in 111 countries. For results based on the total sample in each country, one can say with 95% confidence that the maximum margin of sampling error ranges from ±1.7 percentage points to ±5.7 percentage points. The margin of error reflects the influence of data weighting. In addition to sampling error, question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of public opinion polls.
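As a rough check on the stated error range: the textbook simple-random-sample margin at 95% confidence for n ≈ 1,000 is about ±3.1 points, and design effects from weighting plus differing sample sizes plausibly account for the ±1.7 to ±5.7 spread Gallup reports. A minimal sketch (the function name is illustrative):

```python
import math

# Worst-case (p = 0.5) margin of sampling error for a simple random sample
# at 95% confidence (z ~ 1.96). Weighting inflates this in practice.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(round(100 * margin_of_error(1000), 1))  # → 3.1 percentage points
```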



remainder at gallup.com

WSJ: "The climate-refugee prediction isn't the first global warming-related claim that has turned out to be laughable"

Climate Refugees, Not Found



REVIEW & OUTLOOK  APRIL 21, 2011 WSJ.com

Discredited by reality, the U.N.'s prophecies go missing.



In 2005, the U.N. Environment Program (UNEP) published a color-coded map under the headline "Fifty million climate refugees by 2010." The primary source for the prediction was a 2005 paper by environmental scientist Norman Myers.



Six years later, this flood of refugees is nowhere to be found, global average temperatures are about where they were when the prediction was made—and the U.N. has done a vanishing act of its own, wiping the inconvenient map from its servers.



The map, which can still be found elsewhere on the Web, disappeared from the program's site sometime after April 11, when Gavin Atkins asked on AsianCorrespondent.com: "What happened to the climate refugees?" It's now 2011 and, as Mr. Atkins points out, many of the locales that the map identified as likely sources of climate refugees are "not only not losing people, they are actually among the fastest growing regions in the world."



The program's spokesman tells us the map vanished because "it's not a UNEP prediction. . . . that graphic did not represent UNEP views and was an oversimplification of UNEP views." He added that the program would like to publish a clarification, now that journalists are "making hay of it," except that the staffers able to do so are "all on holiday for Easter."



The climate-refugee prediction isn't the first global warming-related claim that has turned out to be laughable, and everyone can make mistakes. More troubling is the impulse among some advocates of global warming alarmism to assert in the face of contrary evidence that they never said what they definitely said before the evidence went against them.





These columns have asked for some time how anyone can still manage to take the U.N.-led climate crowd seriously. Maybe the more pertinent question is whether the climateers have ever taken the public's intelligence seriously.

Farewell to Dr. Noor van Andel

According to the climategate.nl blog, Dr. Noor van Andel, a prominent Dutch skeptic of anthropogenic global warming, died yesterday. Dr. van Andel's papers arguing that climate changes are not caused by greenhouse gases are featured here, and a farewell tribute is here.

Volt-killer: Ford's 80 MPG new car






Ford Motor Co.
Ford Focus ECOnetic.
Ford Motor Co. says it will unveil a special high-fuel-mileage version of the Focus compact car next week at the Amsterdam Motor Show. While the car, called the Focus ECOnetic, is designed for the European market, it is bound to attract the attention of U.S. drivers, who increasingly consider fuel economy a priority.
The car maker says the new model combines many fuel-saving technologies that are expected to make it the most fuel-efficient compact car on the European market, including gasoline-, diesel- and hybrid-powered vehicles.
Indeed, if Ford’s estimates of about 80 miles per gallon are accurate, the car will use less fuel than many motorcycles and scooters.
Ford says it expects the ECOnetic’s 1.6-liter diesel engine to consume less than 3.5 liters of fuel per 100 kilometers, the standard measure of fuel economy in Europe. On my conversion charts that comes out to about 67 miles per gallon. Ford says it expects 80 mpg. Industry experts, including officials at Ford, say differences in the way cars are tested for fuel economy in the U.S. and Europe can make the results of conversion charts inaccurate.
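The conversion itself is simple arithmetic, and sketching it also suggests a likely source of the 67-versus-80 gap: 3.5 L/100 km works out to about 67 mpg in US gallons but about 80 mpg in the larger Imperial (UK) gallons Ford's European figure presumably uses (the helper function below is illustrative; the unit constants are standard):

```python
# Convert European fuel consumption (liters per 100 km) to miles per gallon.
# 1 mile = 1.609344 km; 1 US gallon = 3.785411784 L; 1 Imperial gallon = 4.54609 L.
KM_PER_MILE = 1.609344
L_PER_US_GAL = 3.785411784
L_PER_IMP_GAL = 4.54609

def l_per_100km_to_mpg(l_per_100km, liters_per_gallon):
    km_per_liter = 100.0 / l_per_100km
    miles_per_liter = km_per_liter / KM_PER_MILE
    return miles_per_liter * liters_per_gallon

print(round(l_per_100km_to_mpg(3.5, L_PER_US_GAL), 1))   # → 67.2 (US mpg)
print(round(l_per_100km_to_mpg(3.5, L_PER_IMP_GAL), 1))  # → 80.7 (Imperial mpg)
```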
Either way, though, the ECOnetic’s fuel economy would exceed that of hybrid cars and even many two-wheel vehicles sold in the U.S.
To boost fuel economy the car uses a long list of features that each increase efficiency by small increments. They range from an aerodynamic body, diesel engine with high-pressure fuel injection and special gearing, to tires and transmission oil designed to reduce rolling resistance and mechanical friction.
The Ford also has an automatic stop-start system that turns the engine off when the car is stopped at a traffic light or in other situations where extended idling would waste fuel. Regenerative charging uses braking energy to help charge the battery. A driver-information system called Eco Mode monitors driving styles and gives drivers advice on how they could save more fuel by driving more efficiently.
Ford says the car will arrive in dealer showrooms (in Europe, not U.S.) early next year. It will be available as a five-door hatchback or station wagon.



Plus it will likely sell for about half the cost of a Chevy Volt 

Saturday, April 16, 2011

When Scientists Confuse Cause and Effect

excerpt from Matt Ridley in the Wall Street Journal 4/16/11:



Even climate science has encountered cause-effect confusion. When in 1999 Antarctic ice cores revealed carbon-dioxide concentrations and temperature marching in lockstep over 400,000 years, many—including me— found this a convincing argument for attributing past climate change to carbon dioxide. (About 95% of carbon dioxide in the atmosphere is natural, coming from the exhalations of living things. In the past, carbon-dioxide levels rose as the earth warmed at the end of ice ages and fell as it cooled at the end of interglacial periods.)



Then four years later came clear evidence from finer-grained analysis of ice cores that temperature changes preceded carbon-dioxide changes by at least 800 years. Effects cannot precede their causes by eight centuries, so temperatures must drive carbon dioxide, chiefly by warming the sea and causing carbon dioxide dissolved in water to "out-gas" into the air.



Climate scientists fell back on a "feedback" hypothesis, arguing that an initial change, probably caused by variations in the earth's orbit that affect the warmth of the sun, was then amplified by changes in carbon-dioxide levels. But this made the attribution argument circular and left the reversal of the trend after a period of warming (when amplification should be at its strongest) still harder to explain. If carbon dioxide is still driving the temperature upward but it falls instead, then other factors must be stronger than expected.



Some climate scientists see cause-effect confusion at the heart of climate modeling. Roy Spencer of the National Aeronautics and Space Administration argues from satellite data that the conventional view has one thing backward. Changes in cloud cover are often seen as consequences of changes in temperature. But what if the amount of cloud cover changes spontaneously, for reasons still unclear, and then alters the temperature of the world by reflecting or absorbing sunlight? That is to say, the clouds would be more cause than consequence. Not many agree with Mr. Spencer, but it is an intriguing idea.
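The lead-lag inference Ridley describes, temperature changes preceding carbon-dioxide changes in the ice cores, boils down to finding where a cross-correlation between the two series peaks. A minimal sketch on synthetic data (the variable names and the 8-step lag are illustrative assumptions, not the actual ice-core analysis):

```python
import numpy as np

# Synthetic example: build a "CO2" series that is a lagged, noisy copy of a
# "temperature" random walk, then recover the lag from the cross-correlation.
rng = np.random.default_rng(0)
n, true_lag = 500, 8
driver = np.cumsum(rng.normal(size=n + true_lag))  # random-walk "temperature"
temp = driver[true_lag:]                           # temperature series
co2 = driver[:n] + rng.normal(scale=0.5, size=n)   # CO2 lags temperature

def best_lag(leader, follower, max_lag=20):
    # Positive result means `leader` precedes `follower` by that many steps.
    def corr(lag):
        if lag >= 0:
            a, b = leader[:n - lag], follower[lag:]
        else:
            a, b = leader[-lag:], follower[:n + lag]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr)

print(best_lag(temp, co2))  # recovers the 8-step lead of temperature
```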

Revkin of NYT takes back his statement that skeptics are more knowledgeable about the science

Tom Nelson featured a surprising quote from warmist/alarmist Andrew Revkin of the New York Times in the article Climate, Communication and the ‘Nerd Loop’:

The last link is particularly important, given that it shows, among other things, that those dismissing human-driven global warming tend to have a more accurate picture of the basic science than those alarmed by it.
The quote has since disappeared, now replaced by:

10:46 p.m. | Updated I’ve removed a line I’d tacked on here that gave too simplistic a summary of the Six Amercias [sic] study
The Yale University Six Americas study in fact states in the Executive Summary on page 4:

...this study also found that for some knowledge questions the Doubtful and Dismissive [skeptics of man-made global warming] have as good an understanding, and in some cases better, than the Alarmed and Concerned.
See the report for specific examples.

Another blow to warmist theory: Decreasing radiation from greenhouse gases

The anthropogenic global warming theory is based upon the notion that increasing 'greenhouse gases' will increase infrared 'back-radiation' to the earth to [supposedly] warm the planet. The theory also claims that increases in the minor 'greenhouse gas' carbon dioxide will cause increases in the major 'greenhouse gas' water vapor, amplifying the infrared 'back-radiation' and global warming. A study published online yesterday in the Journal of Climate, however, finds that contrary to the theory, infrared 'back-radiation' from greenhouse gases has declined over the past 14 years in the U.S. Southern Great Plains in winter, summer, and autumn. If the anthropogenic global warming theory were correct, the infrared 'back-radiation' should instead have increased year-round over the past 14 years along with the steady rise in atmospheric carbon dioxide.



Journal of Climate 2011 ; e-View

doi: 10.1175/2011JCLI4210.1



Long-Term Trends in Downwelling Spectral Infrared Radiance over the U.S. Southern Great Plains



P. Jonathan Gero, Space Science and Engineering Center, University of Wisconsin–Madison, Madison, Wisconsin



David D. Turner, NOAA / National Severe Storms Laboratory, Norman, Oklahoma and Department of Atmospheric and Oceanic Sciences, University of Wisconsin–Madison, Madison, Wisconsin



Abstract: A trend analysis was applied to a 14-year time series of downwelling spectral infrared radiance observations from the Atmospheric Emitted Radiance Interferometer (AERI) located at the Atmospheric Radiation Measurement (ARM) site in the U.S. Southern Great Plains. The highly accurate calibration of the AERI instrument, performed every 10 minutes, ensures that any statistically significant trend in the observed data over this time can be attributed to changes in the atmospheric properties and composition, and not to changes in the sensitivity or responsivity of the instrument. The measured infrared spectra, numbering over 800,000, were classified as clear-sky, thin cloud, and thick cloud scenes using a neural network method. The AERI data record demonstrates that the downwelling infrared radiance is decreasing over this 14-year time period in the winter, summer, and autumn seasons but is increasing in the spring; these trends are statistically significant and are primarily due to long-term change in the cloudiness above the site. The AERI data also show many statistically significant trends on annual, seasonal, and diurnal time scales, with different trend signatures identified in the separate scene classifications. Given the decadal time span of the dataset, effects from natural variability should be considered in drawing broader conclusions. Nevertheless, this data set has high value due to the ability to infer possible mechanisms for any trends from the observations themselves, and to test the performance of climate models.
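The core of such a trend analysis, fitting a least-squares slope to a 14-year record and testing it for statistical significance, can be sketched on synthetic data (illustrative only; the actual AERI study additionally handles seasons, diurnal cycles, autocorrelation, and scene classification):

```python
import numpy as np

# Fit an OLS trend to a synthetic 14-year "radiance" record and test whether
# the slope differs significantly from zero at the 95% level.
rng = np.random.default_rng(1)
n = 14
years = np.arange(n, dtype=float)
radiance = 260.0 - 0.4 * years + rng.normal(scale=1.0, size=n)

x = years - years.mean()
slope = (x @ (radiance - radiance.mean())) / (x @ x)
resid = radiance - radiance.mean() - slope * x
se = np.sqrt((resid @ resid) / (n - 2) / (x @ x))   # standard error of slope

t_stat = slope / se        # compare |t| to ~2.18 (two-tailed 95%, df = 12)
significant = abs(t_stat) > 2.18
print(round(slope, 2), significant)
```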

Friday, April 15, 2011

New material posted on the NIPCC website

Sea Level Rise Around Mainland Australia (12 Apr 2011)

Although the four data sets employed in this study all show short-term accelerations in sea level rise near the end of the 20th century, the century as a whole was one of decelerating sea level rise, which is not exactly in harmony with the climate-alarmist contention that the 20th century experienced a warming that rose at a rate and to a height that were both unprecedented over the past millennium or more ... Read More


The Carbon Sink of an Old-Growth Forest in China (12 Apr 2011)

The old notion of old trees contributing next to nothing to global carbon sequestration is manifestly invalid ... Read More


Excess Winter Mortality in Various Developed Countries and Its Implications for Mitigation Policies (12 Apr 2011)

A general warming may reduce mortality and extend life expectancies, at least in the temperate and higher latitudes ... Read More


The Response of High Arctic Tundra to Long-Term Warming (12 Apr 2011)

Long-term warming in the High Arctic will likely enhance plant cover ... Read More


Tropical Cyclones of the North Indian Ocean (13 Apr 2011)

Results suggest a “decreasing trend in the frequency of storms over the Bay of Bengal, contrary to the popular belief that there will be an increase” ... Read More


Evolutionary Responses of a C3 Perennial Herb to Elevated CO2 (13 Apr 2011)

Both “phenotypic and genetic differences have occurred between high and normal CO2 populations” ... Read More


Snowfall and Snowstorms are Not Decreasing as Predicted by Climate Projections (13 Apr 2011)

Since 2007 heavy snowstorms and all-time seasonal records for stations and the Northern Hemisphere have challenged the predictions by the IPCC, NOAA CCSP, the Hadley Center/UKMO and environmental groups like the Union of Concerned Scientists that snowfall and snowstorms were growing increasingly rare and extent was declining. An analysis of pace and time distribution of winter storms by Changnon in 2008, NOAA weekly snow extent data from satellite as compiled by Rutgers Snow Lab, and an objective snowstorm index developed by NOAA NCDC give us a chance to test that hypothesis ... Read More


Precipitation Events in Northern New England, USA (13 Apr 2011)

Are they becoming more extreme? ... Read More


Coral Disease in a Warmer World (13 Apr 2011)

Researchers emphasize “the need to move away from projections based on historic trends toward predictions that account for novel behavior of ecosystems under climate change” ... Read More

Tuesday, April 12, 2011

The Green Energy Economy Reconsidered

Forbes.com, by Jerry Taylor and Peter Van Doren, 04.25.11



"Green" energy, such as wind, solar and biomass, presently constitutes only 3.6% of fuel used to generate electricity in the U.S. But if another "I Have a Dream" speech were given at the base of the Lincoln Memorial, it would undoubtedly urge us on to a promised land where renewable energy replaced fossil fuels and nuclear power.



How much will this particular dream cost? Energy expert Vaclav Smil calculates that achieving that goal in a decade--former Vice President Al Gore's proposal--would incur building costs and writedowns on the order of $4 trillion. Taking a bit more time to reach this promised land would help reduce that price tag a bit, but simply building the requisite generators would cost $2.5 trillion.



Let's assume, however, that we could afford it. Have we ever seen such a "green economy"? Yes, we have--in the 13th century.



Renewable energy is quite literally the energy of yesterday. We abandoned "green" energy centuries ago for five very good reasons.



First, green energy is diffuse, and it takes a tremendous amount of land and material to harness even a little bit of energy. Jesse Ausubel, director of the Program for the Human Environment and senior research associate at Rockefeller University, calculates, for instance, that the entire state of Connecticut (that is, if Connecticut were as windy as the southeastern Colorado plains) would need to be devoted to wind turbines to power the city of New York.



Second, it is extremely costly. In 2016, according to President Obama's own Energy Information Administration estimates, onshore wind (the least expensive of these green energies) will be 80% more expensive than combined-cycle, gas-fired electricity. And that doesn't account for the costs associated with the hundreds of billions of dollars' worth of new transmission systems that would be needed to get wind and solar energy--which is generally produced far from where consumers happen to live--to ratepayers.



Third, it is unreliable. The wind doesn't always blow and the sun doesn't always shine when the energy is needed. We account for that today by having a lot of coal and natural gas generation on "standby" to fire up when renewables can't produce. But in a world where fossil fuels are a thing of the past, we would be forced--like the peasants of the Dark Ages--to rely upon the vagaries of the weather.



Fourth, it is scarce. While wind and sunlight are obviously not scarce, the real estate where those energies are reliably continuous and in economic proximity to ratepayers is scarce.



Finally, once the electricity is produced by the sun or wind, it cannot be stored because battery technology is not currently up to the task. Hence, we must immediately "use it or lose it."



Fossil fuels are everything that green energy is not. They are comparatively cheap. They are reliable; they will burn and produce energy whenever you want it. They are plentiful (we use only a tiny bit of oil in the electricity sector). And you can store fossil fuels until you need them.



Proponents of green energy argue that if the government can put a man on the moon, it can certainly make green energy economically attractive. Well, notice that the government was not trying to get a man to the moon profitably, which is more akin to the challenge here. Even before the Obama presidency began, about half the production costs of wind and solar energy were underwritten by the taxpayer to no commercial avail. There's little reason to think that a more sustained, multidecade commitment to subsidy would play out any differently. After all, the federal government once promised that nuclear energy was on the cusp of being "too cheap to meter." That was in the 1950s. Sixty-one billion dollars of subsidies and impossible-to-price regulatory preferences later, it's still the most expensive source of conventional energy on the grid.



The fundamental question that green energy proponents must answer is this: If green energy is so inevitable and such a great investment, why do we need to subsidize it? If and when renewable energy makes economic sense, profit-hungry investors will build all that we need for us without government needing to lift a finger. But if it doesn't make economic sense, all the subsidies in the world won't change that fact.



Taylor and Van Doren are senior fellows at the Cato Institute