Cooler Heads Digest

Vol. V, No. 15

Finally someone has brought the climate change debate back down to earth
Triumph in Bonn?  Or Not!


On Monday, July 23, negotiators in Bonn struck an agreement that they claimed had rescued the Kyoto Protocol despite the U.S.’s refusal to endorse it.  Pundits across the globe celebrated the breakthrough, proclaiming the world safe from greenhouse gases. 


“This first small step is a giant leap for humanity and for the future of our planet,” according to World Wildlife Fund’s Jennifer Morgan. “We have delivered probably the most comprehensive and difficult agreement in human history,” said New Zealand delegate Peter Hodgson (Investor’s Business Daily, July 24, 2001).  And European Union environment commissioner, Margot Wallstrom, declared, “We have finalized the rescue operation. We have rescued the Kyoto protocol. It is a major achievement because we will live with this for many years to come” (The Glasgow Herald, July 24, 2001).


But is all this hyperbole justified?  As noted in the July 25 issue of the Los Angeles Times, “After a good night’s sleep and some sober contemplation, environmental activists Tuesday conceded the Kyoto Protocol adopted a day earlier falls far short of the lofty goals for fighting global warming contained in the original proposal.”


Indeed, the prognosis is even worse than portrayed in the Los Angeles Times.  Nothing specific was agreed to.  For example, the delegates agreed to establish an adaptation fund for developing countries that would be funded by developed countries, but no agreement was reached as to how much each country would contribute.  They also agreed that funding for the Global Environment Facility should increase, but again no specifics were settled.


As noted by Cooler Heads counsel Chris Horner, who attended the Bonn conference, “Negotiators addressed specifics of some among the scores of Kyoto provisions, and some of those resulted in ‘agreement.’  Notwithstanding the absence of any ‘comprehensive’ detailing of specifics, however, the bulk of those agreements actually consist of vague palliatives with a promise to continue talking about the issue.  That is, for the most part there were merely agreements to agree at a later date.”


In what could be seen as a major defeat for the EU, it finally conceded the use of carbon sinks and emissions trading at the insistence of Japan.  Last November at The Hague, the EU allowed negotiations to collapse rather than make similar concessions to the U.S.  With the U.S. out of the picture, however, Japan has become the key to bringing Kyoto into force: if Japan, like the U.S., fails to ratify, the protocol cannot become international law.


WWF’s Jennifer Morgan characterized the reaching of the agreement without the U.S. as a “geopolitical earthquake,” implying a shift in power, but Bush’s refusal to accept Kyoto forced the EU to make concessions that it had previously said were unacceptable if the treaty were to retain its “environmental integrity.”  Moreover, WWF estimates that the concessions will lower carbon emission reductions from 5.2 percent below 1990 levels to 1.8 percent below 1990 levels.


“The biggest problem,” according to the Electricity Daily (July 26, 2001), “is that, technically speaking, the FCCC Conference of the Parties meeting in Bonn is not adopting rules at all, it is adopting recommendations.  Moreover, these recommendations are addressed to a body that does not yet exist, and may not come into being for a long time.” 


Rules that are binding on the parties to the protocol cannot be made until the protocol is ratified and becomes international law and the “Conference of the Parties serving as the Meeting of the Parties,” or COP/MOP, is created.  “Every binding rule adopted in the Bonn agreement is carefully phrased as a recommendation to the COP/MOP, because the COP cannot make Protocol rules at this time,” says Electricity Daily. 


That, all along, has been the major barrier to ratification for most of the countries with targets and timetables.  Without knowing what the specific rules will be with regards to monitoring and enforcement, for instance, they are loath to ratify.  Yet rules cannot be made until Kyoto is ratified.


Finally, although Japan has tentatively agreed to the recommendations made in Bonn, there is still no guarantee that it will ratify Kyoto.  Indeed, an EU delegation source stated, “Ratification is by no means a foregone conclusion” (Agence France Presse, July 24, 2001).  Japan has continued to insist on U.S. participation. 


Australia has taken a similar stance.  Australia’s environment minister, Robert Hill, said, “At the end of the day there are some very good parts to this agreement for Australia, but there are still some areas which we have concerns with.”  He also said, “You can’t have an effective global response without the U.S.”  Russia has also shown skepticism toward the process and has yet to signal its willingness to ratify the treaty.




EIA Report Weighs Costs of Multipollutant Strategy


The Energy Information Administration (EIA) recently released a new study entitled “Strategies for Reducing Multiple Emissions from Electric Power Plants.” This study measures the costs of imposing simultaneous caps on power plant emissions of nitrogen oxides (NOx), sulfur dioxide (SO2), mercury (Hg), and carbon dioxide (CO2).


This new report by EIA follows up its December 2000 study on the projected costs of a three-pollutant approach. It also sheds light on the prospective costs of recent legislative proposals, including H.R. 1335 (the Clean Power Act of 2001) and S. 1131 (the Clean Power Plant and Modernization Act of 2001), which advocate a four-pollutant approach.


In the report, EIA projects that over the next 20 years NOx emissions will rise slowly, SO2 emissions will remain at year-2000 levels, and CO2 emissions will increase steadily. The agency explains that expanding electricity demand will drive both a growing dependence on natural gas and the construction of a small number of new coal-fired power plants, causing CO2 emissions to rise.


The agency states that capping CO2 emissions at 1990 levels, or at 7 percent below 1990 levels, would force consumers to pay higher prices for electricity. Specifically, the report says “…Electricity prices are projected to be much higher when CO2 emissions are capped than when NOx, SO2, or Hg emissions are capped―43 percent higher in 2010 and 38 percent higher in 2020 than projected in the reference case. Consumers are expected to reduce their electricity consumption by 8 percent in 2010 and 12 percent in 2020 when faced with higher electricity prices.”


In addition, the agency explains that electricity prices could be substantially higher if natural gas prices turn out to be higher than projected. If CO2 caps are imposed, both domestic production and imports of natural gas must grow to meet electricity demands. Specifically, production of 0.8 trillion cubic feet from domestic sources and 2.3 trillion cubic feet from imports must be added over the next 20 years to meet the increased demand. This would require domestic natural gas producers to achieve record levels of output from 2005-2010 and would represent a serious challenge for the industry.


Therefore, the adoption of a four-pollutant approach would introduce serious uncertainty into America’s energy infrastructure. Just as unexpected fluctuations in natural gas prices contributed to the California electricity crisis, uncertainty in markets for coal, natural gas, and renewables could cause unanticipated problems for consumers. As the report states: “History does not offer clear guidance as to how the various markets might respond to changes as large as those required by the proposed emissions targets.”


In conclusion, a four-pollutant approach to reducing power plant emissions would introduce tremendous uncertainty into the viability of American electricity markets. Dropping CO2 from the program would do a great deal to strip away some of this uncertainty and ensure that consumers can obtain the electricity they demand during the 21st century.




Global Warming Quantified?


One of the major shortcomings of the report by the United Nations’ Intergovernmental Panel on Climate Change is that it treats all 35 of its emissions scenarios as equally likely.  Many scientists have commented that the high end of the IPCC’s estimated range of future global warming, 1.4 to 5.8 degrees C (2.5 to 10.4 degrees F), is highly unlikely.


A research group from MIT calculated that there is far less than one percent chance that temperatures will rise 5.8 degrees in the next 100 years.  They also calculated that there is a 17 percent chance that the temperature increase will fall short of 1.4 degrees.


In a new paper published in the July 20 issue of Science, Thomas Wigley, with the National Center for Atmospheric Research in Boulder, Colorado, and Sarah Raper, with the Climatic Research Unit at the University of East Anglia in England, attempt to quantify the likelihood that temperature rise will fall within the range predicted by the IPCC. 


What they found was that there is a 90 percent probability that temperatures will rise between 1.7 and 4.9 degrees C (3 to 8.8 degrees F) by 2100.  This range, note the authors, “is very large compared with the observed warming over the last century.”  They also conclude that the probability that temperatures will rise as much as 5.8 degrees is very low.


They reach this conclusion by attaching probability distributions, based on IPCC values, to what they deem the most important uncertainties in the climate models, such as climate sensitivity and the role of sulfate aerosols.  They then run a simplified climate model, calibrated to the more complex general circulation models, thousands of times to generate a probability distribution across the combinations of parameter values.
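The logic of that approach can be sketched in a few lines of code. This is a hypothetical illustration only: the distributions, parameter values, and the trivially simplified "climate response" below are stand-ins, not the values Wigley and Raper actually used.

```python
import random

random.seed(0)

def simple_response(sensitivity, aerosol_offset):
    # Stand-in for a reduced-form climate model: warming by 2100 scales
    # with climate sensitivity, minus a sulfate-aerosol cooling term.
    # Both terms are illustrative assumptions, not published values.
    return 1.0 * sensitivity - aerosol_offset

# Draw the uncertain parameters from assumed distributions and run the
# simplified model once per draw, mimicking the ensemble approach.
draws = []
for _ in range(10000):
    sensitivity = random.lognormvariate(0.9, 0.35)  # assumed, degrees C
    aerosol = random.uniform(0.0, 0.8)              # assumed, degrees C
    draws.append(simple_response(sensitivity, aerosol))

# Summarize the spread of outcomes as a central 90 percent interval.
draws.sort()
lo = draws[int(0.05 * len(draws))]  # 5th percentile
hi = draws[int(0.95 * len(draws))]  # 95th percentile
print(f"90% of runs fall between {lo:.1f} and {hi:.1f} degrees C")
```

The point of the exercise is the one the article goes on to make: the resulting interval describes how the model behaves under the assumed input distributions, not an independent measurement of the future climate.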


Although this paper is important because it focuses on the uncertainties in climate modeling, it is important to understand what the paper actually says.  It is not so much a prediction of how temperatures will change in the future as a prediction of how the models behave.  In other words, if you run the model 100 times, it will give you a temperature rise between 1.7 and 4.9 degrees 90 percent of the time.  “Our results are only as realistic as the assumptions upon which they are based,” say the authors.  Many of the IPCC’s assumptions are demonstrably false.


Monster Hurricanes: Global Warming or Global Alarming?


A July 19 CNN story on the increasing severity of hurricane activity in the Atlantic Ocean highlights the uncertainty inherent in long-term predictions of the effects of global climate change. CNN Miami Bureau Chief John Zarrella recently interviewed Christopher Landsea of NOAA’s Atlantic Oceanographic and Meteorological Laboratory/Hurricane Research Division about his recent article on hurricane activity in the journal Science.


In the interview, Landsea pointed out that a major upward shift in climate has been responsible for the increase in hurricane activity over the last six years. He anticipates this shift will continue for the next ten to 40 years. However, he acknowledges that the increase in injuries and property damage caused by hurricanes is due to population growth and economic development. Specifically, he states: “I think at this point the U.S. is so developed and there’s so many people along the coast that just about anywhere is a major disaster ready to happen.”


In his journal article entitled “The Recent Increase in Atlantic Hurricane Activity: Causes and Implications,” (Science, 20 July 2001), Landsea explains that the increased activity is caused by a simultaneous increase in sea surface temperatures and decreases in vertical wind shear. He points out that local conditions in the tropical Atlantic have a direct effect on the development of hurricanes. In addition, he states that the oceans provide the best indicators of long run variability for hurricane activity.


For historical perspective, he explains that from 1944-1970, the average number of major hurricanes in the North Atlantic Basin was 2.7. However, from 1971-1994, this number fell to 1.5. The recent upsurge has taken place from 1995-2000, during which the number rose to 3.8. He explains that 1997 was a year of below average activity because of the strong El Niño event that occurred.


As for whether the recent upward trend is due to global climate change, he states: “The historical multidecadal-scale variability in Atlantic hurricane activity is much greater than what would be ‘expected’ from gradual temperature increase attributed to global warming. There have been various studies investigating the potential effect of long-term global warming on the number and strength of Atlantic-basin hurricanes. The results are inconclusive. Some studies document an increase in activity while others suggest a decrease.” He concludes by offering a stern warning to policymakers that our nation’s emergency management infrastructure must be bolstered to counteract the threat of more severe hurricanes over the next decade.


Therefore, the threat of increased injuries and property damage due to more severe hurricane seasons is of serious concern. However, it is premature to blame the effects of this problem on global climate change. Policymakers would be wise to take Landsea’s recommendations into account when examining the perceived costs of global warming.




Alexis de Tocqueville Institution

Americans for Tax Reform

American Legislative Exchange Council

American Policy Center

Association of Concerned Taxpayers

Center for Security Policy

Citizens for a Sound Economy

Committee for a Constructive Tomorrow

Competitive Enterprise Institute

Consumer Alert

Defenders of Property Rights

Frontiers of Freedom

George C. Marshall Institute

Heartland Institute

Independent Institute

National Center for Policy Analysis

National Center for Public Policy Research

Pacific Research Institute

Seniors Coalition

60 Plus

Small Business Survival Committee