Competitive Enterprise Institute | 1899 L ST NW Floor 12, Washington, DC 20036 | Phone: 202-331-1010 | Fax: 202-331-0640
Energy Secretary: Global Warming Message Not Getting Out
The Clinton administration’s new Energy Secretary Bill Richardson recently remarked that the administration has been "out-gunned in the Congress, [and] in media ads," foiling its efforts to get the word out about global warming.
"We have to do better. And what we need to do is find ways that we can communicate why it’s important – climate change, agricultural disasters, water rising, ozone layer – why that is important to the American people," said the energy secretary. "We need to do a lot better there and we need to be committed towards not just international treaties, but delivering the message to Congress and the American people."
If the public has not embraced the administration’s energy use controls and other global warming prevention measures, it is not because the White House has expended too little effort. As Cooler Heads has documented thoroughly, Bill Clinton, Al Gore, and many other administration officials have trumpeted global warming warnings at press conferences throughout the year. Heat waves, tornadoes, and violent storms have all been blamed on man-made global warming.
The federal government has spent millions of taxpayer dollars on programs to promote the global warming scare. The EPA alone has distributed approximately $30 million to greenhouse lobby groups, such as the Climate Institute, the American Council for an Energy Efficient Economy, and the World Resources Institute. Additional millions are spent by private foundations in an attempt to convince Americans to go on an energy diet ("Deep Pockets, Hot Air," Washington Times, August 31, 1998).
Gore’s "Hot" Data Not Peer Reviewed
At the beginning of 1998, Vice President Al Gore held a national press conference to announce that 1997 was the hottest year on record. Every month since, he has announced a new record high. Unfortunately, the Vice President has been relying on data that have never been peer reviewed.
The un-refereed material was "developed for political impact" by the Commerce Department’s National Climatic Data Center, according to University of Virginia climatologist Patrick Michaels. An e-mail distributed by the NCDC admits "our methodology was not documented in the open refereed literature," and states that "This [memorandum] is an attempt to provide documentation."
It turns out, says Michaels, that the data cited are not a record of global temperatures, but rather an "index" combining three different measures: land surface temperatures, sea surface temperatures taken from ships, and temperatures taken from a network of buoys deployed in the 1980s. The sea surface temperatures were adjusted upward by 25 percent after 1982 in order to calibrate them with land surface temperatures. The result of this unorthodox adjustment is that recent years appear warmer in "indexed" terms.
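The effect Michaels describes can be illustrated with a toy composite. All anomaly values below are hypothetical, not the NCDC data; the sketch only shows how scaling one component series upward after a cutoff year makes a combined index look warmer in recent years even when the underlying measurements are flat.

```python
# Toy composite "index": average of a land series and a sea series.
# All anomaly values below are hypothetical, for illustration only.

years = list(range(1978, 1988))
land = [0.10, 0.08, 0.12, 0.09, 0.11, 0.10, 0.12, 0.11, 0.13, 0.12]
sea = [0.08, 0.09, 0.08, 0.10, 0.08, 0.09, 0.08, 0.09, 0.08, 0.09]

def composite(years, land, sea, adjust_after=None, factor=1.0):
    """Average the two series; optionally scale sea values upward
    by `factor` for years after `adjust_after`."""
    out = []
    for y, l, s in zip(years, land, sea):
        if adjust_after is not None and y > adjust_after:
            s *= factor
        out.append(round((l + s) / 2, 4))
    return out

raw = composite(years, land, sea)
adjusted = composite(years, land, sea, adjust_after=1982, factor=1.25)

# The adjusted index diverges upward only after the 1982 cutoff,
# so recent years appear warmer in "indexed" terms.
for y, r, a in zip(years, raw, adjusted):
    print(y, r, a)
```

Before the cutoff year, the two indexes are identical; afterward, the adjusted index sits above the raw one, producing an apparent recent warming purely from the calibration choice.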
Michaels also points out that the sea surface temperatures used are inconsistent with the air temperatures above the ocean, known as marine air temperatures. The marine air temperatures, however, match up nearly perfectly with the balloon radiosonde and satellite temperature data, and show no warming over the last 20 years (Washington Times, August 31, 1998).
OECD Ignores Technology Trends, Forecasts Oil Shortage
The International Energy Agency of the Organization for Economic Cooperation and Development (OECD) predicts that world oil production will peak in as little as ten years. Sometime between 2010 and 2020, production is projected to peak at 80 million barrels per day and then begin a steady decline.
In the 1970s, we were told that the world was running out of oil and the only solution was to cut energy consumption. The old-school doomsayers are back, warning once again of an oil shortage.
Other estimates, taking into account technology and rising production capacity, differ from the OECD. The U.S. Department of Energy’s Energy Information Administration does not project a peak in oil production until well after 2020. Other optimists see reserves growing rapidly through technological developments, which allow explorers to extract more oil from established oil fields. "Technology has managed to offset the increasing cost of finding and retrieving new resources," says Douglas Bohi, an economist with Charles River Associates in Washington, D.C. "The prospect is out there for an amazing increase in the [oil] reserve base."
One new extraction technique reduces the costs of drilling by a factor of ten. It employs a method of drilling downward and then across, reducing the number of wells needed (Science, August 21, 1998). A brand new technology called Atomic Dielectric Resonance may massively increase explorers' ability to discover oil. It has already shown that it can distinguish gold from quartz in seams 10,000 feet underground (The Scotsman, August 28, 1998). A chronic problem afflicting the doomsayers is the inability to predict future technological change. Without this ability, prognosticators will invariably be wrong.
DOE Study Found to be Misleading
One of the first economic analyses used by the Clinton administration to downplay the costs of reducing carbon emissions was the "Five Lab" study done by the Department of Energy. The study’s conclusions, as found in the Executive Summary and the "Analysis Results" section, "are not derived from, nor supported by, the technical chapters that analyze each energy-using sector."
"Some of these main conclusions of the Five Lab study are merely ad hoc assumptions," according to a new report by Ronald J. Sutherland at the American Petroleum Institute. The paper, A Critique of the "Five Lab" Study, also claims that "the Five Lab study uses a methodology to estimate costs and benefits that is inconsistent with the economic principles of cost-benefit analysis."
The "Analysis Results" chapter shows the estimated reduction in carbon emissions under three scenarios: business as usual, efficiency, and high efficiency/low carbon (HE/LC). The HE/LC scenario assumes a fee of $25 and $50 per ton of carbon dioxide. But, as Sutherland points out, "The results reported for a $25 and $50 carbon fee were not obtained from analyses in three of the four sectors." In fact, "there is no correspondence between the assumptions actually used in the analytical sections of this study and the $25 and $50 carbon permit fee." Thus the administration's claim that the costs of carbon emission reductions are negligible is based on ad hoc assumptions, not on the actual sector modeling analyses.
Other ad hoc assumptions found in the "Analysis Results" section include the discount rates used in the "Optimistic" (low discount rate) and "Pessimistic" (high discount rate) scenarios. Sutherland points out that "the discount rates do not appear in the sector modeling analysis, but are only discussed in the 'Analysis Results' chapter as a key component of the Five Lab study." Surprisingly, the Five Lab study states: "These discount rates are not those that describe current market behavior, but rather are reflective of costs of capital if the market did invest in energy-efficiency measures." In other words, the rates used were not actual market rates.
The Five Lab study also claims that the benefits of reducing carbon emissions outweigh the costs. Sutherland shows, however, that the study ignores costs such as those "associated with policies designed to encourage technology adoption, such as rebates, subsidies and accelerated depreciation, higher energy prices imposed on consumers, nor the cost of prematurely retiring productive coal plants," among others. Finally, the study failed to estimate the monetary benefit of the climate change improvement. In short, it got both the costs and the benefits wrong.
SO2 Trading Costlier Than Claimed
A Public Utilities Fortnightly report (May 15, 1998) casts doubt on claims that SO2 trading is a workable model for carbon emissions trading. Evidently, the full costs of the sulfur reduction have not yet been realized and won’t be known until Phase II of the program is fully implemented.
Proponents of the Kyoto Protocol have pointed to the U.S. acid rain program as an example of how to reduce emissions inexpensively. Like the acid rain program, the argument goes, the cost of reducing greenhouse gases will be negligible.
Here’s why the acid rain program can’t be used for comparison purposes: Phase II of the SO2 program will require that all major "fossil units" participate, and the cap will be lower than in Phase I. Utilities over-complied during Phase I and "banked" their emissions allowances in anticipation of the lower Phase II cap. At some point during Phase II, utilities will fully draw down their banked allowances and the real costs of compliance will be revealed.
Currently, allowances are selling for about $100 per ton even though the marginal cost of compliance is actually $500. This is because utilities had difficulty estimating their marginal costs. As a result, they may have invested "too heavily in control measures, creating more allowances for sale than needed to achieve the cap in any given year," causing allowance prices to fall below actual marginal cost.
Once Phase II is fully implemented, long run marginal costs will equal the price of allowances. These costs should fall well within the range estimated by critics of sulfur emission controls.
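The banking dynamic can be sketched numerically. The $100 allowance price and $500 marginal abatement cost are the article's figures; the bank size and annual shortfall below are hypothetical, chosen only to show how prices stay depressed until the bank runs out.

```python
# Toy model of allowance banking under an emissions cap. The $100
# allowance price and $500 marginal abatement cost come from the
# article; the bank size and annual shortfall are hypothetical.

MARGINAL_COST = 500.0   # $/ton of real abatement
BANK_PRICE = 100.0      # $/ton while surplus allowances persist

bank = 300.0            # banked allowances (tons), hypothetical
shortfall = 100.0       # tons/year beyond the annual allocation, hypothetical

prices = []
for year in range(5):
    if bank >= shortfall:
        bank -= shortfall             # draw down the bank; price stays low
        prices.append(BANK_PRICE)
    else:
        prices.append(MARGINAL_COST)  # bank exhausted; real abatement sets price

print(prices)  # [100.0, 100.0, 100.0, 500.0, 500.0]
```

The jump in the final years is the point of the article: observed allowance prices during the banked-surplus period understate the true long-run cost of compliance.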
Christy/Spencer Respond to Critics
For the last 18 years, John Christy at Earth Systems Science Laboratory (ESSL), University of Alabama, Huntsville and Roy Spencer at NASA/Marshall Space Flight Center have constructed a global temperature record using measurements from microwave sounding units aboard satellites. These data have confounded the warming predictions of climate models, and in fact show a cooling trend from 1979 to 1997.
Recently, the accuracy of these data has been challenged in the peer-reviewed literature, the most important criticism coming last month (Cooler Heads, August 19, 1998) from Frank Wentz and Matthias Schabel (WS) of Remote Sensing Systems, who claimed that the satellite data are distorted by orbital decay. In a new study published in the Journal of Climate (August 1998), Christy and Spencer, along with Elena Lobl, also of ESSL (together, CSL), painstakingly trace their methodology in constructing the temperature record. While the CSL paper was submitted prior to the publication of the WS paper, it does address the WS paper's criticisms.
CSL show how they intercalibrate each of the eight satellites separately to remove the biases that result from various factors. Specifically, CSL performed the adjustment to account for drift-error and cyclic fluctuations. This is relevant to the WS article in that the analysis by CSL removed a large part of the bias created by orbital decay, even though they were not aware of it at the time.
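The basic idea of intercalibration, estimating and removing the offset between two overlapping instrument records, can be sketched with made-up numbers. This is not the CSL procedure itself, only an illustration of the principle of putting successive instruments on a common baseline.

```python
# Two overlapping records of the same quantity, one biased warm.
# All anomaly values are made up for illustration.

sat_a = {1990: 0.10, 1991: 0.12, 1992: 0.11, 1993: 0.13}
sat_b = {1992: 0.31, 1993: 0.33, 1994: 0.30, 1995: 0.32}

# Estimate the inter-instrument bias from the overlap period.
overlap = sorted(set(sat_a) & set(sat_b))
bias = sum(sat_b[y] - sat_a[y] for y in overlap) / len(overlap)

# Merge: keep sat_a, and shift sat_b onto sat_a's baseline.
merged = dict(sat_a)
for y, v in sat_b.items():
    if y not in merged:
        merged[y] = round(v - bias, 4)

print(round(bias, 2))  # 0.2
print(merged)
```

Without the bias removal, the merged record would show a spurious warming jump at the instrument changeover, which is exactly the kind of artifact the CSL adjustments are designed to prevent.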
CSL also responded to a paper in Nature (March 13, 1997) by James Hurrell and Kevin Trenberth (HT) of the National Center for Atmospheric Research. The HT article claimed to have discovered spurious downward jumps in the satellite record that resulted from changing the satellites. Removing the jumps changes the temperature trend from negative to positive, according to HT. After careful analysis, however, CSL "found no such jumps by comparison with independent satellite and traditional atmospheric measurements."
Water Vapor Still Not Resolved
One of the most important and least understood components of the global warming hypothesis is the role of water vapor feedback. Water vapor is by far the most powerful greenhouse gas and accounts for nearly all of the natural greenhouse effect.
According to global warming proponents, increases of carbon dioxide will warm the planet by slightly increasing evaporation and water vapor in the troposphere. This increase in tropospheric water vapor is what accounts for most of the warming in global warming projections.
The problem is that nobody knows for sure whether this feedback is positive (enhancing the effects of increased carbon dioxide) or negative (canceling the effects of carbon dioxide). Richard Lindzen, a climatologist at Massachusetts Institute of Technology, believes that the feedback will be negative, and that increased carbon dioxide will actually dry out the upper troposphere. A study last year in the Bulletin of the American Meteorological Society (June 1997) by Roy Spencer of NASA and William Braswell of Nichols Research Center found that the tropical free troposphere is much drier than represented in the climate models – an early indication that Lindzen may be right.
An article in Science (August 21, 1998) discusses the difficulties in detecting a trend in the water vapor content of the troposphere. The entire enterprise is plagued by inadequate instrumentation and poor agreement between instrument types. A change to better sensors may also give the false impression "that the upper troposphere is drying simply because of the better instrumentation."
The author of the article, David Rind of NASA, concludes, "so far, there has been no evidence to indicate that a strong negative water vapor feedback in the upper troposphere will in fact arise as climate warms. However, without our being able to observe upper tropospheric and stratospheric water vapor with sufficient accuracy over a long enough time period to see ongoing trends, some uncertainty will remain in this most important of climate sensitivity feedbacks."
THE COOLER HEADS COALITION
Alexis de Tocqueville Institution
Americans for Tax Reform
American Policy Center
Association of Concerned Taxpayers
Center for Security Policy
Citizens for a Sound Economy
Committee for a Constructive Tomorrow
Competitive Enterprise Institute
Consumer Alert
Defenders of Property Rights
Frontiers of Freedom
George C. Marshall Institute
Heartland Institute
Independent Institute
National Center for Policy Analysis
National Center for Public Policy Research
Pacific Research Institute
Seniors Coalition
60 Plus
Small Business Survival Committee
The Advancement of Sound Science Coalition
The Heritage Foundation