IER Comment on the Dubious Social Cost of Carbon, Part I

The Institute for Energy Research (IER) has formally submitted its Comment to the Office of Management and Budget on the Obama Administration’s use of the “social cost of carbon” as an input for federal regulatory action. This is a crucial topic that may significantly influence energy policy. Those who want the full details should click the link and read our full Comment, but in a series of posts I will walk IER readers through the most important points we raised.

In our Comment, we objected to use of the “social cost of carbon” (SCC) in federal policy on several grounds. We grouped our objections into two categories, theoretical and procedural. In the present post, I will discuss the theoretical objections, meaning that even on purely academic or scientific grounds, it is very dubious to use SCC as a concept for guiding federal policymakers.

The SCC Is Not an Objective “Fact” of the World

On the theoretical front, our main theme is that the “social cost of carbon” is not an objective fact of the world, analogous to the charge on an electron or the boiling point of water. Many analysts and policymakers refer to the “science being settled” and so forth, giving the impression that the SCC is a number that is “out there” in Nature, waiting to be measured by guys in white lab coats.

On the contrary, by its very nature the SCC is an arbitrary number, which is completely malleable in the hands of an analyst who can make it very high, very low, or even negative, simply by adjusting parameters. Precisely because the SCC even at a conceptual level is so vulnerable to manipulation in this fashion, the analysts giving wildly different estimates are not “lying.” As we will see, the estimates of the SCC in the peer-reviewed literature are all over the map, demonstrating that this is hardly a feature of the “outside world.”

Damage Functions and Discount Rates

Incidentally, our conclusion is shared by some other experts, even those who are in favor of a carbon tax. In a peer-reviewed article, MIT Professor Robert Pindyck writes that computer-generated SCC estimates are "close to useless" for guiding policymakers, and that the "damage functions" embedded within the computer models are "arbitrary," having no basis in either economic theory or empirical observation. (Full quotations and citations are provided in our Comment.)

To get a sense of just how divergent the computer models can be, consider the following chart, which is taken from the 2010 Technical Support Document issued by the Obama Administration's Working Group on the Social Cost of Carbon:

Annual Consumption Loss as a Fraction of Global GDP in 2100 Due to an Increase in Annual Global Temperature in the DICE, FUND, and PAGE models

Social Cost of Carbon Chart 1

Source: Figure 1A (page 9) of February 2010 Working Group TSD

As the diagram above indicates, the three computer models selected for the Working Group analysis yield different results. In particular, the FUND model (green line) shows much lower impacts from global warming, especially at higher temperatures. Indeed, the green line’s initial (and slight) dip into negative territory shows that the FUND model assumes global warming will shower the world with positive externalities up through about 3 degrees Celsius. The fact that the FUND model yields (moderate) net benefits from global warming in the initial stages will be very significant when we consider the role of discount rates in the analysis.

When estimating the social cost of carbon (SCC), the choice of discount rate is crucial, because the computer simulations of large climate change damages occur decades and even centuries in the future, and also because some models show net benefits from global warming through mid-century.

Indeed, the Working Group generates its estimates of the SCC by equally weighting the estimates provided by the three computer models discussed above (namely the PAGE, FUND, and DICE models). As the diagram above illustrates, in the early decades (while the earth has only warmed one to two degrees Celsius) the cumulative impact of global warming is either close to zero or even positive.

Therefore, the rate at which we discount future damages into present monetary terms will have an enormous impact on the estimated SCC. For example, in the May 2013 Working Group update, the SCC in the year 2010 was reported as $11/ton at a 5% discount rate, but $52/ton at a 2.5% discount rate. In other words, cutting the discount rate in half caused the reported SCC to more than quadruple. Policymakers and citizens should realize just how influential the choice of discount rate is, when it comes to the SCC.
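
To see why this happens, consider a stylized back-of-the-envelope calculation. The $1,000 damage and the 100-year horizon below are purely illustrative assumptions, not figures from the Working Group's models; the point is only to show how strongly the discount rate drives the present value of far-off damages.

```python
# Minimal sketch: present value of a single far-future climate damage at different
# discount rates. The damage amount and horizon are hypothetical illustrations.

def present_value(damage, years, rate):
    """Discount a damage occurring 'years' from now back to today's dollars."""
    return damage / (1 + rate) ** years

future_damage = 1000.0  # hypothetical damage (in dollars) occurring 100 years from now
horizon = 100           # years until the damage occurs

for rate in (0.05, 0.025):
    pv = present_value(future_damage, horizon, rate)
    print(f"discount rate {rate:.1%}: present value = ${pv:,.2f}")

# Approximate output:
# discount rate 5.0%: present value = $7.60
# discount rate 2.5%: present value = $84.65
```

Because the models aggregate damages spread over many decades rather than a single payment a century out, the reported SCC is less sensitive than this one-payment sketch, but the direction of the effect is the same: halving the discount rate greatly magnifies the weight placed on distant damages.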

The problem is that the choice of discount rate is not something that can be settled objectively through technical analysis. If policymakers were going to use market rates of interest, there might be some hope of objectivity. There would still be significant “wiggle room” by selecting the time periods and particular interest rates to use in the computation, but at least market rates are externally generated and, in principle, could be measured objectively.

However, the trend in both academia and in policymaking circles is to use discount rates that are influenced by philosophical and ethical considerations, not based solely on observed market returns.[1] Presumably the proponents of one discount rate versus another may have strong arguments on their side, but the critical point is that these “ethical” discount rates are subjective and in an important sense, arbitrary.

Published Estimates of SCC All Over the Map

We can also look at a survey of the published estimates of the SCC over time, to demonstrate just how malleable and “subjective” the concept really is. The following diagram is taken from Richard Tol’s 2011 survey of past literature:

Survey of Published Estimates of SCC That Use 3% “Pure Time Preference” Rate for Discounting (dot indicates individual estimate).

Social Cost of Carbon Chart 2

Source: Richard Tol (2011), "The Social Cost of Carbon," ESRI Working Paper No. 377.

The diagram above is quite striking. It shows that the 90% confidence interval of the "true" SCC has widened over the last two decades. This is not what one would expect from a maturing science that is homing in on the "true" value. Even more shocking, from 2006 onward (at least until the time of Tol's survey, in 2011) the lower portion of the 90% confidence interval was in the negative region of the graph, meaning that one could not rule out (with 95% confidence[2]) the possibility that further carbon dioxide emissions at that point would benefit humanity at large (beyond the private benefits accruing to the emitters).

The final takeaway from the above diagram is the enormous dispersion in the point estimates of the SCC. In particular, the 2005 estimates show a range from about negative $5/ton up to an enormous $120/ton. (Note that the y-axis on the above chart refers to tons of carbon, not carbon dioxide. Thus these values would need to be divided by 3.67 to make them comparable to the per-ton-of-CO2 estimates that are typically used in U.S. policy discussions.) This chart alone should disqualify use of the SCC in federal regulatory analysis and rule-making.
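
As a quick sanity check on that unit conversion, here is a minimal sketch that uses only the molecular-weight ratio of CO2 to carbon (44/12, roughly 3.67); the example dollar figures are simply the endpoints of the range quoted above.

```python
# Converting an SCC quoted per ton of carbon into one quoted per ton of CO2.
# One ton of carbon corresponds to 44/12 (about 3.67) tons of CO2.

TONS_CO2_PER_TON_CARBON = 44.0 / 12.0

def scc_per_ton_co2(scc_per_ton_carbon):
    return scc_per_ton_carbon / TONS_CO2_PER_TON_CARBON

print(scc_per_ton_co2(120.0))  # a $120/tC estimate is roughly $32.7 per ton of CO2
print(scc_per_ton_co2(-5.0))   # a -$5/tC estimate is roughly -$1.4 per ton of CO2
```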

Conclusion

In this blog post, we have summarized some of the key theoretical problems with using the social cost of carbon (SCC) as a concept in federal policymaking. Generating estimates of the SCC involves using computer models with (arbitrary) simulated damages that go out centuries in the future, and then the analyst must arbitrarily select a discount rate to convert those future damages into present-dollar terms. Because of these ingredients in the estimation process, an analyst can generate just about any “estimate” of the SCC he wants, including a negative one—which would mean carbon dioxide emissions confer third-party benefits on humanity, and (using the Administration’s logic) ought to receive subsidies from the taxpayer.

Obama’s Budget: A Masterpiece of Energy Wastefulness

President Obama just released his fiscal 2015 budget[i] that spends lavishly on his pet projects, including his epically misnamed "all of the above" energy program. But, instead of keeping miners in the coal jobs that have historically supplied the nation with most of its electricity, he is putting his dollars on "clean coal and natural gas"; instead of opening more federal lands to oil and gas drilling, he is taking away standard tax deductions as applied to the oil industry and providing subsidies to inefficient technologies that supply just 2 percent of the nation's energy. In other words, he just continues to propose that taxpayers throw even more good money after bad, in fulfillment of his 2008 campaign statement that under his plan electricity rates would "necessarily skyrocket." Let's take a closer look at the President's energy wish list.

Highlights of the Fiscal 2015 Budget Proposal

On Tuesday, March 4, President Obama sent Congress a $3.9 trillion budget with new revenues amounting to just $1 trillion over ten years coming mainly from tax increases. According to House Speaker John Boehner, “After years of fiscal and economic mismanagement, the president has offered perhaps his most irresponsible budget yet. Despite signing last year’s bipartisan budget deal — and touting it as an accomplishment — the president now proposes violating that agreement with a spending surge. What’s more, he proposes raising even more taxes — not to reduce the deficit but to spend more taxpayer money.”

Let’s look at the energy sector more closely.

The budget calls for increasing taxes on oil, gas, and coal that would amount to $4 billion annually and $48.8 billion over ten years.

The budget calls for $27.9 billion to fund the Department of Energy (DOE), an increase of 2.6 percent over last year's funding levels. Of that, the Office of Energy Efficiency and Renewable Energy receives $2.3 billion, an increase of 20 percent from the fiscal 2014 spending bill, of which renewable energy gets a 16 percent increase. But funding for DOE's Fossil Fuel program is reduced from last year's levels. Out of this reduced budget, DOE is somehow supposed to make it possible for the United States to use its enormous supplies of oil, gas, and coal, despite the Administration's regulatory war on each of these traditional energy sources.

Instead, the DOE Fossil Fuel program includes funding for research on the misnamed "clean coal and natural gas" technologies. (These programs are misnamed because they are aimed at reducing carbon dioxide emissions, and carbon dioxide, however concerned you are about global warming, is not dirty in any ordinary sense of the word "dirty" in English.) According to Secretary of Energy Moniz[ii], "Certainly, if you look decades ahead for natural gas — as with coal — to be a major player in a very low-carbon world, it will require CCS technology here, as well." The DOE budget would also foster research on gas hydrates and an interagency collaboration on shale development.

According to Moniz, the DOE budget request would fund initiatives spanning different branches of the agency, including:

  • $314 million to create a “more secure, resilient and flexible electric grid that can withstand increasingly volatile storms linked to climate change;”
  • A funding boost of about $33 million for the agency's Office of Electricity Delivery and Energy Reliability — $180 million — to support "clean energy transmission, smart grid technology and cybersecurity;"
  • $192 million to study energy production and storage, carbon dioxide storage and the disposal of hazardous materials;
  • $57 million for research and demonstration of technologies to make power generation more efficient and cheaper;
  • $302 million to strengthen DOE's defenses against cyberattacks;
  • $39 million to support the agency’s new Office of Energy Policy and Systems Analysis which would probe fuels and infrastructure resilience as part of DOE’s Quadrennial Energy Review.

As for renewable energy, the budget calls for a permanent extension of the production tax credit (PTC) for wind power—a tax credit that expired at the end of 2013. It should be noted that even the wind lobby suggested phasing out the PTC over six years, so the President's budget requests more subsidies for wind than the wind lobbyists asked for. The proposed extension would cost $19.2 billion over ten years. The budget also calls for extending a tax credit for cellulosic biofuels that likewise expired in 2013.

In addition, the budget calls for establishing an “Energy Security Trust” of $2 billion invested over 10 years. The money would be drawn from revenues generated from Federal oil and gas development and would help support research and development in technologies such as advanced vehicles that run on biofuels, electricity, renewable hydrogen and domestically-produced natural gas. The Administration has specifically clarified that this proposal, first floated last year, does not involve any new revenues to be generated from new energy activities that the Administration has opposed, such as ANWR, opening the OCS, etc., but instead will come from existing planned development. This is, therefore, a $2 billion gimmick that involves borrowing more money and acquiring more debt to pursue additional programs such as those that produced Solyndra.  Note also that the “Energy Security Trust” does not mention coal, despite the fact that the United States has more than 400 years of coal in its Demonstrated Reserve Base. If the President were truly serious about energy security, coal would be included.

The U.S. DOE portion of the President's budget also provides $253 million for development and demonstration of advanced biofuels, specifically mentioning "drop in" replacements for gasoline, diesel and jet fuel. This raises the question: why? The U.S. has billions of barrels of conventional oil that would not require hundreds of millions in subsidies, but would generate billions of dollars of revenue for the federal government if the Obama administration would allow more exploration and production.

As in the past, the president’s budget contains no funding for Yucca Mountain. Instead, it funds a program for creating a pilot interim storage facility by 2021, a larger interim facility by 2025 and a final repository more than two decades later that is expected to cost $5.7 billion during the first decade.[iii]

As the president recently announced, the budget calls for a new $1 billion "climate resilience fund" that could be spent on whatever he wishes, including research into the effects of climate change, development of new technologies and climate-resilient infrastructure, and helping communities address the local effects of climate change.

What’s Missing

While Secretary Moniz touts the President's "all of the above" energy strategy, he is either ignorant of what the word "all" means or he is being intentionally deceptive, because the President's plan is missing any promotion of conventional energy sources. It does not support opening new Federal lands to oil and gas development. It does not call for additional leasing of currently open Federal lands for natural gas, coal, and oil development, areas where the Department of the Interior has drastically reduced lease sales and slowed the issuance of permits. It does not cancel the onerous regulations that the Environmental Protection Agency (EPA) is imposing on coal-fired power plants and coal mines, regulations that have cost many mining jobs in coal-producing states, nor does it provide these miners with any hope of new jobs with commensurate salaries. As a result, it does not provide the American public with a safe, affordable, and secure energy future despite the trillions of dollars that would be spent if enacted.

Forecasters are all indicating that our energy future, at least through 2040, will be based on fossil fuels. For example, the Energy Information Administration (EIA) projects that 80 percent of our energy consumption in 2040 will come from fossil fuels, with an additional 8 percent from nuclear energy. Yet these fuels are getting short shrift in President Obama's energy budget so that his 'pet' technologies can eke out another fraction of a percent of the energy pie.

Let's look at the Obama Administration's track record.

The following chart shows the average number of new leases on Federal lands that the Bureau of Land Management issued during each administration. While the trend is down under all administrations covered, the Obama Administration had the lowest—half that of the Clinton Administration and a third less than the George W. Bush Administration.

Average Number of New BLM Oil and Gas Leases on Federal Lands per Year, by Administration

In President Obama's first term, from 2009 to 2012, a total of 6.9 million acres were leased on Federal lands, less than half of the 15.9 million acres leased under George W. Bush from 2005 through 2008.

Acres Leased on Federal Lands for Oil and Gas Development, 2005-2008 vs. 2009-2012

Source: BLM Oil and Gas Statistics for Fiscal Years 1988 – 2012

The Interior Department has leased just 2 percent of federal offshore areas and less than 6 percent of federal onshore lands for oil and gas development. This is particularly important because, while the entire United States including Alaska and Hawaii consists of 2.271 billion acres, the government owns mineral access to 2.4 billion acres because of the Outer Continental Shelf. Despite a large endowment of oil and natural gas resources on federal lands, which include offshore resources, oil and natural gas production is declining on federal lands in the United States.

The graph below shows the share of oil and natural gas production that came from federal lands. The share of oil production from federal lands peaked in fiscal year 2010 at 36.4 percent, but has since declined by about 10 percentage points to just 26.2 percent in fiscal year 2012. Natural gas production on federal lands peaked at 35.7 percent in fiscal year 2003, the first year for which the Energy Information Administration reports the data, and has declined ever since, reaching half that share, 17.8 percent, in fiscal year 2012.

Share of U.S. Oil and Natural Gas Production from Federal Lands

Source: Energy Information Administration, http://www.eia.gov/analysis/requests/federallands/

The falling production on federal lands is in stark contrast to the dramatically increasing production on private and state lands, for which President Obama likes to take credit. According to a recent report from the Congressional Research Service, from 2007 through 2012, oil production grew by 35 percent and natural gas production grew by 40 percent on private lands while oil production fell 4 percent and natural gas production fell 33 percent on federal lands.

Oil and gas on federal lands

 Oil companies prefer to drill for oil on private and state lands because there is a lot less red tape, and the state regulatory agencies work closely with the oil companies to provide certainty in the regulatory process. For example, the state of Texas processes a drilling permit in 5 days, and North Dakota in 20 to 30 days, while the federal government’s Bureau of Land Management now takes over 200 days to process a permit, an increase of almost 50 percent since 2005.

Days Required to Process a Drilling Permit

Source: http://www.instituteforenergyresearch.org/2013/10/16/forty-years-after-the-oil-embargo/

Likewise, coal production on federal lands is also declining under the Obama Administration. Coal production on federal and Indian lands peaked at 509 million short tons in fiscal year 2008 and has been decreasing slightly each year since then. In fiscal year 2012, coal sales from production on federal and Indian lands reached 461 million short tons, a 1.7-percent decrease from fiscal year 2011 and over a 9-percent decrease since the peak in fiscal year 2008. According to data from the Bureau of Land Management, there have been fewer coal lease sales on average under the Obama Administration than there have been under the George W. Bush and the Bill Clinton administrations.

Coal Production Federal and Indian Lands

President Obama's war on coal continues with onerous regulations on coal-fired power plants. As of December 2013, reported retirements of coal-fired power plants by electric utility companies had grown to over 40 gigawatts. But EIA forecasters believe that coal-fired retirements will ultimately reach 60 gigawatts, about 20 percent of the 310 gigawatts of coal-fired capacity that was operating in 2012. Ninety percent of those retirements are expected to occur by 2016, coinciding with the first year of enforcement for EPA's Mercury and Air Toxics Standards.

Coal Plant Retirements

Source: Energy Information Administration, http://www.eia.gov/todayinenergy/detail.cfm?id=15031

According to EIA data, coal-fired generation declined from almost 50 percent of the electricity market in 2008 to as low as 37 percent in 2012 due to competition from low natural gas prices and EPA regulations. With its major consuming sector reducing demand, coal production declined by 15 percent between 2008 and 2013. Along with the production declines, average coal mine employment fell as well.

According to the U.S. Mine Safety and Health Administration, the average number of coal mine employees in 2013 fell by almost 10 percent from 2012 levels, to 82,338 employees. In the fourth quarter of 2013, the average number of coal mine employees was lower at 77,639, the lowest level since the first quarter of 2009. Between the fourth quarter of 2011 and the fourth quarter of 2013, average coal mine employment dropped by over 17 percent.

Conclusion

President Obama’s “all of the above” energy strategy is an egregious misnomer and does harm to the English language. It is terms like this that George Orwell was describing when he wrote, “Political language — and with variations this is true of all political parties, from Conservatives to Anarchists — is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” President Obama’s 2015 budget request is pure wind.

Californiacation of America: EPA’s Tier 3 Standards

On Monday, March 3, the Environmental Protection Agency (EPA) announced its Tier 3 regulations requiring U.S. refineries to further reduce the sulfur content of gasoline from 30 parts per million to 10 parts per million beginning in 2017. EPA claims that further reducing sulfur would increase gasoline prices by only a fraction of a cent, but real-world gasoline prices and the refining industry refute that claim. EPA says this plan follows California—and California has the highest gasoline prices in the continental U.S. The refining industry explains that gasoline prices could go up by as much as 9 cents per gallon because the regulation will cost the industry $10 billion[i] in equipment costs and $2.4 billion in annual compliance costs.[ii]

Refinery Industry Already Reduced Sulfur By 90 Percent

About a decade ago, U.S. refineries reduced the sulfur contained in gasoline by 90 percent, from 300 parts per million to the current 30 parts per million, under EPA's Tier 2 program, and environmental benefits are continuing to accrue as the U.S. vehicle fleet turns over. Removing the first 90 percent of the sulfur from gasoline is done fairly easily, but removing much of the remaining sulfur is much harder, costing the industry $10 billion for the expensive and energy-intensive equipment that is required. That equipment will actually produce more carbon dioxide emissions than the resulting reduction in tailpipe emissions would save, and it will increase gasoline manufacturing costs by between six and nine cents per gallon, according to an analysis by Baker and O'Brien.

The Tier 3 Rule

The Tier 3 standards will not only require that refiners cut sulfur in gasoline from 30 parts per million to 10 parts per million by 2017; they will also require that volatile organic compounds and nitrogen oxides be reduced by 80 percent and particulate emissions by 70 percent.

The Jan. 1, 2017, compliance date for the Tier 3 rule gives most refiners less than three years to engineer and deploy the expensive renovations. The refining industry, which faces a shortage of skilled labor and other regulations requiring renovations of their own, is finding the schedule tight, which will only increase the costs of compliance.

According to EPA, some concessions are embedded in the program. EPA is working on flexibility provisions for small refiners. And, refiners can earn credits for early compliance — or purchase credits to buy time when credits are available through the averaging, banking, and trading program. Refiners can also carry over credits from the Tier 2 trading program if they have them.[iii]

The Automobile Industry’s Take

The auto industry does not oppose the rule, even though it is estimated that the rule will cost automakers about $15 billion over 10 years, adding about $72 to the price of a new vehicle by 2025.[iv] The auto industry's reasoning for not opposing the Tier 3 rule is that complying with the new gasoline regulation will help automakers more easily meet the Obama administration's tightened vehicle fuel economy standards. Also, automakers would rather have standards that require more effort and expense on the part of the fuel makers than assume the entire burden of cleaner onboard technologies, which would drive up their costs. Furthermore, even though the federal government finally sold its last shares of GM stock in December, it still exerts a lot of control over the car makers. After the auto bailouts, the car makers understand that they can get bailed out if they don't complain too much.

California

In pitching this new regulation to reporters, EPA Administrator Gina McCarthy stressed how the rule follows California. "This isn't an innovative new concept," McCarthy told reporters. "This gasoline is already in use in California and in other countries." That may be true, but what about the price of gasoline in California compared to the rest of the country? Here's GasBuddy.com's heat map of gasoline prices:

So yes, by following California, the U.S. is headed for higher gasoline prices.

Conclusion

Here again, EPA is forcing American industry to meet an onerous standard when that industry has already reduced the sulfur in gasoline by 90 percent. Further cutting the remaining sulfur by two-thirds will cost consumers 6 to 9 cents per gallon—much more than EPA's estimate of less than a penny. This is aggravated by the tight deadline, which means that essentially all refineries will be demanding the same services from a limited pool of providers who engineer and install the mandated equipment, and refineries will necessarily go through downtimes while the equipment is being installed.

According to American Fuel & Petrochemical Manufacturers (AFPM) President Charles Drevna, “EPA chose to ignore our concerns by setting an unrealistic compliance date of January 1, 2017, which does not provide refiners adequate time to complete the required projects necessary to meet the new standard in a manner that avoids the potential for supply disruptions. Tier 3 will provide little, if any, benefit, while increasing fuel manufacturing costs on the backs of American consumers.”[v]

In an economy where consumers are struggling, increasing the costs of their transportation through a variety of new regulations imposed from Washington risks scuttling economic growth.  Those who spend more money on their essential transportation because the government forces it upon them have less to spend on other goods and services, making America poorer.


[ii] Energy Guardian, McCarthy rejects industry pleas, finalizes Tier 3 gasoline rule, March 3, 2014

[iii] Environmental Protection Agency, EPA Sets Tier 3 Motor Vehicle Emission and Fuel Standards, http://www.epa.gov/otaq/documents/tier3/420f14009.pdf

[iv] Energy Guardian, EPA finalizes Tier 3 gasoline sulfur rule, March 3, 2014 and Greenwire, EPA unveils final rule for curbing sulfur in gasoline, March 3, 2014, http://www.eenews.net/greenwire/2014/03/03/stories/1059995437

[v] American Fuel & Petrochemical Manufacturers, AFPM Responds to EPA Tier 3 Rule, March 3, 2014, http://www.afpm.org/news-release.aspx?id=4113

California: Carbon Tax Hurts Just like Cap-and-Trade

A recent article in the LA Times by Jon Healey discusses the proposal by California State Senate President Pro Tem Darrell Steinberg (D-Sacramento) to exempt fossil fuel producers from California's cap-and-trade system and instead impose a carbon tax on fuels. Even though this move (in theory) might make energy prices less volatile, it would still raise them, and thereby hurt California residents. Moreover, any carbon scheme (whether cap-and-trade or a straight tax) would do little to curb global carbon dioxide emissions.

AB 32 Review

In 2006, the California legislature passed and Governor Arnold Schwarzenegger signed AB 32, the Global Warming Solutions Act, which set a goal of reducing California’s greenhouse gas emissions to 1990 levels by 2020. In 2011 California enacted a cap-and-trade program to help achieve this ambitious goal.

The actual regulations outlining the program run to hundreds of pages, but the gist is simple: Starting in 2013, specified companies (about 350 businesses) in California must pay the state government for "allowances" (permits) that entitle them to emit carbon dioxide within the state. By reducing the quantity of allowances over time (about 3 percent per year, from 2015 through 2020), California's government can increase their price, and force emissions to match the desired targets (assuming the authorities enforce the regulations strictly enough to achieve compliance).
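
To make the arithmetic of that tightening concrete, here is a minimal sketch. The starting allowance budget is an arbitrary index (2015 = 100), not California's actual allowance numbers; only the roughly 3 percent annual decline comes from the program description above.

```python
# Sketch of a cap that declines about 3 percent per year from 2015 through 2020.
# The starting value of 100 is an illustrative index, not the real allowance budget.

cap = 100.0
for year in range(2015, 2021):
    print(f"{year}: allowance index = {cap:.1f}")
    cap *= 1 - 0.03  # roughly 3 percent fewer allowances each year

# By 2020 the index falls to about 85.9, a cut of roughly 14 percent from 2015,
# which is what pushes the allowance price (and energy costs) upward over time.
```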

Cap-and-Trade versus Carbon Tax?

In the academic literature, a cap-and-trade program and a carbon tax have largely the same effects, so long as they are calibrated properly. If the legislature sets the quantity of allowances so that the resulting market price of a ton of emissions is $10, then this system will have (to a first approximation) the same impact on the economy as a carbon tax set at $10 per ton. Now it’s true, there are subtle arguments by which professional economists favor one approach versus the other, having to do with our uncertainty about the size of the (alleged) “negative externality” and consequently whether it’s better to get the “carbon price” a little bit wrong, versus getting the “emissions quantity” a little bit wrong.
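
Here is a minimal sketch of that equivalence, using a made-up linear marginal abatement cost curve; none of these numbers come from California's program or from any real estimate.

```python
# Toy illustration: a cap that clears at a $10/ton allowance price induces the same
# abatement as a $10/ton carbon tax. The linear marginal abatement cost (MAC) curve
# below is a hypothetical assumption, not an estimate for any real economy.

MAC_SLOPE = 2.0  # dollars per ton of CO2, per ton abated

def abatement_under_tax(tax_per_ton):
    # Firms abate until the marginal abatement cost equals the tax: MAC_SLOPE * q = tax.
    return tax_per_ton / MAC_SLOPE

def allowance_price_under_cap(required_abatement):
    # The allowance price settles at the marginal cost of the last ton abated.
    return MAC_SLOPE * required_abatement

print(abatement_under_tax(10.0))        # 5.0 tons abated under a $10 tax
print(allowance_price_under_cap(5.0))   # $10.0 allowance price under a cap requiring 5 tons
```

To a first approximation the two instruments deliver the same carbon price and the same abatement; the subtle academic arguments mentioned above turn on what happens when the regulator's guess about abatement costs turns out to be wrong.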

Steinberg himself hits upon a related justification for his proposal when he argues that a carbon tax (which would start at 15 cents per gallon) would provide more predictable gasoline prices for consumers than bringing oil companies into the cap-and-trade program next year (as will happen under the status quo). Opponents of his proposal argue that foisting such a hodgepodge (with cap-and-trade applying elsewhere and the new carbon tax applying just to fossil fuels) on the state would actually make prices more volatile.

Without knowing the exact details of how these programs will play out—especially considering that once the genie is out of the bottle, future legislatures can always tinker—it’s hard to see which approach will make prices fluctuate the most. But what is certain is that both approaches—cap-and-trade versus carbon tax—will make gasoline and other fossil-based energy more expensive.

Moreover, because the program only applies to the state of California, it will largely be a symbolic gesture. Even stopping all United States emissions immediately would have a negligible impact on global temperatures in 2100 relative to the "business as usual" baseline, because the vast majority of growth in emissions is expected to come from China, India, and other emerging economies. Consequently, imposing a plan that would merely cut California's emissions won't register at all.

Tax Reform Hopeless

Another lesson from the Steinberg proposal is that it is quite naïve for "conservative" proponents of a carbon tax to keep telling us how much economic growth we'll get from using its receipts to offset other taxes. This is because in practice, the revenues from a carbon tax (or cap-and-trade) will be used to fund "green" projects. For example, the LA Times article explains:

The plan Steinberg outlined would take oil companies out of the cap-and-trade system, requiring them instead to pay a carbon tax of 15 cents a gallon next year. Two-thirds of the money raised by the tax, which would increase to roughly 24 cents in 2020, would be doled out to California families earning less than $75,000. The rest would be used to support mass transit.

Thus, right off the bat we see that one-third of the revenues from the tax would go to increased spending, and even the two-thirds earmarked for return to taxpayers would be distributed in a lump-sum transfer to low-income families. This might be fair on ethical grounds, since such families will be hit hardest by a hike in energy prices, but it is not at all what the “supply-side” proponents of a carbon tax need—they tell Americans how great it would be if carbon tax receipts were used to lower the top marginal income tax rates.

Conclusion

The fact that a leading member of the California legislature wants to tinker with AB 32 so soon into its operation just shows that we will never get the "policy certainty" that advocates of a carbon pricing scheme promise. The only thing consumers can really count on is that they will be paying more at the pump and to their electric utilities if AB 32 stays in place.

IER Senior Economist Robert P. Murphy authored this post.

Administration Procedure Flips Economic Growth and “SCC” Relationship

Last summer I testified before a Senate subcommittee on the numerous problems with the estimates issued by the Administration’s Working Group on the Social Cost of Carbon. The Working Group’s estimates of the “social cost of carbon” were artificially inflated because of several modeling decisions that it made, including the very significant omission of a 7 percent discount rate as OMB (the regulatory overseers within the White House) requires (as I explain here).

In the current post, I walk through another quirk in the Working Group's procedure, which ends up yielding a very counterintuitive result: The richer we expect our grandchildren to be, the more the Working Group's approach would have us impoverish ourselves today on their behalf. This is just another example of how the Working Group's use of computer simulations ends up producing a nonsensical outcome.

Background: The “Damage Functions” in Computer Models

The Obama Administration’s Working Group chose three computer models from the economics of climate change literature—specifically, the DICE, FUND, and PAGE models—in order to estimate the social cost of carbon (SCC). The SCC is supposed to represent the present discounted value (in dollar terms) of the net harms that an additional ton of emitted CO2 will wreak on humanity over the coming centuries, because of climate change.

As we explained in this post, these computer models use a “damage function” that transforms temperature increases into an estimated percentage reduction of future GDP. In other words, the models don’t take a stipulated amount of warming—let’s say 3 degrees Celsius by the year 2100—and then directly spit out such-and-such trillions of dollars (at that time) in forecasted damages from climate change. Instead, the computer models have different estimates of the fraction of the potential economic output that will be forfeited in the year 2100, if humans allow (say) 3 degrees Celsius of cumulative global warming.

Combining Damage Functions With Growth Forecasts

In our earlier post, we quoted an MIT professor who explained that this approach to the computer models' "damage functions" was based on arbitrary mathematical relationships that had no basis in theory or empirical observation. However, for our purposes now, there is a separate problem: Since the damage functions are expressed as a fraction of global GDP, the actual dollar value of estimated future damages from climate change is proportional to the forecast for economic growth. This will lead to an absurd outcome, but to see why we need to develop the logic of the Working Group's output.

First let’s walk through (a portion of) Table 2 from the Working Group’s 2010 report:

The portion of Table 2 that we have reproduced shows the different scenarios that the Working Group used when forecasting the trajectory of global CO2 emissions and economic growth. There are several different scenarios with interesting names (“MERGE,” “IMAGE,” etc.), the last of which shows what would be needed to maintain average atmospheric concentrations of CO2 at 550 parts per million (ppm).

For our purposes in the present blog post, let's just focus on the MERGE scenario in Table 2. Notice that by the year 2100, annual emissions of CO2 are far greater in MERGE than in any other scenario; the MERGE scenario projects 117.9 gigatons of emissions in the year 2100, while the second-highest scenario (MiniCAM) projects only 80.5 gigatons, a level that is only 68 percent as high as emissions in MERGE.

On the other hand, if we look at the second section in Table 2, we see that MERGE has by far the smallest growth in total economic output. By the year 2100, the MERGE scenario projects global GDP of $268 trillion (in 2005$), while the second-lowest is MESSAGE at $334.9 trillion, which is about 25 percent higher than economic output under MERGE.

So let’s put these two facts together: Relative to all of the other scenarios, the MERGE scenario assumes much higher growth in global CO2 emissions over the coming decades, while it simultaneously projects that people in the future will be much poorer.

Intuitively, what would we expect the “optimal” approach to carbon policy to look like, across these various scenarios? Since more emissions leads to higher global temperatures, and since global climate change damages grow worse more than proportionally as the temperature increases (in the simulated world of the models), we would expect the damage inflicted on humanity under the MERGE scenario to be far worse than in the other scenarios.

At the same time, because the MERGE scenario assumes our descendants in future generations will be much poorer than in any of the other scenarios, we who are alive today ought to be willing to sacrifice more of our economy in order to help them. Remember that the conventional wisdom calls for limiting carbon dioxide emissions in the near future, thereby making us poorer, in order to spare our grandchildren from excessive climate change damages. The richer our grandchildren will be, therefore, the less willing we should be to sacrifice our wealth in order to make them even richer.

Putting these two intuitive notions together—that under the MERGE scenario, (1) there will be far more human-caused global warming, and (2) our descendants will inherit a much smaller baseline economy—we would expect that government policy today would penalize carbon-intensive activities the most under the MERGE scenario, compared to the other scenarios. In particular, since the whole point of the Working Group is to generate "social cost of carbon" (SCC) estimates to guide policymakers, we would expect that the SCC would be highest under the MERGE scenario.

Working Group Gives the Opposite Result

To review: In the previous section, we established that of the various scenarios the Working Group plugged into its three representative computer models, the “MERGE” scenario assumed (by far) the highest level of CO2 emissions, and the slowest pace of baseline conventional economic growth.

Intuitively, we therefore would expect the Working Group to recommend placing the most pressure on scaling back emissions today for the MERGE scenario, relative to the others. Since the SCC represents the “benefit” of cutting back on a unit of emissions, that means we intuitively expect the Working Group to report the highest values of the SCC (for a given discount rate) for the MERGE scenario, compared to the others. And yet, as Table 3 (again taken from the Working Group’s 2010 report) shows below, the opposite holds true:

As Table 3 shows, MERGE has the lowest estimated social cost of carbon (SCC) for a given discount rate, among all of the scenarios that the Working Group considered. For example, using a 3 percent discount rate, MERGE estimated an SCC of $22/ton, which was lower than the $24.90/ton corresponding to the 550ppm scenario—in which emissions (eventually) fall off drastically, limiting total global warming and thus (at least one would have thought) containing the impact of climate change.

The Mystery Explained

What is going on here? How could the Working Group’s procedure be spitting out results that seem to be the opposite of common sense?

A big part of the answer is that the “damage functions” of the three computer models don’t project actual monetary damages from, say, rising sea levels or worse crop yields. Instead—to repeat what we said earlier—they model the impact of global warming as a percentage of lost potential global GDP. Therefore, other things equal, because the MERGE scenario projects much slower economic growth, the future damages from climate change—when measured in absolute dollars—will of course be much lower than in the other scenarios, where baseline absolute global GDP is so much higher.
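
A stylized sketch makes the mechanism plain. The quadratic damage function below is a hypothetical stand-in (not the actual DICE, FUND, or PAGE functions); the two GDP figures are the 2100 projections for MERGE and MESSAGE quoted from Table 2 above, and the warming level is held fixed simply to isolate the GDP effect.

```python
# Sketch: when damages are specified as a fraction of GDP, the same warming produces
# larger dollar damages in a richer world. The damage function here is a hypothetical
# quadratic, not the actual function from any of the three models.

def damage_fraction(warming_c):
    """Illustrative damage function: fraction of GDP lost at a given warming."""
    return 0.003 * warming_c ** 2

warming = 3.0  # assume 3 degrees Celsius of warming by 2100

for scenario, gdp_2100 in [("MERGE (slow growth)", 268.0), ("MESSAGE (faster growth)", 334.9)]:
    losses = damage_fraction(warming) * gdp_2100
    print(f"{scenario}: {losses:.1f} trillion 2005$ of damages in 2100")

# Same warming, same 2.7% damage fraction, but the richer scenario "loses" more dollars,
# which (for a fixed discount rate) pushes its estimated SCC higher.
```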

In principle, the economic analyst could account for this subtlety by appropriately adjusting the discount rate. In other words, the richer we think people will be in the year 2100, the higher the discount rate we should apply to damages (measured in 2100 dollars) they suffer from climate change, in order to decide how much we should be prepared to sacrifice today on their behalf.
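
One standard way to make that adjustment is the Ramsey rule, under which the discount rate equals the rate of pure time preference plus the elasticity of marginal utility times the growth rate of per-capita consumption. The parameter values in this sketch are illustrative assumptions, not the Working Group's choices:

```python
# Ramsey rule sketch: r = rho + eta * g. Higher assumed consumption growth (a richer
# future) implies a higher discount rate. Parameter values are illustrative only.

def ramsey_discount_rate(rho, eta, growth):
    return rho + eta * growth

rho, eta = 0.01, 1.5  # assumed pure time preference and consumption elasticity

print(ramsey_discount_rate(rho, eta, 0.010))  # slow-growth scenario  -> 0.025 (2.5%)
print(ramsey_discount_rate(rho, eta, 0.025))  # faster-growth scenario -> 0.0475 (4.75%)
```

Applying a single discount rate across scenarios with very different growth assumptions, as the Working Group's tables invite the reader to do, skips exactly this adjustment.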

Unfortunately, in practice arguments over the discount rate occur in a separate compartment. For (a spurious) “consistency,” most analysts would probably look at Table 3 above and think that we have to pick a given discount rate, then move down the relevant column to read off the various estimates of the SCC under various scenarios.

Yet as we have explained, such a move is illegitimate. If we change the scenario, then we are changing the assumptions about economic growth, and thus should alter our choice of discount rate accordingly.

Conclusion

If we take the framework of the Working Group on the Social Cost of Carbon at face value, we would intuitively expect that a scenario involving more emissions and slower (baseline) economic growth would imply the most aggressive action against emissions today. On the flip side, we would expect a scenario involving modest emissions and robust economic growth to imply modest impediments to emissions today, because our grandkids won’t have as big a climate change problem to deal with, and they will be so much richer than us.

Yet as we explained in this post, the actual procedure used by the Working Group (and those who follow their recommendations) yields the opposite outcome: It generates the lowest estimates of the social cost of carbon for the scenario where—according to the logic of the exercise—we would want to see the highest estimates. This is just another example of how, in practice, the Working Group on the Social Cost of Carbon is generating outputs that defy common sense.

IER Senior Economist Robert Murphy authored this post.

PYLE: Camp’s Proposal is Encouraging

WASHINGTON — American Energy Alliance President Thomas Pyle issued a statement today on House Ways and Means Chairman Dave Camp’s (R-MI) draft tax reform legislation. Chairman Camp’s plan would repeal a number of green energy tax incentives, including the wind Production Tax Credit (PTC). Pyle’s statement reads:

“Chairman Camp’s proposal is an encouraging step toward common sense tax reform that Americans deserve. The Chairman should be commended for his efforts to save taxpayer dollars by ending wasteful green energy subsidies, including the wind PTC. 

“The expiration of the PTC at the end of 2013 was a victory for taxpayers. Now, Chairman Camp’s attempt to slash this wasteful handout from the tax code provides another positive sign to the American people. For over twenty years, the PTC has put an unnecessary burden on taxpayers by forcing them to prop up the self-proclaimed ‘infant’ wind industry. The wind industry demands ‘policy certainty’ for wind energy subsidies. Chairman Camp’s plan answers those calls by providing certainty that taxpayers will no longer be forced to foot the bill for Big Wind.”

###

Pyle Urges Wyden to Examine PTC

WASHINGTON — American Energy Alliance President Thomas Pyle sent a letter today to Sen. Ron Wyden (D-Ore.) urging him to initiate a full inquiry into past extensions of the wind Production Tax Credit (PTC). As the new Chairman of the Senate Finance Committee, Sen. Wyden has already expressed support for a tax extenders package that includes the wind PTC. American taxpayers deserve to know the full economic impacts of wind subsidies before Sen. Wyden even considers tax extenders. The letter reads:

As you know, Congress voted last year to give the wind industry another year of subsidies, which the Joint Committee on Taxation pegged at a $12 billion cost to taxpayers. Congress has a duty to consider the full impacts of last year’s expansion of the wind Production Tax Credit before committing billions more to an industry whose technology former Energy Secretary Steven Chu labeled “mature.”  In fact, AWEA proudly boasts that at the end of last year, “there were more U.S. wind power MW under construction than ever in history.”

The American people deserve a full airing of the cumulative economic impacts of wind subsidies, and the Senate Finance Committee has a unique responsibility to assess these impacts. We urge you, therefore, to initiate a full inquiry into the success of past extensions of the wind PTC to determine if taxpayers have received the benefit that policymakers and wind lobbyists promised.  We would also urge the Committee to investigate the other nations who have subsidized and/or mandated renewable energy and are now rapidly moving away from this model because of skyrocketing consumer energy costs.  By examining the experience of others, the U.S. may be able to avoid some of the same mistakes.

The American Energy Alliance welcomes a full discussion of this matter before the Senate Finance Committee and would eagerly participate in a meaningful conversation about the merits of the wind PTC, specifically, and all energy subsidies in general. A common sense U.S. energy policy should be guided by solid facts, sound science and an assessment of the impacts of these policies to the marketplace and the costs to consumers.

To read the full letter, click here.

###

Introduction to Petroleum Coke

What is Petroleum Coke?

Generally, people think of oil refineries as producing gasoline and diesel fuel, but refineries produce much more than just those two fuels. One important product produced by refineries is something called petroleum coke, or “petcoke.” Petcoke is used as a fuel and as a source of carbon for industrial processes.

How is Petcoke Produced?

Petcoke is produced after crude oil undergoes two processes. First, the oil is distilled into various products, separating the light parts of the oil—the gasoline vapors, liquefied petroleum gas (LPG), naphtha, and kerosene—from the heavier parts. The heavier portion of the oil is then processed through a "coker," which subjects the remaining oil to high heat and pressure to extract as much of the lighter, gasoline-like parts of the oil as possible. What remains after the coker's high heat and pressure is a substance called petroleum coke.

Source: Wikimedia author romanm

What Are Petcoke’s Uses?

Petroleum coke is high in carbon—this makes it chemically similar to coal, energy dense, and useful for many other industrial processes that require carbon.

About 80 percent of petcoke is used as fuel. While petcoke is similar to coal, petcoke generates just 0.2 percent of America's electricity, while coal generates nearly 40 percent. Instead, petcoke is usually used as a fuel to make cement, lime, brick, glass, steel, and fertilizer, among many other industrial applications.

Much of the rest of the petcoke is "calcined petroleum coke." Calcined petcoke is petcoke that is heated again to remove moisture, volatile matter, and impurities and to increase its electrical conductivity. Calcined petcoke is used to make steel, graphite, and titanium.

Calcined petcoke is essential to the production of aluminum. Because of its high carbon purity and lack of contaminants, calcined petcoke provides the only economically viable method of producing primary aluminum. Calcined petcoke is also used to produce titanium dioxide, a safer alternative to the lead formerly used in paint.

Why is Petcoke Important?

Demand for U.S. petcoke is rising, with China, Mexico, Japan, Canada, India and Turkey as the largest importers. China, for instance, imported 3.2 million barrels of petroleum coke from the U.S. this past April alone, its third-largest monthly volume of all time.

America became a net exporter of petroleum products in 2011, and exports of petroleum coke are one of the reasons. As the next chart shows, the U.S. exported 184,167,000 barrels of petcoke in 2012, a nearly 30 percent increase since 2009.

Coal is one of the most affordable and abundant sources of energy for electricity generation. But international coal prices are often higher than U.S. petcoke prices, making U.S. petcoke an attractive option for many countries to use as a fuel.

Growing demand in developing countries, coupled with affordable prices, has enabled U.S. petcoke to emerge as a valuable export for the U.S. and a cost-effective analogue for coal for much of the rest of the world.

Conclusion

Whether or not petcoke will continue its rise in global energy markets remains to be seen. But barring any significant changes in federal regulatory practices, America can only stand to gain from petcoke's continued production and exportation. Because petroleum coke is a product of petroleum and is chemically similar to coal, some special interest groups are trying to demonize petroleum coke on environmental grounds. In future posts, we'll explore the environmental considerations surrounding petroleum coke.

 

Biofuel Industry Gets Rich at the Expense of World's Poor

By their very nature, government policies that artificially encourage the use of “biofuels” (such as ethanol) distort resource allocation and make consumers poorer than they otherwise would be. Of course, fans of the free market have been leveling such a criticism against biofuel mandates and subsidies from the beginning.

Yet over the last few years, even many left-leaning groups have realized the harm of these programs. In particular, biofuels divert agricultural products and lands out of food production and into energy production, thereby driving up food prices. We have already written here about the Renewable Fuels Association’s celebratory document explaining the boost to farm income (and higher prices for consumers) provided by the Renewable Fuel Standard (RFS). But now a new scholarly article by Brian Wright in The Journal of Economic Perspectives provides additional insight into the connection between biofuel policies and food prices.

Wright is Professor of Agricultural and Resource Economics at UC Berkeley. His paper explores the reasons for the large increase in prices of wheat, rice, and corn over the last five years, which is at first puzzling since these are textbook competitive markets. Wright argues that “[t]he price jumps since 2005 are best explained by the new policies causing a sustained surge in demand for biofuels.” In other words, subsidies and mandates for biofuels are making food more expensive, particularly for the poorest among us. He goes on to write:

The rises in food prices since 2004 have generated huge wealth transfers to global landholders, agricultural input suppliers, and biofuels producers. The losers have been net consumers of food, including large numbers of the world’s poorest peoples. The cause of this large global redistribution was no perfect storm. Far from being a natural catastrophe, it was the result of new policies to allow and require increased use of grain and oilseed for production of biofuels. Leading this trend were the wealthy countries, initially misinformed about the true global environmental and distributional implications.

Wright’s analysis shows that the problem of biofuel policies is not limited to the United States; wealthy countries around the world are embracing these flawed policies in the name of protecting the environment. Yet perversely, the full impact of these policies falls on the poorest people of the world, who are hurt the most by increased food prices.

Even groups who initially supported the RFS and similar policies are realizing its perverse consequences. We have earlier discussed the ethanol “blend wall” which threatens to actually harm vehicle engines and make fuel more expensive for American motorists. Now we can add increased global food prices to the list.

IER Senior Economist Robert P. Murphy authored this post. 

Keystone XL: Fifth and Final Environmental Review is Favorable

While the State Department's latest report on the environmental impact of the Keystone XL pipeline, released late last week, is favorable to its construction, President Obama is still not so sure[i] and several steps remain before a decision will be made. First is a 30-day public comment period on the report's findings and a 90-day period for comments from other government agencies. That puts the next milestone in May at the earliest, when Secretary of State John Kerry is expected to recommend to the President whether the pipeline is in the national interest. So summer is the earliest an answer might be expected from the President. Given his history on the subject, however, delays beyond the mid-term elections are likely, stretching the 'studying' of the Keystone pipeline's national merit to over six years.

The Latest State Department Report

The 11-volume Final Supplemental Environmental Impact Statement[ii] concludes that the Keystone XL pipeline would have no marginal effect on climate or oil and gas development in the Alberta oil sands because the resources would be produced anyway irrespective of the President’s decision. The only difference is how the oil sands would be transported.

In a scenario where the oil sands crude is transported by rail and tanker, 27.8 percent more greenhouse gas emissions would be emitted than if the pipeline were constructed. If the crude were moved by train to existing pipelines, 39.7 percent more greenhouse gas emissions would result. And if it were transported solely by rail to the Gulf of Mexico, 41.8 percent more greenhouse gas emissions would result.[iii]

Incremental Greenhouse Gas Emissions of Rail Alternatives to the Keystone XL Pipeline

The surge in rail movements of oil can be seen in these statistics: In 2009, 9,500 carloads of oil were moved by rail, compared to almost 234,000 carloads in 2012, an increase of almost a factor of 25. And in 2013, an estimated 400,000 carloads of oil were moved, about a 70 percent increase from 2012.

Further, according to the study, replacing the Keystone XL pipeline with rail from Canada could result in an average of six additional rail-related deaths per year. Using data from the Federal Railroad Administration and the Pipeline and Hazardous Materials Safety Administration, shipping 830,000 barrels per day of oil "would result in an estimated 49 additional injuries and six additional fatalities for the No Action rail scenarios compared to one additional injury and no fatalities" per year if Keystone XL is built instead. The "No Action" scenarios analyze the likely situation if the Northern portion of Keystone is not built.[iv]

Also, according to the report, shipping oil by rail, instead of by pipeline, is expected to result in a higher number of oil spills and a larger amount of leakage over time. If Keystone XL is built as planned, it is expected to spill an average of 518 barrels per year, with a leak occurring once every two years. Under the most optimistic scenario involving rail, over 1,200 barrels are expected to be spilled each year from nearly 300 spills.

The State Department expects the Keystone project to create 42,100 direct, indirect, and induced jobs, many of which are union jobs. About 3,900 of those jobs are expected to be temporary construction jobs. Once the two-year construction period is complete, the pipeline would directly support 50 jobs. Keystone is expected to contribute about $3.4 billion to the economy (about 0.02 percent of GDP).[v]

The State Department report did evaluate some scenarios in which Canada's oil sands would not be produced in their entirety because of a future low price for world oil. If the price of world oil were to fall to between $65 and $75 per barrel, the higher cost of rail shipping compared to pipeline shipping could make some oil sands production unprofitable. Or, if the price were to drop below $65, a larger amount of oil sands could be unprofitable to produce with or without the pipeline. However, these scenarios are not very likely, particularly since any speculation about low oil prices stems from the boom in North American crude, of which the oil sands are a part.[vi]

Barge Traffic Increases

It is not just rail that is benefiting from a lack of sufficient pipeline capacity to move oil. Oil moving on barges on the Mississippi River from the Midwest to the Gulf of Mexico has increased by a factor of 13 since 2010. According to federal data, almost five million barrels of oil a month are being shipped by barge from North Dakota's Bakken Shale and from Canada's oil sands. Barges also are moving oil around the Gulf and on the East and West coasts, providing links between pipelines, railroad terminals, and refineries.[vii]

While barges carry less oil than trains, are slower, and are more limited in their routes, they are less expensive and can easily fill gaps in the logistics chain. An oil barge generally carries 30,000 barrels of oil — less than half the volume of a 100-car oil train.

Oceangoing barges also are moving oil. Waterborne shipments of Eagle Ford shale oil from the Port of Corpus Christi in Texas not only move along the Intracoastal Waterway to refineries on the Gulf Coast, but they also supply oil to New Jersey and Canada’s eastern coast. According to Clipper Data LLC, barge and tanker traffic from Gulf Coast ports to East Coast refineries in the United States and Canada increased by almost a factor of 10 last year. In December, the amount of oil shipped to the Atlantic Coast doubled from August, reaching 1.4 million barrels.

Increased traffic is adding to the profits of barge operators. For example, Houston-based Kirby Corporation, which is the biggest operator by fleet size, reported a record profit of $64.3 million for the fourth quarter based on revenue of $568.4 million. This year, Kirby is adding 37 inland barges to its fleet and one that can travel on the open sea at a cost of $90 million.


Source: Wall Street Journal

Conclusion

The State Department's latest environmental impact statement finds in favor of the Keystone pipeline, which should pave the way for the President to find Keystone in the national interest by this summer. But President Obama says "not so fast," as comments from the public and other federal agencies are the next steps in a process that has now taken five and a half years. As we all know, however, the oil sands will be moved regardless of the decision, whether to the United States or Asia, and whether by pipeline, rail, barge, or tanker. Americans should want and expect the decision to favor the safest and cheapest means possible.


[i] National Journal, White House to Keystone Advocates: Not So Fast, January 31, 2014, http://www.nationaljournal.com/energy/white-house-to-keystone-advocates-not-so-fast-20140131

[ii] U.S. Department of State, Final Supplemental Environmental Impact Statement, http://keystonepipeline-xl.state.gov/finalseis/index.htm

[iv] Reuters, Without Keystone, oil trains may cause six deaths per year: U.S. State Department report, February 2, 2014, http://www.reuters.com/article/2014/02/03/us-keystone-rail-idUSBREA1201Z20140203

[v] Washington Post, Five takeaways from State Department's review of the Keystone XL pipeline, January 31, 2014, http://www.washingtonpost.com/blogs/wonkblog/wp/2014/01/31/four-takeaways-from-the-state-departments-review-of-the-keystone-xl-pipeline/

[vi] Council on Foreign Relations, The Most Important Part of the Keystone Environmental Impact Statement, February 1, 2014, http://blogs.cfr.org/levi/2014/02/01/the-most-important-part-of-the-keystone-xl-environmental-impact-statement/