Scenarios in a Post-Copenhagen World

Yale economics professor William Nordhaus has published a new analysis of alternative outcomes for emissions, climate change and potential damages under different policy scenarios following the Copenhagen Accord. The study uses an integrated model that divides the world into 12 regions and incorporates pertinent economic data; key geophysical inputs, including carbon dioxide emissions, the carbon cycle, radiative forcings and a simple climate model; and a series of policy scenarios: a baseline, an optimal approach that maximizes economic welfare with full participation by States, a temperature-limited scenario that caps temperature increases at 2C above 1900 levels, and a Copenhagen Accord scenario. Nordhaus, Economic Aspects of Global Warming in a Post-Copenhagen Environment, 107 PNAS 11721-11726 (2010) (open access).

Among the key take-aways from the study:

  1. The optimal path scenario would require a 50% cut in global emissions from 2005 levels within 100 years, while the temperature-limited path would mandate zero emissions by 2075;
    • Atmospheric concentrations of carbon dioxide would rise to 793 ppm by 2100 under the baseline scenario; under the optimal and temperature-limited paths, atmospheric concentrations would peak at between 500 and 600 ppm;
    • Under the baseline scenario, global temperatures rise 3.5C by 2100 and 5.7C by 2200, ultimately peaking at 6.7C; the peak for the temperature-limited scenario is, by definition, 2C, and 3C for the optimal path;
    • Under the Copenhagen Accord, atmospheric concentrations of carbon dioxide rise substantially above levels consistent with capping temperature increases at 2C;
  2. Carbon prices under the optimal and temperature-limited scenarios are $38 and $79 per ton, respectively, by 2015, compared to a current average of $5 per ton;
  3. One of the most valuable aspects of the piece is the “cautionary notes” section, outlining the economic disincentives for effective, concerted international action. This includes the following:
  • If each of the 12 regions acts non-cooperatively, carbon prices are approximately 1/10th of optimal levels; thus, countries have strong incentives to free ride by not participating in, or cheating on, climate change agreements, a classic tragedy-of-the-commons scenario;
  • Another barrier to escaping from a “low-level noncooperative equilibrium” is the “intertemporal tradeoff,” i.e., effective climate change policy requires costly short-term abatement measures while many damages will occur in the distant future. Through 2055, the ratio of costs to benefits is 5:1; after this, the ratio is reversed, with benefits outweighing costs by 4:1. Asking the current generation to reduce its standard of living for the benefit of future generations would require “a level of political maturity that is rarely observed;”
  • There is also “spatial asymmetry between winners and losers,” with the major emitters facing a price tag of more than $1 trillion in discounted costs by 2055. While poorer countries can make arguments based in equity for why this outcome is fair, politics will invariably result in richer countries seeking to “weigh their own costs and attempt to share the burden more widely.”
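The “intertemporal tradeoff” above is ultimately a question of discounting, and a toy calculation makes the dilemma concrete. The figures below (a $100 abatement cost today against a $400 benefit in 100 years) are hypothetical illustrations, not numbers from Nordhaus’s model:

```python
def present_value(amount, rate, years):
    """Discount a future amount back to today at a constant annual rate."""
    return amount / (1 + rate) ** years

cost_today = 100.0       # hypothetical near-term abatement cost
benefit_future = 400.0   # hypothetical climate benefit realized in 100 years

for rate in (0.01, 0.03, 0.05):
    pv = present_value(benefit_future, rate, 100)
    verdict = "worth it" if pv > cost_today else "not worth it"
    print(f"discount rate {rate:.0%}: benefit worth ${pv:,.2f} today -> {verdict}")
```

At a 1% discount rate the future benefit outweighs the present cost; at 3% or 5% it does not. This is why the choice of discount rate is one of the most contested assumptions in models like Nordhaus’s.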

Nordhaus’s models have been hotly contested by many in the past, so there should be ample opportunities in more advanced classes to “look under the hood” of the model he employs to critique some of the underlying assumptions which heavily influence his conclusions. Also, some good discussion with students could be generated around the question of how to cut through the Gordian knot of economic disincentives described in the article. For example, are there methods to obviate free riding? And is it entirely clear that the current generation must sustain a substantial financial hit to benefit future generations?

U.S. Executive Agency Responses to Climate Change

Robert B. McKinstry, Jr., Jennifer E. Drust and Brendan K. Collins of the firm Ballard Spahr have prepared an excellent 17-page memo that summarizes efforts to address climate change by the Environmental Protection Agency, the Council on Environmental Quality and the Securities & Exchange Commission. This would be a very good student reading at the law school level in that it’s both a great compendium of regulatory responses to date, as well as a good critique of these approaches.

Sea Level Rise and Coastal Impacts

A new study on sea level rise trends and potential impacts was published last week in Science: Robert J. Nicholls & Anny Cazenave, Sea-Level Rise and Its Impact on Coastal Zones, 328 Science 1517-1520 (2010) (subscription required).

This would be an excellent student reading because it updates the assessments of the IPCC’s 4th Assessment and reminds students that non-climatic factors can work in tandem with climate change to produce negative impacts. Among the article’s take-aways:

  1. While the IPCC’s 4th Assessment Report projected that global sea level could rise up to 60 cm, it failed to adequately account for ice sheet dynamics. Recently identified accelerated decline of polar ice sheet mass now raises the prospect of sea level rise of 1 meter by 2100;
  2. While sea level was nearly stable since the end of the last deglaciation, it rose by an average of 1.7 mm/yr after 1950, accelerating to 3.3 mm/yr from 1993-2009;
  3. Thermal expansion accounts for about 50% of sea level rise from 1993-2003; the glacial contribution from 1993 to 2009 may be 30%. Substantial non-climatic contributors include ground subsidence due to oil and groundwater extraction and irrigation, and, most importantly, intensive dam building along rivers, which lowered sea level by about 0.5 mm/yr in the 20th century;
  4. Greenland and West Antarctica mass loss is accelerating, and contributed to 15% of global sea level rise from 1993-2003, with that contribution doubling since 2003;
  5. The largest unknown factor in future sea level rise is the behavior of ice sheets. Future ice dynamics could add sea level rise of 80 cm by 2100; one study projects overall sea level rise of 30-180 cm by 2100, an upper limit far above the 4th Assessment projections;
  6. While the immediate effects of sea level rise are submergence, increased flooding of coastal lands, and saltwater intrusion into surface waters, longer-term effects include increased erosion, saltwater intrusion into groundwater, and declines of coastal wetlands such as salt marshes and mangroves;
  7. In many areas, non-climatic relative sea level rise predominates, including in most river deltas;
  8. Most countries in South, Southeast and East Asia are highly threatened by sea level rise because of widespread occurrence of densely populated deltas. In Africa, there are serious threats due to low levels of development combined with expectations of rapid population growth in coastal areas. Small island states are, however, the most threatened nations, with some facing “the real prospect of submergence and complete abandonment during the 21st century;”
  9. Adaptation can ameliorate impacts under some circumstances; however, protection can also attract new development in low-lying areas, increasing risks.
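The rates in points 2 and 5 lend themselves to a quick back-of-envelope check (a rough sketch only; the year spans approximate the periods cited in the article):

```python
# Cumulative rise implied by the quoted average rates (mm/yr * years).
rise_1950_2009 = 1.7 * (2009 - 1950)    # tide-gauge era
rise_1993_2009 = 3.3 * (2009 - 1993)    # satellite-altimetry era

# Average rate needed from 2010 onward to reach 1 m (1000 mm) by 2100.
required_rate = 1000.0 / (2100 - 2010)

print(f"1950-2009: ~{rise_1950_2009:.0f} mm of rise")
print(f"1993-2009: ~{rise_1993_2009:.0f} mm of rise")
print(f"1 m by 2100 implies ~{required_rate:.1f} mm/yr, "
      f"roughly {required_rate / 3.3:.1f}x the recent satellite-era rate")
```

The arithmetic drives home why ice sheet dynamics dominate the uncertainty: a 1-meter rise by 2100 requires a sustained rate more than triple the fastest rate yet observed.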

The Limitations of the EPA’s RFS2 program

By Simon Bird, Environmental Accountant at AgRefresh

The US national policy for biofuels has been advancing at a fast and furious pace. Every day new research fights for our attention, heralding technological advances or new impact assessments. A successful national biofuel promotion program would be fluid, allowing for the incorporation of this new research, and would promote the development of increasingly sustainable technologies and practices. However, the EPA’s RFS2 program is static, and does not easily allow such changes to be incorporated, whether great technological leaps or incremental improvements in feedstock production. The program additionally flattens the playing field among the diverse range of fuels, removing any incentive for fuel producers to reduce impacts or move to less environmentally damaging feedstocks or processes.

This is a result of the EPA’s focus on mandated volumes, forcing all fuels into one of four set thresholds based on the EPA’s own analysis of national data instead of on biofuel producers’ individual processes. If the EPA instead allowed fuel producers to utilize existing LCA software systems, producers could conduct full carbon assessments of their fuels, incorporating the entire lifecycle of the product and any indirect effects. The producer’s LCA would then be used to give the fuel a carbon label, which biofuel distributors or users would use to stay under a mandated carbon cap. A system like this would give biofuel producers and feedstock producers a real market-based incentive to improve their processes and reduce the carbon intensities of their biofuels. In such a system the “grandfathering” of existing ethanol refineries using coal as a fuel source would no longer be an issue; the marketplace would quickly drive biofuel producers to far less carbon-intensive fuel production technologies and fuel sources.
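To make the carbon-labeling idea concrete, here is a minimal sketch of how a distributor’s compliance check might work. The fuels, intensity values, and cap are hypothetical placeholders; in a real system each intensity would come from the producer’s own LCA:

```python
def blend_intensity(blend):
    """Energy-weighted average carbon intensity of a fuel blend.

    blend: list of (energy_share, gCO2e_per_MJ) pairs; shares sum to 1.
    """
    return sum(share * ci for share, ci in blend)

CAP = 80.0  # hypothetical mandated cap, gCO2e per MJ

blend = [
    (0.85, 92.0),  # petroleum gasoline (carbon label from producer LCA)
    (0.15, 40.0),  # cellulosic ethanol (carbon label from producer LCA)
]

ci = blend_intensity(blend)
print(f"blend intensity: {ci:.1f} gCO2e/MJ "
      f"({'compliant' if ci <= CAP else 'over the cap'})")
```

Because the label follows the individual producer, a refinery that cuts its process emissions immediately lowers its fuel’s intensity and becomes more attractive to distributors, which is precisely the market incentive the RFS2 thresholds remove.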

One example of the negative effects of the static nature of this legislation is that Congress mandated that the baseline for GHG emissions reductions be the year 2005. The result is that the GHG emissions of future biofuels are being compared against the historic GHG emissions level of petroleum fuels, a far cry from an “apples to apples” comparison. The mixture of petroleum fuels used in the US is constantly changing, and in future years will become increasingly carbon intensive as oil becomes scarcer and more energy is required for extraction and processing. A low carbon fuel standard using carbon labeling wouldn’t lock the country into using an arbitrary metric such as GHG reductions from petroleum in a historic year, but would compare all individual sources of fuel on an “apples to apples” basis every year.

A well designed low carbon fuel standard using a carbon labeling system would give the program flexibility to adjust to changing research and incentives to reduce the carbon intensity of fuels in comparison to the EPA’s RFS2. This would result in the movement to more sustainable feedstocks, and the incremental improvement in feedstock production methods and biofuel refining technology.

Simon Bird
AgRefresh Environmental Accountant
[email protected]
802.859.0099, ext. 6


Delivering GHG Reductions under Various Climate Policy Options

By Charles Kerchner, Senior Environmental Accountant at AgRefresh

With the economic concerns, health care debate, and climate skepticism delaying any immediate action on climate change policy in Washington, the door has been left open for other climate policy proposals. The question facing farm and forest landowners during this time of uncertainty is: How can the agriculture and forestry sectors deliver GHG reduction benefits under any policy option on the table?

To date, the lion’s share of attention has been given to the cap-and-trade policy (i.e., the Waxman-Markey bill and the Kerry-Boxer bill). Under a cap-and-trade system the agriculture and forestry sectors are expected to be uncapped sectors and are expected to deliver substantial quantities of offsets to capped entities at an early stage in the game.  This could represent a multi-billion dollar a year revenue stream for American foresters and farmers.  However, if a cap-and-trade system is not the chosen policy, there are still opportunities under the other options being considered.

One policy option that has received increasing attention recently is the “cap-and-dividend” plan highlighted by Senators Cantwell and Collins in their CLEAR Act bill.  At the core of CLEAR is an upstream cap on fossil carbon as it enters the economy with a hundred percent auction of permits to energy producers.  75% of the revenue from the auction is returned as a cash refund to every legal citizen. 25% of the revenue is directed to a dedicated trust, the Clean Energy Reinvestment Trust (CERT).   In direct contrast to cap-and-trade legislation, there is no use of domestic or international offsets for meeting compliance obligations. However, the CERT would fund cost-effective reduction/sequestration projects that verifiably reduce, avoid, or sequester GHG emissions.

Another policy option, though not a politically attractive alternative, would be a carbon tax.  This could mirror a cap-and-dividend program, with additional reductions financed via tax revenue.

The key to delivering GHG reductions or sequestration outside a cap-and-trade policy is to keep the program as administratively simple and cost-effective as possible.  This can be done.  The advantage of a cap-and-dividend, carbon tax, or hybrid, unlike a cap-and-trade program, is that reductions coming from the agriculture and forestry sectors are not needed for compliance. They will be purely additional reductions beyond the already established cap or tax.  Thus, reductions may not be subject to the same offset rules (i.e., additional, verifiable, registered, and permanent) as under a cap-and-trade system, because they do not sacrifice the integrity of the emissions reduction target.

What does this mean? In straight speak, it means that these reductions may be cheaper to deliver than under a cap-and-trade program.   Because they might not have to prove “additionality” or incur the many monitoring and verification costs of a cap-and-trade offset, they could be rolled into programs that already exist, such as the USDA’s Conservation Reserve Program or Conservation Security Program, or newly established programs like Senator Shaheen’s bill that calls for USDA to administer a forest carbon incentive program.

There are many questions facing the farm and forestry sectors: How does the quantity of reductions compare among the policy options? How do the costs of reductions compare?  In sum, how does the net revenue for agriculture and forestry sectors compare among policy options? Because many of the details needed to answer these questions have yet to be determined, it is difficult to say. However, the next step will be to quantify the reductions that can be delivered under the different policies, examine the costs of the policies, and compare the potential revenue streams to the agriculture and forestry sectors.

Charles Kerchner

AgRefresh Senior Environmental Accountant
[email protected]
802.859.0099 ext.5

Models versus Common Sense

By Jeffrey Frost, Executive Director of AgRefresh

When the models we use generate answers which violate common sense, it is time to check the prevailing assumptions within. The Manomet study for Massachusetts, “Biomass Sustainability and Carbon Policy Study”, essentially concludes that global warming will be exacerbated by substituting forest biomass energy for fossil fuels for the production of electricity. I do not have to think very hard or long to conclude that digging up ancient carbon from coal, oil, and natural gas is unlikely to reduce greenhouse gas emissions as compared to growing renewable biomass in our forests and harvesting it for energy use. Yet, this is exactly what the Manomet study tells us, wrapped in massive amounts of very sophisticated analysis.

I have no wish to invalidate the strong work product produced by a stellar team regarding an analysis of prime importance: Will renewable forest biomass or fossil fuels best serve our needs for a low-carbon energy future (1)? I do suggest a need to examine the logic behind some of the prevailing assumptions – explicit and inherent – in this extensive analysis. Here then are questions which need to be answered:

  1. How do you choose the point in the growth and harvest cycle for forest biomass at which to begin the analysis? Manomet chose to begin the life cycle analysis with the day of harvest. If they had taken the other extreme and started the analysis the day after harvest (the first day of sequestration), the results would have flipped entirely and shown the huge benefits of biomass over fossil fuels instead of the reverse finding. Intuitively, the biomass must be grown and the carbon sequestered before it can be combusted anyway. The most defensible answer is probably to begin the analysis midway between harvests, which would improve the relative status of biomass substantially.
  2. How does Manomet justify basing this study on the assumption that whole forest harvests will be used for biomass energy? The expert groups I have worked with refused to consider this extreme case because they considered it outside the bounds of reasonable likelihood. These other experts describe actual practices where forest thinnings, forest residuals and harvested biomass byproducts – along with agroforestry and other purpose-grown biomass – are the current and expected future sources of bioenergy feedstocks.
  3. When the analysis assumes the use of biomass energy sources, how does Manomet justify excluding the avoided emissions benefits from not mining fossil fuels? The analysis does not assess the avoided greenhouse gas emissions – not to mention the huge other environmental, social, and economic consequences – from reductions in mountain-top coal mining, Gulf of Mexico deep water drilling, and natural gas hydrofracking. Similarly, the greenhouse gas emissions associated with the entire infrastructure of activities supporting our fossil fuel economy need to be considered. For example, we would be less likely to have the Middle East wars and the U.S. military effort to protect shipping lanes if we grew our own fuel supplies instead of importing oil.
  4. Which form of life cycle analysis is appropriate for this type of policy-informing analysis, attributional or consequential? Manomet does not specify which form of life cycle analysis they used or the reasons they chose one over the other. Yet the differences in outcomes for this type of analysis can be material. It appears that consequential is the appropriate analysis framework for this study, and that attributional, the less appropriate form, was implicitly chosen instead.
  5. Does the Manomet study give proper acknowledgement to the manner in which harvesting energy biomass and generating carbon credits produces supplemental income streams which will keep land in forests which may have otherwise been converted to urban development? Manomet anecdotally dismisses this issue by noting that current green biomass payments to forest owners are minimal. Yet this study, which is intended to inform policy development for coming decades, should consider the more robust biomass pricing which will emerge under future national renewable energy standards and cellulosic biofuel production. Similarly, the inevitable price of carbon from future policy will enhance carbon credits as a source of income. Avoided loss of forestland is an issue which needs a robust treatment beyond that received here.
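Question 1 is at bottom a carbon-accounting question, and a toy model shows how sensitive the answer is to where the clock starts. The pulse size, regrowth rate, and cycle length below are illustrative placeholders, not Manomet’s figures:

```python
HARVEST_PULSE = 100.0  # tC released when the stand is harvested and combusted
REGROWTH_RATE = 2.0    # tC per year sequestered by regrowth
CYCLE_YEARS = 50       # years between harvests (pulse / regrowth rate)

def balance_from_harvest_day(t):
    """Cumulative net emissions if accounting starts on the day of harvest."""
    return HARVEST_PULSE - REGROWTH_RATE * t

def balance_from_day_after(t):
    """Cumulative net emissions if accounting starts the day after harvest,
    so only regrowth is visible until the next harvest."""
    return -REGROWTH_RATE * t

t = CYCLE_YEARS // 2  # evaluate midway through the cycle
print(balance_from_harvest_day(t))  # positive: biomass looks like a net source
print(balance_from_day_after(t))    # negative: biomass looks like a net sink
```

Same forest, same harvest, opposite sign: the start-point convention alone flips the conclusion, which is the crux of the objection.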

This note is authored by a reader who confesses to having found time for only a brief scan of the Executive Summary and Chapters 5 and 6, the chapters most relevant to the carbon accounting issues. In the event the above five questions were answered in a satisfactory manner elsewhere in the study, apologies to the authors.

Jeffrey Frost
AgRefresh Executive Director
[email protected]
802.859.0099

1 Manomet and the author of this note would both assume that energy efficiency and other forms of renewable or low-carbon energy are important too. The Manomet analysis was keyed to a comparison between forest biomass and fossil fuels for electricity and thermal generation in Massachusetts.

Chinese Environmental Policies

Jonathon Porritt of the BBC has produced an excellent 23 minute documentary film on China’s progress in transforming its economy in terms of sustainability and eco-design.

A Methodology for Assessing the Merit of Adaptation Approaches

As it becomes increasingly obvious that we must develop effective adaptation responses to likely temperature increases of 3-4C during this century, it is critical that we develop sound methods to assess potential adaptation responses. A new brief by Rachel Berger & Muyeye Chambwere, Beyond Cost-Benefit: Developing a Complete Toolkit for Adaptation Decisions, IIED Briefing, June 2010 (3 pages), could be an excellent student reading for discussing several different options in this context.

Among the take-aways from the brief:

  1. While climate change could have serious impacts on lives and livelihoods in many developing countries, adaptation strategies “will be even more costly if they have unintended negative consequences;”
  2. For some community-based adaptation programs, alternative tools may be better, including cost-effectiveness analysis, or economic impacts or values as measured by stakeholders;
  3. There is a very wide range of cost estimates for adaptation, from $9 billion to $109 billion; in most cases, sectors where information is unavailable, e.g. ecosystems, are not costed out;
  4. There are several issues that challenge the validity of traditional cost-benefit analysis:
  • Formal approaches, e.g. cost-benefit analysis, often rely on market values, while planners are obliged to draw heavily on local knowledge if they are to achieve synergy with how local ecosystems and farming systems function;
  • Some development NGOs argue that local people should choose their adaptation investment priorities, utilizing their own criteria. However, this means that values are established locally, outside of formal accounting processes;
  • Community-based adaptation projects often use materials and other resources available locally, often avoiding negative impacts on the environment; however, these benefits are often not taken into account in cost-benefit approaches;
  • Successful adaptation often requires iterative approaches, some of which will fail, and acquisition of new skills and knowledge. However, cost-benefit analysis usually labels these factors as costs, even though knowledge is valuable and failures are an inevitable part of the process;
  • An important part of adaptation programs should be increasing adaptive capacity, which requires linking people into networks with access to information and resources. Again, social networks are a factor that cannot easily be assigned an economic value despite their importance;
  5. Countries’ decisions on appropriate adaptation responses will be informed by factors beyond cost-benefit analysis, e.g. risk assessment. Also, while costs and benefits can be assessed in the context of specific capital investment projects, this is not the case outside such well-specified types of action. Yet we know in many cases, e.g. the threat of rising sea levels to small island States, that interventions are justified even without hard numbers to justify adaptation responses.

The article should generate some interesting discussion. Some pertinent questions include the following:

  1. If the lion’s share of adaptation funding is provided by developed countries, do they have the right to establish the metrics for assessment, e.g. cost-benefit analysis, or if adaptation funding is, as many developing countries argue, “reparations” for climate-related damages, do they have a virtually unbridled right to use such funds as they deem most judicious?
  2. Why do different assessments of adaptation costs yield such radically different estimates? What are the underlying assumptions of different studies, and how do we assess their validity?
  3. What are some potential examples of maladaptation, and how do the metrics described here help us to avoid such results?

    Ocean Acidification and Potential Toll on Marine Life

    An excellent new article on ocean acidification was published last week in Science as part of a special section on the oceans, Richard Kerr, Ocean Acidification Unprecedented, Unsettling, 328 Science 1500-1501 (2010) (subscription required). The piece is appropriate for undergraduate, graduate and law students and provides a really good summary of research to date on “the other carbon dioxide problem” associated with burgeoning greenhouse gas emissions.

    Among the take-aways of the article:

    1. While there are some certainties in this field, including the fact that carbon dioxide emissions are resulting in gigatons of acid lowering the pH of the world’s oceans, substantial uncertainties remain in terms of ecosystem impacts. Notably, there’s nothing in the geologic record as severe as the current plunge in pH. Laboratory studies, however, have revealed that corals always do poorly, though the impact on other organisms is more mixed. For example, a recent study found a 30% reduction in shell thickness of one species of foram in the Southern Ocean, and most other non-coral calcifiers also demonstrate slowing carbonate building. However, a few species, including certain coralline red algae and echinoderms, have shown increases in carbonate building rates;
    2. Ocean pH is now lower than it has been for 20 million years, and is heading lower. Current projections are that pH will drop from a pre-industrial 8.2 to 7.8 by the end of the century, which would increase the surface oceans’ acidity by an average of about 150%;
    3. The closest analog to what is occurring currently in terms of ocean acidification is the Paleocene-Eocene Thermal Maximum (PETM) 55.8 million years ago, with comparable amounts of carbon dioxide and methane emissions (methane quickly oxidizes to carbon dioxide). The startling contrast is that the rate of release of greenhouse gas emissions in the current era is 10x faster than in the PETM. This likely will make a profound difference in terms of impacts. Over a thousand-year period, sediments in the oceans can neutralize added acid, which explains why the massive release of GHGs during the PETM only resulted in the extinction of tiny shell-forming organisms on the deep floor.  By contrast, “today’s emissions are so rapid that they are piling up in surface waters;”
    4. While there is some funding for ocean acidification research, including $5.5 million under the 2009 Federal Ocean Acidification Research & Monitoring Act, much more is required for development of a National Ocean Acidification Program in the United States.
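The 150% figure in point 2 follows directly from the pH scale, which is logarithmic in hydrogen ion concentration; a quick check:

```python
# pH = -log10([H+]), so a drop from 8.2 to 7.8 multiplies [H+] by 10**0.4.
ph_preindustrial = 8.2
ph_2100 = 7.8

factor = 10 ** (ph_preindustrial - ph_2100)  # ratio of [H+] concentrations
increase_pct = (factor - 1) * 100

print(f"[H+] grows by a factor of {factor:.2f}, "
      f"i.e. about {increase_pct:.0f}% more acidic")
```

A seemingly modest 0.4-unit pH change thus corresponds to roughly a 150% increase in acidity, which is why the article treats the current trajectory as unprecedented.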

    New publication on “fast track” financing under the Copenhagen Accord

    For instructors looking for good assessments of developments since the Copenhagen Accord was established at COP15 of the UNFCCC, the International Institute for Environment and Development released an excellent paper this week looking at the track record to date of developed country parties in providing the $30 billion in “fast start” financing for mitigation and adaptation needs of developing States through 2012.

    Among the take-aways of the report:

    1. While both the Stockholm Declaration in 1972 and government representatives at the Rio Conference in 1992 pledged “new and additional” financial resources to foster sustainable development in developing countries, very little came of these pledges;
    2. Both the UNFCCC and the Kyoto Protocol also pledged new and additional financial resources, but the failure of developed countries to deliver has exacerbated mistrust between developed and developing country parties;
    3. In assessing the pledges for “fast start” financing under the Copenhagen Accord, as well as the longer-term pledges of $100 billion annually by 2020, the authors suggest that the baseline for new funding could be set in 8 ways, most of which are flawed:
      • Funding could be construed as “new and additional” if it exceeds the target of 0.7% of gross national income devoted to overseas development assistance. However, many countries, notably the United States, will never accept this threshold since their ODA contributions are much lower; moreover, countries that exceed the 0.7% level may simply divert ODA above that mark and claim it’s new and additional financing;
      • Having no agreed baseline is an unacceptable approach because it precludes transparency, undermining efforts to build trust between the North and South in climate fora;
      • Only funds channeled through dedicated institutional mechanisms, e.g. the new Green Climate Fund or the Adaptation Fund, could count as new and additional; but in some cases it might be more appropriate to channel funds to other mechanisms, and the inflexibility might discourage contributions;
      • Another option would allow States to use the best channels and mechanisms, but would not count ODA money as climate finance; while a good solution, most industrialized countries find this approach unacceptable;
      • The baseline could be defined as existing funds pledged for climate finance, i.e. those pledged before Copenhagen. However, this would still permit diversion of ODA, and it’s difficult to distinguish old and new sources of financing;
      • The amount of foreign assistance countries would be expected to provide in any given year could be assessed in the absence of new climate finance. Business-as-usual funding levels would be renegotiated every year, taking into account current economic growth and ODA commitments. While this would be an extremely sound way of making a “new and additional” assessment, it would be difficult to negotiate and wouldn’t do much to build trust since developed countries would always be suspected of “fixing” the baseline;
      • One way to avoid permanent renegotiation of baselines would be to utilize a baseline of predefined projections of development; of course, there would be a debate about the most realistic ODA growth path;
      • A final approach would combine all the issues of newness, additionality and acceptability. “This baseline would count new sources only, meaning that only assistance from novel funding sources – such as international air transport levies, currency trading levies or auctioning of emission allowances – would be seen as new and additional.” Downsides of this approach are that it would bar the use of effective current funding streams, and would arbitrarily define which sources are new.
    4. Ultimately, the authors conclude that the last two options are the best in terms of avoiding potential loopholes or being too onerous. However, they emphasize that this doesn’t ensure new and additional funding. This also argues for the need for clear rules on monitoring, reporting and verification of funds. It might also be salutary to at least initially have reporting on funding channeled through a central entity to engender trust.

    While this reading is a bit “in the weeds” in terms of the issues it addresses, and thus might not be appropriate for introductory classes, it would be an excellent selection for more advanced courses. Moreover, it provides a very tangible example of how the legitimacy of regimes is critically dependent on whether their stakeholders perceive their mandates as fair and equitable.