ELI Report
Akielly Hu - Environmental Law Institute

Adaptation in Action: Institute shifts course online for Kazakhstani officials using Zoom, helping the country craft its Environmental Code

For more than three years, ELI has worked with Kazakhstan’s Department of Climate Policy and Green Technologies to amend the national Environmental Code to incorporate climate change adaptation.

The project falls under the U.S. Agency for International Development’s C5+1 National Adaptation Planning Program, which expands the capacities of Central Asian countries to engage in planning under the UN Framework Convention on Climate Change.

Although Kazakhstan faces a number of climate-related environmental challenges — including increased aridity, desertification, and extreme weather events — the country has not yet accounted for climate change adaptation in its legal framework.

In 2019, ELI collaborated with climate specialists from Abt Associates to help the Kazakhstani government develop a draft chapter for its code, titled Public Administration in the Field of Adaptation to Climate Change.

The revisions, which set forth climate change adaptation norms and processes, as well as new competencies of governmental bodies, are currently under review by the national parliament. The lower house has already adopted the revisions. ELI also assisted in developing draft rules for implementing the proposed provisions and helped write methodological guidance on implementing the adaptation processes set forth in the law.

Once the proposed climate change adaptation provisions are adopted, Kazakhstani government staff will need to implement the law. Recognizing the need to familiarize staff with the new provisions, ELI hosted an online training course to build their capacity. Held over 10 days last summer, the class was organized in collaboration with the national climate department, with financial support from USAID.

The objective of the course was to familiarize supervisors, experts, and staff with adaptation-related provisions of the code. The training also explained related rules and methodological guidance. Attendees received an overview of climate change impacts in Kazakhstan and the country’s international obligations under the Paris Agreement related to climate adaptation.

Lectures explored experiences in climate adaptation from other countries, as well as approaches — such as conducting vulnerability assessments — that can be used at various stages of the adaptation process.

Participants consisted primarily of government staff working in sectors relevant to adaptation, such as agriculture, water resources, forestry, and protection of citizens. Other participants represented NGOs, science institutes, and other stakeholders. A total of 166 participants attended the course, and 93 successfully passed the final test and received certificates of completion.

Originally intended as an in-person training course, the program transitioned to an online format over Zoom due to the pandemic. All the course lectures were recorded (either in Russian or recorded in English and dubbed into Russian), and recordings were made available to all the training participants, providing them flexibility in when to view the videos. Near the end of the course, the Institute also held a live discussion with the lecturers to provide training participants an opportunity to ask questions about the course materials.

The training session received enthusiastic feedback, both for its structure and content, and represented one of many ways ELI has innovated approaches to online events during the Covid-19 pandemic.

ELI’s work in Kazakhstan is a continuation of over a decade of work supporting climate change adaptation law, including projects on adaptation in coastal areas and managing biodiversity in a changing climate.

Helping citizens to review impact of river sediment diversions

ELI’s Gulf of Mexico team works to advance the recovery, restoration, and ecological resilience of the region following the 2010 Deepwater Horizon oil spill. The Institute increases public participation in multiple restoration planning processes, including by supporting local and regional organizations, tracking and reporting on restoration funding, and helping communities understand how to participate in public comment processes.

Recently, ELI has helped local partners engage citizens in processes related to the proposed Mid-Barataria Sediment Diversion, for which a draft environmental impact statement and draft restoration plan were released in March.

Sediment diversions are designed to reintroduce natural delta processes and build landmass. Due in part to decades of building levees and flood control structures on the Mississippi River, the delta and Louisiana coastline have lost thousands of square miles of land. Through gated structures in the levee system, sediment diversions reintroduce fresh water and minerals to nearby basins, rebuilding wetlands in the process.

Project proponents say that the Mid-Barataria Sediment Diversion would create, or save from erosion, as much as 47 square miles of land in the fifty years following construction. Anticipated benefits include improved soil density, expanded wildlife habitat, increased hurricane resilience, and an economic boost from job creation and increased business sales. However, concerns remain about mitigating potential impacts on the region’s oyster and fishing businesses and on nearby dolphin populations, among other changes to the ecosystem.

ELI has assisted local partners by researching the legal landscape around the proposed project and releasing a fact sheet on public participation opportunities. The fact sheet explains how citizens can participate in public comment processes for two laws governing environmental reviews for the project: the National Environmental Policy Act, which requires development of an environmental impact statement, and the Oil Pollution Act, under which the Deepwater Horizon Natural Resource Damage Assessment is proceeding.

The Institute will also co-host a series of online community conversations, joining panelists from Louisiana’s Coastal Protection and Restoration Authority and the Louisiana Trustee Implementation Group, to complement formal public meetings.

Site aids water quality programs in engaging the public

For the past 13 years, ELI has conducted annual training workshops for state, tribal, territorial, and EPA staff regarding the Clean Water Act Section 303(d) Program, which identifies waters that do not meet standards and implements plans to restore and protect them. These workshops, supported by EPA, have prompted a wide variety of endeavors to further assist the program.

One of those endeavors has been a five-year cooperative agreement between ELI and EPA to develop a series of compendia of practices across the country for implementing various aspects of the 303(d) program. These resources facilitate knowledge-sharing across jurisdictions and generate more innovation.

Since 2016, ELI has published the “Compendium of Water Quality Restoration Approaches” and the “Compendium of State Approaches to Protection.”

In February 2021, ELI released its third installment, “Approaches to Clean Water Communication,” a collection of methods for communicating about water quality with the public and other less-technical audiences. The compendium helps programs strengthen engagement with the public, a key goal of water quality improvement programs.

With the assistance of a planning group composed of state, tribal, and EPA staff, ELI collected examples of communication methods through a questionnaire completed by water quality program staff from 44 states, 9 tribes, 4 territories, and the District of Columbia. This information was then distilled into a user-friendly website hosted by ELI.

The resource covers a wide range of communication methods, including websites, maps, social media, and videos. One section of the compendium assembles a library of “Story Maps” and “dashboards,” interactive digital tools used by water quality programs to display multimedia elements such as pictures and maps alongside text. The site presents each type of communication method in the format best suited to it; visual products such as signs and posters, for example, appear on an easy-to-read Story Map page.

The compendium also provides summaries for key aspects of effective communication, such as presentations for public comment processes, ways to collect metrics and tools for measuring success, and translating products into multiple languages. As a free database, the compendium may be especially useful to smaller jurisdictions or programs with limited resources dedicated to communications.


ELI and the Greatest Treaty Ever
Durwood Zaelke, Stephen Andersen, David Doniger, and Alan Miller

According to Kofi Annan, it is “perhaps the single most successful international agreement to date.” The then UN secretary general was describing the Montreal Protocol on Substances That Deplete the Ozone Layer, a treaty that has only gained in force and effectiveness in the nearly two decades since Annan’s proclamation. The accord has successfully protected life on Earth threatened by chemical releases that destroy the shield protecting our planet from dangerous ultraviolet radiation.

While less well known, the Montreal Protocol has at the same time managed to avoid an enormous amount of global warming — some ozone-depleting substances like CFCs and HCFCs are also potent greenhouse gases. Indeed, when The Economist ranked the most effective strategies or events that resulted in the cutting of climate emissions, including the fall of the Soviet Union, China’s one-child policy, and the Kyoto accord on reducing greenhouse gases, the Montreal Protocol came out on top — achieving almost as much mitigation as all the other strategies combined. And of interest to readers of this magazine are the contributions of former ELI staff and board members, who collectively helped turn an aspirational treaty into an effective instrument for planetary protection.

In this article, we four former staffers — Stephen O. Andersen, David Doniger, Alan S. Miller, and Durwood Zaelke — describe how important our time at ELI was at a formative period in our careers. The Institute was the incubator of what later became a global network of collaborators in ozone and climate protection, and the beginning of friendships and professional relationships that have endured for decades.

The Montreal Protocol story started when Mario Molina and F. Sherwood Rowland sounded the alarm in their seminal 1974 article in Nature. They warned that CFCs could destroy stratospheric ozone through a series of catalytic reactions. Rowland and Molina became scientist-activists and publicly urged a halt to non-essential uses of CFCs. Consumers in North America and Scandinavia boycotted aerosol cosmetic and convenience products, which helped motivate governments to develop a framework agreement to start addressing the threat. The result was the Vienna Convention to Protect the Ozone Layer, approved in 1985. In 1987 the Montreal Protocol was added, initially with a commitment to cut CFCs by 50 percent within 12 years and to freeze halon production and consumption. A few years later the protocol mandated a full phaseout of an expanded list of almost one hundred substances. The protocol has since been strengthened by several amendments.

Today, every UN member state is a party, compliance is nearly universal, more than 99 percent of the nearly 100 ozone-depleting substances have been phased out, and the ozone layer is on the path to recovery later in this century. In the United States alone, the phaseout is expected to prevent over 440 million cases of skin cancer, 2.3 million skin cancer deaths, and 63 million cases of cataracts for Americans born in the years 1890–2100, in addition to protecting agricultural and natural ecosystems from damaging radiation.

The injection of climate change issues into ozone-protection regimes traces back to a year after the Nature paper, when Scripps scientist Veerabhadran Ramanathan warned of the potent greenhouse gas properties of these chemicals. Thanks to the phaseout of ozone depleters, we know we will also avoid climate forcing equal to or greater than the warming from all carbon dioxide releases to date. Phasing down HFCs, the latest substances to be controlled, will alone avoid up to 0.5°C of warming worldwide by the end of the century. Additional measures to improve air conditioner energy efficiency and reduce nitrous oxide emissions could avoid even more warming.

The 1970s were formative years at ELI, founded at the beginning of the decade at the same time as the nascent field of environmental law and policy. Durwood Zaelke started as one of the first two summer scholars in 1972 during law school, and then re-joined ELI as the acting editor-in-chief of the Environmental Law Reporter. After a stint with a big law firm in Los Angeles, Zaelke was invited to return to the Institute in 1975 to work on energy conservation law and policy. Alan Miller initially joined ELI in 1973 as a summer legal intern and returned as a staff member in 1974 to work on the new Clean Water Act and legal barriers to solar energy. Stephen O. Andersen joined in 1976 to work on a series of five books on energy conservation, and later a study of the Vermont Land Gains Tax, after two years supporting the Sierra Club Legal Defense Fund. David Doniger joined ELI in 1978 before he was hired by NRDC, where he made his name.

For each of us, the introduction to environmental law and policy at ELI had lasting influence. The field was still in its infancy; the Environmental Protection Agency and Council on Environmental Quality had only been created at the beginning of the decade, and the recently passed National Environmental Policy Act, Clean Air Act, and Clean Water Act were all sources of new regulations and judicial interpretations. Fred Anderson, ELI’s first president, was a legendary mentor, starting every morning with a roundtable discussion of environmental news and strategy. The Institute gave credibility to the idea that a career could be built around resolving environmental issues, as the four of us would prove after leaving. Ozone wasn’t on the ELI agenda in those early years, but the collaborative atmosphere for research we developed there was a model for what each of us would later do in other capacities to reduce dangerous emissions.

Andersen was first to encounter ozone-depletion science. Prior to joining ELI, as a graduate student at UC Berkeley in 1974, he assisted a federal study assessing climate impacts and depletion of stratospheric ozone from the proposed supersonic transport. The project was canceled in part because of those concerns.

Miller was an early advocate for action to regulate ozone-depleting compounds, starting in 1978 at NRDC by producing a background report for the second international meeting on stratospheric ozone, held later that year in Germany. He also worked with Rowland on congressional testimony and petitioned EPA to expand the ban on aerosol uses of CFCs to other applications. Upon joining the new World Resources Institute in 1985, he co-authored “The Sky Is the Limit: Strategies for Protecting the Ozone Layer.” From 1989 to 1997 he directed a center on global environmental issues at the University of Maryland that included multiple projects to address ozone depletion, including organizing with Andersen a conference on ozone and climate protection supported by NATO and multiple defense ministries.

Doniger’s work on ozone protection began at NRDC in 1984, when he filed suit to force EPA action on non-aerosol uses of CFCs. The result was an agreed-upon action plan. Ever since, Doniger has led the council’s ozone-protection program, including support for the protocol and subsequent amendments. NRDC’s concept of a 10-year global phaseout was adopted in 1990.

In 2006, Andersen sought ways to accelerate the phaseout of HCFCs. EPA and the Department of State suggested it was more appropriate for an NGO to lead such an effort, so he recruited Zaelke as the point man. Zaelke agreed — despite opposition from some NGOs who thought it a distraction from ongoing efforts to negotiate a climate protocol. He succeeded with the 2007 adjustment to speed the phaseout of HCFCs, providing climate benefits three to five times greater than the initial commitments of the Kyoto Protocol. The Environmental Forum profiled Zaelke after that victory, noting that he “formed a North-South coalition that recently succeeded in broadening the Montreal Protocol to explicitly address global warming.”

Andersen, Doniger, and Zaelke also cooperated in petitioning EPA to prohibit HFC alternatives once necessary for fast phaseout of the worst substances but no longer needed because of new, superior technology. EPA agreed with the petition but was overruled when two fluorocarbon companies successfully appealed to the D.C. Circuit. The ELI trio, with the support of the suppliers and customers of HFCs, fought back and helped pass the Kigali Amendment to the protocol, adding HFCs to the treaty’s list of controlled substances. They also promoted the American Innovation and Manufacturing Act, passed in the waning days of the Trump administration to mandate HFC reductions. Shortly after taking office the Biden administration announced it would submit the Kigali Amendment to the Senate for ratification.

Andersen joined EPA to work on ozone protection in 1986 and rapidly rose to prominence as deputy director of the Stratospheric Protection Division. He also served as EPA liaison to the Department of Defense on ozone and climate. He then became director of strategic climate projects until he left the agency in 2009 to join Zaelke at the Institute for Governance and Sustainable Development. He founded and co-chaired the Montreal Protocol’s Technology and Economic Assessment Panel, bringing together hundreds of experts from diverse backgrounds, including industry, to identify substitutes for ozone-depleting compounds and the basis for subsequent agreements on accelerated phaseouts. His emphasis on voluntary, collaborative approaches enabled agreements with Soviet authorities on an ozone-mapping satellite, partnerships with the Defense Department on environmental measures, and a series of military conferences on eliminating ozone-depleting greenhouse gases. Prior to leaving EPA, Andersen was awarded the Service to America Career Achievement Medal, the nation’s highest award for public service, as well as awards from the governments of Iraq, Japan, Thailand, Vietnam, and the former USSR; he is the only non-Soviet citizen to have won the Soviet award.

Zaelke’s impact on international environmental law began in 1989 when he co-founded the Center for International Environmental Law, as well as the International and Comparative Law Program at American University’s Washington College of Law. He also co-authored the law school textbook International Environmental Law & Policy. In 2003, he founded IGSD to focus on fast-action climate mitigation, including phasing out short-lived climate pollutants to reduce warming in the next two decades, at the same time co-founding a similar program at the Bren School of Environmental Science & Management at UC Santa Barbara. Zaelke has co-authored numerous articles on short-lived climate pollutants with some of the leading scientists in the world. His calculation with Molina that phasing down HFCs could avoid up to 0.5°C of warming by the end of the century became a key part of the strategy used by President Obama and Secretary of State John Kerry to produce a global consensus on the Kigali Amendment. Zaelke and Andersen earned UN and EPA awards for their diplomatic and scientific leadership in support of the amendment.

While their careers took different directions, the four ELI alumni have repeatedly come together. In the 1990s, Miller, Andersen, and Zaelke collaborated on several projects. In addition to their successful efforts in support of the Kigali Amendment, Andersen and Zaelke co-authored Industry Genius: Inventions and People Protecting the Climate and Fragile Ozone Layer, published in 2003, as well as numerous articles. Miller and Doniger reunited to propose strategies for linking HFC reductions with measures to improve energy efficiency in air conditioning, a rapidly growing source of demand for power and carbon emissions globally. In 2020, Zaelke and Molina co-chaired the definitive assessment of the combined climate benefits of improving energy efficiency of cooling equipment during the phasedown of HFCs, with contributions from Andersen and Miller. And this year Miller, Andersen, and Zaelke joined forces again to co-author the book Cut the Super Climate Pollutants Now! making the case for urgent actions.

ELI is globally recognized for its influential publications, conferences, and projects, but as early insiders we would argue that its influence has been magnified many times over by the attraction, incubation, and networking of its management, staff, and members across the broad field of environmental protection. TEF

TESTIMONY The Environmental Law Institute has been the incubator for staff who went on to make extraordinary contributions to ozone-layer protection and climate change leadership. In this article, four alumni tell their story.

With Compliance Built In
Cynthia Giles - Harvard Environmental and Energy Law Program

Nearly everyone involved in environmental regulations believes that compliance with environmental rules is pretty good and that it is enforcement’s job to take care of the rest. You hear this all the time — from regulators, companies, legislators, academics, and environmental advocates.

Both assumptions, that compliance overall is strong and that the work of ensuring compliance can be left to enforcement, are wrong. The data reveal that the rate of serious noncompliance — not just any noncompliance, but violations EPA defines as the most important — is typically 25 percent or more, according to agency data on self-reported and government-identified violations. For many important rules with big health consequences, the serious noncompliance rates for large facilities are 50 percent to 70 percent or even higher. And those are just the ones we know about; for many rules EPA has no idea what the rates of noncompliance are because the regulations don’t include any way to figure that out.

We have also learned that the most important driver of compliance isn’t enforcement, but the design of the regulation. If a rule is structured to set compliance as the default, it can get impressive on-the-ground results without the need for much enforcement. Rules that instead include many opportunities to evade, obfuscate, or ignore will have dismal performance records that no amount of enforcement will ever fix. Robust enforcement is absolutely necessary for any strong compliance program, but enforcement alone will never close the compliance gap created by a poorly designed rule.

Next Generation Compliance, which I launched at EPA during the Obama administration, is a new paradigm for environmental rules. It argues that rules need to be tightly structured to make compliance the path of least resistance. Next Gen rule design acknowledges that in the messy real world where we actually live, equipment fails, people make mistakes, multiple priorities compete for attention and funding, and companies make close — and sometimes nowhere near close — calls in their own favor. And sometimes they just cheat. There is a mountain of evidence that rules only work if they find a way to align private incentives with the public good. These essential truths are the difference between a rule that is great in theory — and one that delivers in real life.

One common misconception about Next Gen is that it is about making rules enforceable. It isn’t. Yes, rules should be enforceable, because that’s a baseline condition that differentiates a rule from good advice. But Next Gen goes way beyond that. It is about creating a structure where the default setting is good compliance — where implementation is strong even if enforcement never comes knocking.

Compliance isn’t a nice-to-have regulatory extra. It’s the part that matters. That’s true for every rule. Standards are fine, but we only get public health benefits from regulations when the regulated companies do what the rules require. When they take steps to control pollution, or conduct the required monitoring, or implement process controls to reduce the risk of catastrophic releases, the standards in the rules translate to real protection. If facilities are doing what they are supposed to do, we have a good chance of achieving clean air and water and reducing our risk of exposure. If they aren’t, we don’t.

Rampant violations have consequences: millions of people living in areas of the country that are not achieving air pollution standards, impaired water quality for half of the nation’s rivers and streams, contaminated drinking water, public exposure to dangerous chemicals, and avoidable environmental catastrophes with health, ecological, and economic damages.

Serious violations aren’t limited to some rules or sectors or company sizes. Widespread noncompliance is the norm across the board: just about every large city has been in consistent and serious violation of Clean Water Act limits on discharge of raw sewage and contaminated stormwater, companies responsible for over 95 percent of the nation’s petroleum refining capacity and almost 50 percent of ethylene oxide manufacturers violated Clean Air Act pollution requirements, over 70 percent of the largest coal-fired power companies violated the obligation to upgrade pollution controls, and over 60 percent of phosphoric acid manufacturing sites were in serious violation of hazardous waste handling requirements.

For many other sectors, the full extent of violation isn’t known, but it doesn’t look good: oil and gas wells with excess emissions of benzene and volatile organic compounds; animal agriculture operations, whose livestock produce more than three times the sewage of the entire U.S. human population, violating clean water limits on the handling of animal feces; widespread contamination of surface waters from stormwater runoff; agricultural workers exposed to pesticides through violations of the Worker Protection Standard; noncompliance by small-quantity generators of hazardous waste; and cars and trucks spewing pollution from aftermarket defeat devices. Some of the claimed-to-be-better compliance rates — like drinking water standards and stationary sources of air pollution — are based on data that are demonstrably wrong. For millions of facilities covered by rules about chemical safety, oil spill prevention, asbestos remediation, PCBs, or lead paint handling requirements, EPA has no idea how widespread serious violations are.

The harm from pervasive violations isn’t equally shared. It falls most heavily on already overburdened communities. Contrary to popular myth, it is almost never feasible to remedy through enforcement the ubiquitous violations that result from bad regulatory design. And, as we have recently observed to our dismay, some governments aren’t interested in enforcement anyway. Incorporating compliance drivers in environmental rules is one of the most important things we can do to protect environmental justice communities; it shields them from the harm caused by high rates of violation and is less dependent on the unreliable commitment of regulators.

There is no one-size-fits-all Next Gen strategy for regulations that ensure strong compliance. What works for sophisticated power plant operators isn’t likely to be effective for small and dispersed sources of stormwater runoff. Problems that are measurable and discrete, like emissions from stacks and discharges from pipes, are completely different from tough-to-spot violations of regulations to assure approved chemicals are safe, drinking water is clean, or pesticides are properly applied.

But there are some things we know. Exemptions and exceptions create confusion and off ramps that lead to more violations. Things that aren’t measured produce worse outcomes. The less visible violations are, the more there will be. Standards that require lots of fact-specific determinations or have a big gray zone of applicability provide lots of places to hide, and experience shows companies will use them.

In contrast, simple and clear rules — possible even when the underlying situation is complex — are more likely to be effective. Automatic consequences can work better than requiring government to ferret out problems and impose penalties. Monitoring, measurement, and targeted transparency are together the single largest driver of strong implementation. Innovative use of modern technologies and data analytics hold promise for leap-ahead compliance advances. The standard model in wide use today — creating complex requirements with multiple fact-specific exemptions and exceptions, allowing estimates rather than actual measurement or skipping measurement altogether, relying on trust rather than verification, and requiring government to find the violators and track them down one at a time — is why serious violations are widespread.

Sometimes Next Gen ideas can greatly improve outcomes without changing the overall regulatory approach. But sometimes a Next Gen analysis will make it obvious that the preferred regulatory strategy cannot work. In these cases, there is no plug-in solution; the near certainty of implementation collapse means that regulators have to find another way.

Both roles of Next Gen are illustrated in the following two examples for climate change: methane regulation of oil and gas, where Next Gen ideas could help fix big implementation problems, and energy efficiency as part of a clean energy standard, which Next Gen shows is likely to end up undermining the push for carbon reductions.

Methane released from oil and gas production is a huge source of climate-forcing emissions. Methane in the atmosphere traps over 80 times as much heat as carbon dioxide over its first 20 years, so it packs a big climate punch in the near term. The largest source of anthropogenic methane in the United States is fossil fuel production and its transportation, so any climate strategy needs to control those releases. The Obama EPA promulgated methane rules, the Trump EPA repealed them, and the Biden EPA is now set to move out quickly to address this troublesome problem.

Methane, the main component of natural gas, is brought to the surface during oil and gas production. The gas can vent into the air at the wellhead. It can be released by malfunctioning flares. It can leak from storage tanks, valves, and hatches left open. And it does; the amount of wasted methane released from oil and gas production is depressingly large.

The good news is that we know what to do to dramatically reduce methane releases and cut back wasteful flaring. The technology is available and in use today. The costs are reasonable. As climate challenges go, this is one of the easier ones.

While the technological solutions are comparatively simple, the compliance challenges are not. Oil and gas has all the indicia of a sector where compliance with rules to limit emissions is likely to be bad. There are over a million wells in the country, often in out of the way places. Once a well is completed there are no people routinely on site to keep an eye on failing or leaking equipment. Methane is invisible, so leaks can’t be spotted without specialized equipment. By far the biggest share of leaking methane comes from a comparatively small number of sites: at any given moment 90 percent of the emissions come from just 10 percent of emitters. That concentration of super emitters might normally make the compliance job easier, but not in this case — the worst emitters vary over time and are unpredictable. That’s the Next Gen nightmare scenario: huge numbers of sources in out of the way places, with violations that are unpredictable and hard to find. Violations are already common at oil and gas wells. That will get much worse as requirements for methane control are ramped up.

The mismatch between the scope and scale of the compliance problem and government’s ability to either find or fix violations is all too obvious. A handful of regulators for the millions of potentially violating locations makes the standard assumptions that most will comply, and enforcers can take care of the rest, self-evidently untenable here. In this situation — a gigantic number of potential sources at which emissions are collectively huge but individually sporadic, unpredictable, and hard to spot — how can we ensure robust adoption of important strategies for cutting emissions?

One under-appreciated compliance powerhouse in the regulatory toolbox is simplicity. The more special conditions and fact-specific nuance the rule allows, the greater the opportunity to avoid or delay implementation. Repeated experience shows that compliance is less likely for rules with a wide band of compliance gray. Exempting low-producing wells from methane rules, as the owners of those wells propose, risks the same thing. Apart from the reality that low-producing wells are not for that reason less likely to be serious emitters, regulatory exemptions motivate companies to claim to be on the exempt side of the line. If determining the accuracy of such claims requires effort and investigation, a lot of violations — and their accompanying emissions — will slide under the bar.

Innovation is part of the answer for many complex compliance problems. Robust alternative monitoring strategies for oil and gas are being developed at a fast pace and could well be the answer in the long term. A rule can encourage that by motivating everyone to use them. One strategy that might provide an incentive is shifting the burden of proof. If government — or academic experts or NGOs — can provide credible evidence through remote monitoring that a site is a significant emitter, why shouldn’t the company now have to prove it isn’t? And take immediate action if it is? Nothing will motivate leak control more than knowing that an army of experts are looking.

The more automatic things are, the more likely it is that the desired action will happen. Hatches accidentally left open are a big source of emissions; why not require hatches that automatically close? The same idea can work to motivate reliable emissions reporting. If the monitoring equipment is not working or a site visit is missed, how about requiring companies to assume that the results were bad, so firms, not the public, bear the burden of misfires? Penalties can likewise be automatic for key violations. The better job rule writers do of making the rule reliably self-implementing, the better the compliance record will be.

There are a host of other promising and low-cost ways to improve methane rule implementation in the real world. All of these ideas come to the fore once we abandon the fiction that compliance magically occurs because standards are written in a rule, or that rule writers can ignore obvious implementation disasters waiting to happen because compliance is someone else’s job.

The second example, a Next Gen analysis of energy efficiency as a part of a clean energy standard, is on one level discouraging, because it reveals that a popular idea for funding needed energy efficiency investments will lead to greater carbon emissions. But the good news is because we know that in advance, we can make another choice. Next Gen isn’t about saying no, it’s about understanding the strategies that won’t work, so we can design ones that will.

Electricity generation is one of the largest sources of climate-forcing pollution in the United States. Every strategy for tackling climate change depends on converting large portions of the economy to electric power, while reducing emissions from power generation. States have shown the way; renewable portfolio standards have been the motivating force behind a big share of the increase in clean generation. A national standard that pushes in the same direction can be the foundation for achieving President Biden’s drive toward 100 percent clean electricity by 2035.

The great news from a Next Gen perspective is that widespread compliance with the national equivalent of a renewable portfolio standard is readily achievable. We already accurately measure the amount of power generated by every source, there are a discrete and limited number of regulated entities, and they are all sophisticated in measurement and data. This situation presents close to ideal circumstances for regulations that achieve near universal compliance.

But here’s the rub: what else counts as “clean,” and will those alternatives actually achieve the promised emissions reductions? Next Gen doesn’t focus on the ideological sides in these debates. It asks just one question: will it work?

One of the most popular entrants in the clean energy sweepstakes is energy efficiency. It promises reduced demand for power by accomplishing the same thing with less power. It creates clean energy jobs. The issue isn’t the importance of energy efficiency. That’s clear. Energy efficiency is an essential part of our work to cut carbon emissions. We need as much of it as possible as fast as we can get it.

The Next Gen issue is the impact on power generation’s carbon emissions if energy efficiency is included in a clean energy standard or in any other regulatory program intended to reduce carbon in electricity generation. Design features can vary but the basic idea of such programs is limiting fossil-fired power generation to a fixed and declining amount of carbon emissions per unit of power. Utilities are allowed to comply with that limit by purchasing qualifying credits. When those credits are from solar or wind power, for example, we know exactly how much electricity utilities are buying and can be 100 percent confident that it is zero carbon.

If they buy an energy efficiency credit, on the other hand, we actually don’t know how much electricity savings, and therefore carbon reduction, utilities are getting. That’s because the nature of energy efficiency and the structural incentives of efficiency programs make determining how much energy is saved extremely difficult. What we do know is that far less energy is being saved than current estimates predict. That’s why including energy efficiency credits in a clean energy standard results in more carbon. The fossil-fuel-fired power plant emits more actual we-know-it-is-happening carbon — in exchange for the hoped-for but most likely far smaller carbon savings promised by energy efficiency. Why is energy efficiency such a wild card in carbon accounting?

First is that the impact of energy efficiency is inherently uncertain. The theory of energy efficiency is that the same activity, like heating or lighting a home, is accomplished using less power. How much energy was saved? That’s calculated by comparing what actually happens to the hypothetical world of what would have happened without the efficiency project. If a utility pays me to add two inches of insulation to my attic, what’s the energy savings benefit? The answer isn’t as simple as my energy use before and after. There is a host of variables that make the comparison highly uncertain: the weather is different; I might turn up the heat because I have more insulation; maybe I also bought an electric car or an energy-sucking TV; maybe I would have put that insulation in anyway, without the incentive payment. Actual energy use can be measured, but the hypothetical alternative universe cannot. Even with unlimited measurement resources and the best of intentions, this is irreducibly complex, and it isn’t possible to be certain.

Second, the evidence suggests that the estimates we use to calculate energy efficiency savings are way off the mark. Just about everyone today uses estimates of the benefits of energy efficiency called deemed savings. Such metrics provide a guide for estimating how much energy is saved from installing, say, weatherization measures. Rigorously designed studies have found that actual energy savings fall substantially short of the deemed estimate, in some cases possibly delivering only 25 percent of the promised savings. As is true in so many programs, careful measurement reveals the sometimes gross error of estimates.

Third is another problem that is ubiquitous in Next Gen analysis: the incentive structure for energy efficiency encourages overclaiming of benefits while making it nearly impossible to figure out the truth. Utilities that get more money for programs with greater energy reductions have a built-in motivation to overstate the value of efficiency projects. And they do; a 2012 in-depth study of California utilities found that actual savings were 30 percent to 40 percent less than had been projected and that utilities were systematically overstating the savings. Nearly every participant in energy efficiency has an incentive to overclaim benefits.

And that’s before we even get to the fraud that is inevitable when implementation occurs at millions of locations, companies can make money by cutting corners, and government has virtually no visibility into what’s actually happening.

All of these factors combine to tell us that an energy efficiency credit is both highly uncertain and very likely to greatly overstate its value. So what? Energy efficiency is good, right? Who cares if we can’t be certain about exactly how much energy it saves? We care because by including efficiency credits in a program to cut carbon from electricity generation we set ourselves up for more carbon. We allow a ton of real we-are-certain carbon from a fossil fuel utility in exchange for less than a ton — possibly a lot less — of efficiency offsets. And the more energy efficiency credits utilities buy, the greater their actual net carbon emissions will be. That’s not what we are trying to do.
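
The arithmetic is easy to check. Below is a minimal Python sketch, with hypothetical credit volumes and realization rates echoing the shortfalls cited above, of the net carbon effect when real fossil emissions are offset by efficiency credits that underdeliver:

```python
# Hypothetical illustration: net carbon effect when a clean energy standard
# lets a utility cover real fossil emissions with energy efficiency credits
# whose claimed savings only partly materialize.

def net_increase(credit_tons: float, realization_rate: float) -> float:
    """Net added carbon (tons): emissions allowed by the credits, minus the
    savings the efficiency projects actually deliver."""
    emitted = credit_tons                   # real, certain emissions
    saved = credit_tons * realization_rate  # uncertain, often overstated
    return emitted - saved

# Realization rates reflecting the studies cited above (30-40 percent
# shortfalls, and in some cases only 25 percent of promised savings).
for rate in (1.0, 0.7, 0.6, 0.25):
    print(f"savings realized {rate:.0%}: "
          f"net increase {net_increase(1_000, rate):,.0f} tons per 1,000 credits")
```

At full realization the exchange is carbon-neutral; at anything less, every credit purchased adds net carbon, which is exactly the problem.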

Lots of market-type ideas for climate suffer from the same implementation shortcoming: by allowing shaky offset credits that will not achieve the desired results in the real world, they undermine the integrity of the emissions-reduction goal. That doesn’t mean we shouldn’t do these projects; it means that we shouldn’t fund them through offset credits that will end up increasing carbon emissions. Other strategies — like an energy efficiency resource standard to prompt investment — can promote the desired funding without undermining carbon reduction.

Regulations must be designed to produce better results in the real world, which is the only place that counts. Next Gen is particularly essential for climate rules, where we cannot afford to fall substantially short of the goal because of widespread, and entirely predictable, implementation fails. For climate, that can make the difference between having a chance and not having one. There are many exceedingly difficult problems to tackle in climate change; we can’t be fumbling on the comparatively simple ones like cutting climate-forcing emissions from oil and gas operations and electric-power generation. We know there are ways to get to a far better outcome. We just have to decide to use them. TEF

The ideas in this article are drawn from the author’s in-depth series “Next Generation Compliance: Environmental Regulation for the Modern Era” posted on the Harvard EELP website — C.G.

LEAD FEATURE Widespread, serious violations are the norm for most environmental rules. A Next Generation Compliance approach to regulations can help deliver promised benefits — especially for climate rules, where we cannot afford implementation collapse.

International Conspiracy Promotes Sustainability Along With Equity
Bruce Rich - Environmental Law Institute

Last February 13, the Indian government arrested an apparently highly dangerous young woman in the southern city of Bengaluru (Bangalore). Security forces transferred her immediately to Delhi to appear before a court the very next day. Police alleged that 21-year-old Disha Ravi contributed to a coordinated international conspiracy “to wage economic, social, cultural, and regional war against India.”

In 2019, Ravi co-founded the Indian branch of Greta Thunberg’s climate protest movement, Fridays for Future. Over the past three years she has been involved in campaigns to protect the endangered lion-tailed macaque, delay a dubiously planned hydroelectric dam, and protest the proposed weakening of India’s environmental assessment law.

Early in February, Thunberg tweeted her support for the hundreds of thousands of Indian farmers who have been protesting recent agricultural market liberalization laws since last year. The tweet included a toolkit that explained the farmers’ protest and listed typical nonviolent civil society tactics: tweeting, email alerts, pressing parliamentarians in activists’ home countries to raise questions with the Indian government, and so on. Ravi edited several sentences in Thunberg’s draft of the toolkit to clarify the farmers’ issues. Obviously a dangerous international conspiracy.

India is one of several major democratic countries, such as Turkey and Brazil, where nationalistic strongmen have come to power, all claiming to make their countries great (again). Attacking journalists; suppressing nongovernmental environmental, human rights, and social justice organizations; and weakening or declining to enforce laws that ensure public participation and transparency (such as environmental assessment) are all part of their toolkit.

Ravi was arrested under a British colonial 1870s sedition act (still on the books!) that provides up to life imprisonment for any “words, signs, or representations” that “attempt to incite disaffection toward the government.” She is in good company. The British Raj arrested Gandhi under the same law, which the Mahatma, who was also a lawyer, said “was designed to suppress the liberty of the citizen.”

In summer 2020, Ravi was one of the leaders of a nationwide protest against a draft revision of India’s environmental assessment law. The leading English language daily The Hindu condemned the draft for going “to great lengths to reduce or even remove public participation, and by extension expert opinion,” from the environmental approval process.

The draft includes a list of projects that would no longer require environmental clearance, including coal mining and seismic surveys for oil, methane, and gas on some lands. It limits public participation in reviewing pipeline infrastructure in national parks and wildlife sanctuaries and relaxes environmental rules for roads and highways. During the public comment period three environmental groups — Fridays for Future, Let India Breathe, and There Is No Earth B — helped mobilize over two million protest emails from Indian citizens to the environment ministry. In response the government shut down the websites of the organizations for weeks and threatened to arrest environmentalists under anti-terrorism laws.

Ravi recounts that her family and many others in India are already suffering from the effects of climate change. The granddaughter of farmers, she sympathizes with their plight. Farmers account for 58 percent of India’s 1.3 billion people. They suffer from increased climate extremes of drought and flooding, as well as from the economic threat of rapid market liberalization of agriculture, which they maintain will dispossess them and benefit large agribusinesses. “Instead of being supported to become self-reliant and prosperous,” Thunberg wrote, “a majority . . . are increasingly being subjected to the control of large corporations and international institutions whose sole focus is profits, and necessarily involves increased exploitation of nature.”

Thunberg’s and Ravi’s message links environmental sustainability, climate action, and broadened political and economic democracy. It appears to be a threat for some nationalistic governments. Yet India needs these activists for a sustainable future. The 2020 Yale-Columbia Environmental Performance Index, which examines 180 nations on environmental health and ecosystem vitality according to 32 performance indicators, ranked India at 168, near the bottom. India’s air quality ranks 179 (only Pakistan’s is lower), and 42 of its major rivers are so contaminated with heavy metals that they threaten human health.

Ravi was released on bail following a deluge of protests from prominent Indian academics and political figures, including diplomats, former finance and environment ministers, and the economist and author Mihir Sharma, who tweeted “Ok now we’re arresting 21-year-old climate activists. Well done India, you big superpower you.”


Decarbonizing the U.S. Economy Has Substantial, Impressive Benefits
Joseph E. Aldy - Harvard Kennedy School

President Biden has called for reducing net emissions of U.S. greenhouse gases to zero by 2050. The ambitious nature of this goal reflects a growing understanding of the significant risks that climate change poses to our well-being. The good news is that such rapid decarbonization of the economy would deliver dramatic benefits: the International Monetary Fund estimates that U.S. carbon dioxide emissions last year will cause nearly $200 billion in climate-related damages.

The IMF calculation builds on the pioneering work of William Nordhaus of Yale University, in which he estimated the Social Cost of Carbon as the present-day value of the economic damages of an additional metric ton of carbon dioxide emitted to the atmosphere. Since a single molecule of CO2, once emitted, could reside in the atmosphere for centuries, an economic assessment of this molecule must account for its impacts next year, and the following year, and for hundreds of years into the future. We calculate the SCC using models that integrate the globe’s economic, energy, and climate systems over the long run.

To inform its ambitious climate agenda, the Biden administration recently announced that it would use estimates of the SCC based on the integrated assessment modeling work of the federal government’s technical experts in 2016. The Biden White House also launched an interagency working group to update the SCC over the next year to reflect the latest insights from the research community.

Recent scholarship has advanced our understanding of climate change damage functions, such as how heat waves under a changing climate could increase premature mortality and reduce labor productivity, as well as how changing temperature and precipitation patterns could affect agricultural production.

Economists use a percentage called the discount rate to monetize future benefits in present-day terms. Recent understanding of discount rates — as revealed by behavior in financial markets — suggests that lower rates should be employed, increasing the present value of future benefits. The updated damage functions and the lower discount rates would each result in integrated assessment models producing higher estimates of the SCC.
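
The leverage of the discount rate is easy to see in a toy calculation. The Python sketch below is a stylized stand-in for the integrated assessment models, assuming a flat, hypothetical damage stream of $1 per year for 300 years from one extra ton of CO2:

```python
# Stylized illustration (not the federal integrated assessment models):
# present value of a flat stream of climate damages from one ton of CO2,
# computed under several discount rates.

def present_value(annual_damage: float, years: int, rate: float) -> float:
    """Sum of discounted annual damages over the given horizon."""
    return sum(annual_damage / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical: $1 of damage per year for 300 years (CO2 persists for centuries).
for rate in (0.07, 0.05, 0.03, 0.02):
    print(f"discount rate {rate:.0%}: present value ${present_value(1.0, 300, rate):,.0f}")
```

Dropping the rate from 7 percent to 2 percent more than triples the present value, which is why the choice of discount rate dominates SCC estimates.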

This metric can play several key roles in public policy. First, some statutes require regulatory agencies to set standards based on explicit consideration of the regulation’s benefits and costs, such as for the energy efficiency of appliances and the fuel economy of cars and light trucks. Indeed, the George W. Bush administration first used an SCC in evaluating regulations in response to a 2008 federal court ruling that remanded a Department of Transportation fuel economy standard because it had initially failed to account for the benefits of reducing carbon dioxide in setting the standard.

Second, regulatory agencies must conduct regulatory impact analyses of their major rules. For those regulations that reduce carbon dioxide emissions, the SCC can illustrate how the benefits justify the costs. Public communication of these results can enhance understanding of the serious risks posed by climate change as well as demonstrate that a given regulation represents a good investment on behalf of the American people.

Finally, the SCC can also inform the design of new policies. For example, some economists have advocated for an economy-wide carbon tax set equal to the SCC. If new legislation set a carbon tax equal to the Biden administration’s interim SCC of $50 per ton, it would result in a tax on gasoline of about 45 cents per gallon and a tax of about 5 cents per kilowatt-hour of coal-fired power. Economic models suggest such a carbon tax would halve economy-wide U.S. carbon dioxide emissions by 2035.
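
The conversion behind those figures is simple arithmetic. In this sketch the emission factors are standard approximations supplied for illustration, not numbers from the article:

```python
# Back-of-envelope conversion of a $50/ton CO2 tax into familiar units.
# Emission factors are rough standard approximations (assumptions, not
# figures from the article).

TAX_PER_TON_CO2 = 50.0            # dollars per metric ton of CO2
GASOLINE_KG_CO2_PER_GALLON = 8.9  # approximate emissions from burning a gallon
COAL_KG_CO2_PER_KWH = 1.0         # approximate emissions per kWh of coal power

gasoline_tax = TAX_PER_TON_CO2 * GASOLINE_KG_CO2_PER_GALLON / 1000
coal_power_tax = TAX_PER_TON_CO2 * COAL_KG_CO2_PER_KWH / 1000
print(f"gasoline: ${gasoline_tax:.2f} per gallon")   # about 45 cents
print(f"coal power: ${coal_power_tax:.3f} per kWh")  # about 5 cents
```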

A federal clean electricity standard could require increasing shares of renewable and other zero-carbon power in the electricity sector. Building on the experience under some states’ renewable portfolio standards, a national clean electricity standard could include an alternative compliance payment that would effectively cap compliance costs. This could provide insurance that the costs of the policy — and hence the increase in utility rates — do not become unexpectedly high if there is insufficient supply of clean power. The SCC could serve as the basis for setting such an alternative compliance payment.
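
As a minimal sketch of that mechanism (all numbers hypothetical), a utility short of clean-energy credits pays the lesser of the market credit price and the alternative compliance payment, so the ACP acts as a price ceiling:

```python
# Hypothetical sketch of an alternative compliance payment (ACP) capping
# compliance costs under a clean electricity standard.

def cost_per_mwh_short(credit_price: float, acp: float) -> float:
    """A utility covers each MWh of shortfall at the cheaper of the
    market credit price or the ACP."""
    return min(credit_price, acp)

ACP = 40.0  # hypothetical cap, $/MWh; the article suggests the SCC could set it
for price in (25.0, 40.0, 90.0):
    print(f"credit price ${price:.0f}/MWh -> utility pays "
          f"${cost_per_mwh_short(price, ACP):.0f}/MWh of shortfall")
```

However high credit prices climb, the utility's exposure is capped at the ACP, which is what keeps rate increases from becoming unexpectedly large.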

The SCC can also play a role in international diplomacy. It signals to the rest of the world that the United States accounts for the global benefits of its greenhouse gas emission reductions. If other countries reciprocate — each taking actions that also reflect the global benefits of doing so — then the world can make meaningful progress on the goals set forth in the 2015 Paris Agreement limiting future temperature increases.


Can the Impossible Burger Lower Municipalities’ Carbon Footprints?
Linda K. Breggin - Environmental Law Institute

Thousands of cities have joined initiatives such as the Global Covenant of Mayors for Climate and Energy, pledging to reduce their carbon footprints. Cities are pursuing a range of actions to reach their targets but until recently have largely ignored measures to advance plant-based proteins — despite the conclusion of the Intergovernmental Panel on Climate Change and other experts that reducing meat consumption plays a key role in addressing climate change.

Recently, however, municipalities have been paying more attention to the mitigation potential of plant-based proteins — not only in their climate action plans but in other governance tools as well. Measures range from procurement practices to Meatless Monday campaigns to education and outreach.

This trend coincides with the rollout of what the Good Food Institute calls the “next-generation of plant-based meat,” which “looks, cooks, and tastes like conventional meat.” These products, which “biomimic” meat, appeal to most consumers and are being successfully marketed in stores and restaurants.

The climate mitigation payoff can be substantial. Life-cycle assessments find that the Impossible Burger and its competitor the Beyond Burger stack up well against their beef counterparts, with 89 percent lower greenhouse gas emissions. Even small adjustments can be significant. According to the Green Cincinnati Plan: “If 10 percent of Cincinnatians ate meat one less day per week . . . [carbon] emissions would be reduced by 75,000 tons per year.”
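
In per-burger terms, the 89 percent figure works out as in this back-of-envelope sketch; the beef baseline is an assumed illustrative value, not one taken from the cited assessments:

```python
# Back-of-envelope use of the 89 percent reduction cited above. The beef
# baseline is a hypothetical illustrative value, not from the cited LCAs.

BEEF_KG_CO2E_PER_BURGER = 3.0  # assumed baseline for a conventional burger
REDUCTION = 0.89               # from the life-cycle assessments cited above

plant_based = BEEF_KG_CO2E_PER_BURGER * (1 - REDUCTION)
print(f"beef burger: {BEEF_KG_CO2E_PER_BURGER:.2f} kg CO2e")
print(f"plant-based burger: {plant_based:.2f} kg CO2e")
```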

Cost-savings are also a factor. A Friends of the Earth study finds that the Oakland Unified School District “slashed the carbon footprint of its food service 14 percent by reducing its purchases of animal products by 30 percent and replacing them with plant-based proteins and more fruits and vegetables.” In addition, there are substantial health benefits in eating less meat.

Nevertheless, cities have been slow to promote plant-based proteins in part because of the practical and political challenges associated with convincing residents to change their diets. The global sustainability organization ICLEI USA’s Angie Fyfe notes that it is “really hard to impose restrictions on diets” and flags “the perils of dictating what people can eat,” citing former New York City Mayor Bloomberg’s efforts to ban large sugary drinks. As a result, she observes that cities tend to “look to options that incentivize” rather than mandate changes.

To be sure, it is easier for cities to introduce plant-based proteins in their own operations, such as hospitals and prisons. However, emissions associated with food consumption are not typically considered in calculating a city’s carbon footprint — unless the food is produced within its geographic boundaries. Because meat is not commonly produced in cities, municipalities may be more likely to focus on reducing the emissions they are required to report.

Despite these barriers, cities are moving forward — driven in part by cost savings, public health, and sustainability goals. Among their approaches is exercising the power of the purse. In their Municipal Guide to Climate-Friendly Food Purchasing, FOE and the Responsible Purchasing Network emphasize that procuring less meat in operations is “a triple win for community well-being, local budgets, and the planet” — and can also motivate the private sector to take similar actions.

According to the guide, several municipalities include “climate friendly food procurement” measures in their climate action plans, including Portland and Eugene, and some specifically address reduced meat consumption, such as Santa Monica (15 percent reduction target for meat and dairy purchases) and Carrboro (50 percent target for emissions reductions associated with meat consumption). And cities like Boulder, Portland, San Diego, and Philadelphia provide guidance on offering plant-based meat alternatives at municipal facilities.

Procurement standards for plant-based proteins also can be incorporated into broader sustainable purchasing and healthy food standards. For example, numerous cities follow the Good Food Purchasing Program’s environmental sustainability and animal welfare standards, which include a strategy to promote “plant-forward menus” with smaller portions of animal proteins.

In addition, a large number of municipalities, and in some cases their school districts, are adopting various forms of Meatless Monday campaigns pursuant to proclamations, resolutions, policies, and climate action plans.

Education and outreach initiatives are also common. Iowa City’s climate action plan supports efforts to promote the benefits of a “plant-rich diet,” and Carrboro’s plan includes outreach to residents on climate-friendly diets.

Innovative governance measures to advance plant-based proteins, coupled with omnivore-friendly products, may mark an inflection point in addressing a seemingly intractable climate mitigation challenge.

Can the Impossible Burger Lower Municipalities’ Carbon Footprints?

Bioengineering the Future
Author
David Rejeski - Environmental Law Institute
Mary E. Maxon - Lawrence Berkeley National Laboratory
Environmental Law Institute
Lawrence Berkeley National Laboratory
Current Issue
Issue
2

In 1898, the British chemist William Crookes gave a talk before the British Association for the Advancement of Science entitled simply, “The Wheat Problem.” Crookes is best remembered for his work on vacuum tubes and on lenses that were precursors to today’s sunglasses, so his focus on wheat production probably startled his audience — especially since his thesis was alarming: wheat was extracting more nitrogen from the soil than we could replenish, resulting in ever lower yields and “a life and death question for generations to come.”

It took another decade, but in 1908, the German chemist Fritz Haber (later referred to as the “father of chemical warfare”) provided a solution to the wheat problem by demonstrating that ammonia, the main component for nitrogen fertilizers, could be synthesized. The manufacturing of ammonia for fertilizer is one of the great innovations of the 20th century. Some researchers estimate that its introduction in agriculture has since supported over 40 percent of global births.

But, as has been the case for many technological leaps, there were downsides. Today, the synthesis of ammonia accounts for a quarter of the annual greenhouse gas emissions of the entire chemical sector and contributes to the growing nitrogen pollution of waterways through agricultural runoff. Other options are being explored — from synthesizing ammonia using plasma to low-temperature electro-catalysis — but the most intriguing solution is biological.

Some plants, mainly legumes like beans, have microbial partners with an amazing capability to extract and “fix” nitrogen directly from the atmosphere for immediate use by plants. What if that genetic function could be transferred directly to plants like corn? That is exactly what is happening. Hundreds of millions of dollars are being poured into new approaches, and firms like Pivot Bio are working toward plants that they hope will one day be self-fertilizing, addressing both environmental and food security challenges.

Over the last decade, more than $12 billion has been invested in new biotech startups and existing companies, with around $4 billion put forward in 2018 alone. The pandemic has riveted our attention on health care applications, but as a recent report from McKinsey notes, “More than half of the potential direct economic impact from biological technologies . . . is outside of health care, notably in agriculture and food, materials and energy, and consumer products and services.”

Some of these emerging applications you may have already heard about, or even tasted. Memphis Meats and Mosa Meat are growing beef, pork, chicken, and even duck meat from cultures in the lab, just two of the over 80 companies now working on cultured meat and seafood protein products using a process broadly referred to as cellular agriculture. These approaches are being applied to a broad spectrum of dietary products. Finless Foods, for example, is applying cellular agriculture technologies to grow fish cells in the lab. It isolates cells from fish tissue, feeds the cell cultures with nutrients to grow and multiply, and structures them into seafood products — all in local facilities, which further reduce transportation-related environmental impacts.

As another example, researchers at the Joint BioEnergy Institute, funded by the Department of Energy, have recently developed a plant biomanufacturing platform that was used to synthesize a new-to-nature biopesticide with novel antifungal properties. This suggests that plants can be used to sustainably manufacture molecules not possible with traditional chemical methods.

All of this is just the tip of a revolution in what is termed engineering biology, and it signals a shift from chemical to biological synthesis — to a new manufacturing paradigm. An inventory maintained by ELI to track emerging biotech products and applications now contains over 300 examples stretching across almost two dozen categories, from food to fuel to threat detection.

People are beginning to build with biotechnology. The sustainable building materials startup bioMASON combines microorganisms with sand in an aqueous solution to create bricks and other construction materials, a process that is not only faster than traditional kiln firing but also releases no carbon because it requires neither fuel nor heat. Traditional brick making not only emits CO2 and other gases into the atmosphere, but often involves the removal of agriculturally productive topsoil, a practice that can reduce agricultural yields by 60-90 percent. Another innovative and sustainable materials startup, Cruz Foam, uses one of the most abundant natural polymers on Earth, chitin from shrimp shells, to sustainably manufacture packaging materials, automotive parts, and consumer electronics.

Novel solutions to tackle indoor air pollution are in the pipeline. Researchers at the University of Washington have inserted a mammalian gene (CYP2E1) into ivy plants to increase their detoxifying potential. The gene “codes” for an enzyme that breaks down some of the volatile organic compounds found in homes. The researchers estimate that a biofilter made of these genetically modified plants could deliver clean air at rates similar to commercial home filters.

Next-generation biotech firms are exploring new avenues to address old, intractable environmental challenges. A new effort at Allonnia, backed by Gates Ventures and the Battelle Memorial Institute, will search for enzymes or microbes that could tackle the long-lasting risks from so-called “forever” chemicals — per- and poly-fluoroalkyl substances found in thousands of nonstick, stain-repellent, and waterproof products.

Biotech is starting to provide promising solutions aimed directly at the global carbon cycle that could help address the 37 gigatons of carbon dioxide released annually into the atmosphere, creating carbon-neutral or decarbonization options for a number of economic sectors, such as agriculture, construction, and some forms of transportation — aviation, for example — that are less amenable to the adoption of traditional carbon-neutral strategies. Aviation currently accounts for 2 percent of global carbon emissions, and weight restrictions on aircraft rule out many of the other carbon-neutral options being considered for the transportation sector, such as electric motors or fuel cells. But researchers at the University of Manchester in England have re-engineered the genome of a bacterium (Halomonas) that grows in seawater to produce next-generation bio-based jet fuels.

Research is also targeting direct interventions in the carbon cycle by increasing the carbon-capture efficiencies of plants and trees. Today, terrestrial photosynthesis removes around 120 gigatons of carbon annually, so even small improvements could have large impacts on carbon removal while simultaneously improving crop yields and food security. Research is underway to redesign photorespiration and CO2 fixation pathways, optimize light reactions during photosynthesis, and transfer carbon-concentration mechanisms from algae and bacteria into other plant chloroplasts.
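A rough scale comparison shows why. The key step is the unit conversion (44 grams of CO2 contain 12 grams of carbon); the sketch below uses only the approximate figures already cited in the text.

```python
# Why small gains in photosynthetic efficiency matter at global scale.
TERRESTRIAL_UPTAKE_GTC = 120   # annual photosynthetic uptake, gigatons of carbon
EMISSIONS_GTCO2 = 37           # annual CO2 emissions, gigatons of CO2
emissions_gtc = EMISSIONS_GTCO2 * 12 / 44   # convert CO2 to carbon: ~10 GtC

gain = 0.01 * TERRESTRIAL_UPTAKE_GTC        # a 1 percent uptake improvement
print(f"1% more uptake = {gain:.1f} GtC/yr "
      f"vs. {emissions_gtc:.1f} GtC/yr emitted")  # ~1.2 vs. ~10.1
```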

Biotech is creating new avenues for climate change adaptation — for instance, the engineering of drought- and disease-resistant crops. Researchers at the Innovative Genomics Institute at Berkeley have developed cacao plants engineered to thrive as the climate warms and dries the rain forests where the crop normally grows. As many as 50 million people worldwide make their living from the industry.

Long term, biology can be a key to creating a circular economy, where decentralized and distributed biomanufacturing systems are designed to use a variety of inputs. These include chemicals from industrial off-gases; syngas generated from municipal solid waste, organic industrial waste, forest slash, and agricultural waste; or reformed biogas. These systems provide a variety of outputs, from fuels to food or vaccines. This kind of production flexibility is one objective of the new BioMADE initiative developed by the Department of Defense and the Engineering Biology Research Consortium. The seven-year award includes $87.5 million in federal funds and is being matched by more than $180 million from non-federal sources, including state governments.

This future rests on the increasing ability to engineer biology to enable what researchers at the firm Zymergen have coined biofacturing. Jason Kelly, the CEO of Ginkgo Bioworks, predicts, “As we get better at designing biology, we’ll use it to make everything, disrupting sectors that the traditional tech industry hasn’t been able to access.”

Old biotech was messy, expensive, and imprecise. It would often take large companies hundreds of millions of dollars and years to change the properties and behavior of one molecule. No more. To paraphrase Stanford University economist Paul Romer, the new biology is about better recipes, not just more cooking.

Today’s biology goes beyond the “study of complicated things,” as the British evolutionary biologist Richard Dawkins once put it. Over a decade of significant investments by organizations like the National Science Foundation, the Department of Energy, and the Defense Advanced Research Projects Agency have turned biology into what some have termed a Type 2 innovation platform, similar to the Internet, which “consists of technological building blocks that are used as a foundation on top of which a large number of innovators can develop complementary services or products.” Think of today’s biology not as a science, but as a precision-manufacturing platform — digitally interconnected, increasingly automated, flexible, and cost-effective.

These novel biological engineering approaches share one critical characteristic — the ability to run experiments quickly, testing hypotheses, learning, and adjusting — what some have termed the Design-Build-Test-Learn cycle. Making things faster has been lauded as the single most important determinant of manufacturing productivity and was historically a critical focus of companies such as IBM (via Continuous Flow Manufacturing), Motorola (Short Cycle Management), and Westinghouse (Operating Profit Through Time and Investment Management). Jack Newman, a co-founder of the biotech firm Amyris, observed that the DBTL cycle “was transformational, allowing the operational translation of fundamental science into stuff.”

These new capabilities have spawned radically new business models, allowing the disaggregation of the historical value chains that have long dominated medical and agricultural biotech. This is happening even at a time when large first-wave biotech firms are tending toward consolidation, bordering on monopolistic aggregation, such as the recent mega-merger of Monsanto and Bayer. But simultaneously, what some term de-verticalization is creating viable business niches in new economic ecosystems, where many new firms work to design the molecules that can be scaled by larger firms downstream in the value chain.

But going to scale remains a large challenge facing the bioengineering community. This will mean moving from a few milligrams of a novel microbe in the lab to kilograms, kilotons — and beyond in the case of commodity products. Going from lab to commercial-scale production will require a bridge, a distributed and sharable infrastructure that can be co-developed with industry. It will need a new workforce with the necessary skills to engineer large-scale, distributed, and flexible production facilities and the ability to build life cycle and sustainability considerations into manufacturing processes and their associated supply chains.

And going to scale with potentially hundreds or thousands of large-capacity bioreactors will bring the new biotechnology face-to-face with the public and media, raising questions about safety, security, and governance. Moving forward, there is an urgent need for regulatory and policy reinvention. There is an old adage in Silicon Valley that innovation requires a combination of “rich people,” “nerds” and “risk taking.” That may not be enough. There are some important ways in which biology differs from other innovation platforms. The most crucial are the regulatory, security, and public perception barriers that may hinder the introduction of new products into the market.

Regardless of these challenges, over a decade of progress and emerging business opportunities have motivated many countries to develop bioeconomy strategies designed to expand their industrial base and accelerate the commercialization of biotech innovations. There are now nearly 60 bioeconomy strategies for nations and for a number of macro-regional areas like the European Union and East Africa. Thousands of people now attend the biennial Global Bioeconomy Summit held in Berlin (virtual this year). The United States was an early leader, developing a government-wide National Bioeconomy Blueprint in 2012 under the Obama administration. It emphasized the role of the biosciences and biotechnology in creating new economic opportunities.

The 2012 Blueprint was the first, and for the better part of a decade the only, bioeconomy strategy that featured biotechnology as a critical platform technology to drive economic benefits in the biomedical, agricultural, environmental, energy, and industrial sectors. The Blueprint promotes making strategic and non-overlapping research and development investments, facilitating transitions from lab to market, increasing regulatory efficiency, enabling public-private partnerships, and supporting strategic workforce development. In the years that followed the release of the Blueprint, the Obama administration realized a number of outcomes relating to all five of its strategic objectives.

For instance, significant research investment enabled the discovery of CRISPR/Cas9, which became a genome-editing technology that has significantly accelerated the ability to quickly and precisely edit the genomes of microbes, plants, and animals. The Department of Agriculture expanded the BioPreferred Program, the federal biobased procurement system that aims to provide market certainty for the growing industry sector. Then in 2015, Executive Order 13693, titled Planning for Federal Sustainability in the Next Decade, required federal agencies to set biobased procurement targets. The Office of Science and Technology Policy convened the Food and Drug Administration, EPA, and USDA to execute the 2017 Update to the Coordinated Framework for the Regulation of Biotechnology, aimed at increasing transparency, ensuring safety, streamlining regulatory processes, and accelerating the translation of bioinventions to market. There was also a successful public-private partnership between LanzaTech and Pacific Northwest National Laboratory that resulted in the development and testing of the first bio-jet fuel, used to power a Virgin Atlantic flight from Orlando to London. Finally, in addition to the launch of a technical roadmap in 2019, progress has been made toward the Blueprint’s workforce objective through a public-private partnership known as the Engineering Biology Research Consortium, which established a four-month industry internship program for Ph.D. candidates to help train the next-generation workforce for engineering biology.

Since the National Bioeconomy Blueprint was released, a number of additional important advances have occurred. In 2019, the House of Representatives passed the Engineering Biology Research and Development Act of 2019, directing the Office of Science and Technology Policy to implement a national engineering biology research and development program that would coordinate relevant federal agency investments and activities. The Senate followed with the Bioeconomy Research and Development Act of 2020, with a similar aim. Also in 2020, the National Academies of Sciences, Engineering, and Medicine released a study, “Safeguarding the Bioeconomy,” that articulated — for the first time — the value of the U.S. bioeconomy, which it estimated at $959 billion annually. The report argued that the United States needs a White House-level standing committee of scientists, economists, and national security experts to develop a strategic plan to promote and protect the country’s biology-based industry.

These actions portend a future wherein a strategic, coordinated federal effort is possible. Toward this end, additional steps are needed. For instance, the Biden administration should consider creating an office to coordinate interactions between the government and businesses, large and small, on bioengineering. It should be a one-stop shop — similar to what the National Nanotechnology Coordination Office did for the National Nanotechnology Initiative.

To realize a strategic, coordinated U.S. bioeconomy, policymakers will need to advance not only authorization for a national engineering biology research and development program, but also appropriations to fund it. Any appropriations should be linked to regular evaluation of program impacts and proactive anticipation and management of emerging risks to help ensure public confidence in new and novel products and applications. A recent meta-analysis of the national bioeconomy strategies found that, “Only a minority . . . even mention the potential negative consequences of bio-based transformations.”

Significant strategic infrastructure investments are needed. For example, a new constellation of state-of-the-art, networked biomanufacturing facilities, positioned near sources of biomass, could not only maximize the use of renewable resources but also create high-tech jobs in rural areas. Facilities in Iowa, for instance, could use agricultural waste from corn as a feedstock, those in southeastern states could utilize switchgrass, and coastal production plants could take advantage of marine species such as seaweed and various kelp varieties. This biomanufacturing “commons” could also serve to reduce greenhouse gas emissions and the generation of toxic waste as compared to traditional chemical manufacturing. And it would create value from problematic wastes such as forest slash and agricultural residues.

Building on the progress started by the National Bioeconomy Blueprint developed during the Obama administration, the incoming Biden team has a tremendous opportunity to renew the nation’s commitment to the U.S. bioeconomy as an important pillar of its climate agenda. Its new “Made in All of America” effort is aimed at revitalizing domestic manufacturing with inclusive policies and environmental stewardship.

Working together with the 117th Congress, the new administration has the potential to realize a Clean Manufacturing Act aimed at mobilizing the diverse talent of the American workforce, accelerating sustainable manufacturing innovation, maximizing the use of the billion tons of sustainable, renewable biomass the United States can produce, and significantly reducing the negative environmental impacts of manufacturing.

As nearly 60 countries around the world try to refine their bioeconomy strategies to include biotechnology to help reboot economies crippled by the coronavirus pandemic, the United States has little time to waste in developing strategies to keep its leadership position in biomanufacturing. Over a decade ago, Neri Oxman of MIT’s Media Lab observed that “the biological world is displacing the machine as a general model of design.” That revolution has happened. The future of manufacturing has arrived. TEF

COVER STORY A sustainable, circular economy may depend on solutions coming from life itself. So think of today’s biology not as just a science, but as a precision-manufacturing platform — digitally interconnected, increasingly automated, flexible, and cost-effective.

Imperatives for Action on Climate Change
Author
Christopher K. Carr - C2E2 Strategies LLC
C2E2 Strategies LLC
Current Issue
Issue
2

Today, green lawyering can often mean doing your best to address climate change issues. Indeed, many scientists and economists view significantly reducing greenhouse gas emissions as essential for the health of both the environment and our nation’s economy. The stated goal is reaching net-zero greenhouse gas emissions by mid-century, or “deep decarbonization.” Lawyers have a key role to play in achieving this objective.

Within our nation’s deeply divided politics, lasting solutions to climate concerns must be found, and durable solutions are likely to require bipartisan appeal. A just carbon transition must not disenfranchise Americans, including those in inner cities, Appalachia, and various industrial and rural areas.

Though I work in Washington, D.C., my personal heritage is rust-belt Ohio. The decimation of industrial centers in the 1970s was not pretty to live through, making the fairness of a carbon transition a matter of both societal importance and personal meaning. My perspectives here also come from lawyering in a variety of contexts: co-chairing a large law firm’s climate change practice, serving as a World Bank senior counsel on carbon finance, chairing the Climate Change and Sustainable Development committee of the American Bar Association, and working with an array of clients and lawyers.

Many carbon transformations can benefit a broad swath of Americans. Attorneys of all kinds should focus on these. The charge for lawyers — and policymakers — is to assess what can be done, learn the tool kit of policy and legal options, and bring about measures that are good for both the environment and economy.

Modeling and analysis of deep decarbonization in the United States point to several basic pillars: deep decarbonization of the electric sector (which is well underway), along with increased electrification and use of other lower-carbon fuels; improved energy efficiency; carbon sequestration through enhanced farming and forestry practices as well as geologic sequestration; and reduced emissions of other, more potent greenhouse gases such as methane and fluorinated compounds. These pillars were articulated in the federal government’s 2016 Mid-Century Strategy for Deep Decarbonization and other subsequent analyses.

In addressing these pillars, lawyers have many and varied opportunities to do good on climate change, often as natural extensions of their practices. Opportunities involve a broad array of legal work. The list includes project development, corporate advice, debt and equity, mergers and acquisitions, tax and securities law, litigation, energy and environmental regulatory issues, trade law, environment-social-governance advice, and various other aspects of advocacy. And opportunities arise in multiple economic sectors — electric power, transportation, manufacturing, tech, finance, agriculture, and others.

Focusing on what is economic and sustainable allows us to move far and fast. Difficult issues exist, to be sure, but proof of this practical approach lies in the dramatic carbon reductions in our nation’s power sector over the last 15 years, along with broad recognition that electricity can be largely decarbonized while delivering various clean air benefits. Reframing climate change challenges as goals such as clean energy can make issues more manageable and more likely to be agreed on, pairing metrics of investment returns, carbon reductions, and sustainability.

Greed is not good. Doing good and doing well is very good. Market-based carbon regulatory approaches can be highly effective in deploying human and financial capital and technology to beneficial ends. This goes beyond traditional carbon pricing measures, though these can be quite important, to other market-aware approaches. Combined with additional targeted regulatory measures, carbon regulatory tools that take advantage of markets are likely key if our nation is to durably address climate change and pursue sustainable development. Lawyers and other stakeholders should work together quickly to achieve these goals.

Imperatives for Action on Climate Change.

Society at Cluster of Inflection Points
Author
Stephen R. Dujack - Environmental Law Institute
Akielly Hu - Environmental Law Institute
Environmental Law Institute
Environmental Law Institute
Current Issue
Issue
2

The last 100 years have been the most important in all human history. And any fair reading of U.S. politics in particular shows we are again at a time when an informed society can choose directions.

A century ago, Albert Einstein’s theory of general relativity gained acceptance by other physicists after photographs of a total solar eclipse showed starlight bending as the result of gravity. At the same time, astronomer Edwin Hubble discovered the existence of other galaxies — and found they were moving away from each other, and us, in an expanding universe governed by Einstein’s gravitational force. The Big Bang theory would emerge a few decades later to explain the new astrophysics. For the first time, humanity had a firm foundation for understanding our place in the universe.

Also starting a century ago, physicists were finally able to determine the makeup and behavior of atoms. The new quantum mechanics was ultimately able to explain all of chemistry, the forces governing elementary particles, and the very existence of mass.

But even though scientists have shown how nature works on the large scale and the small, physicists have been frustrated in unifying the force of gravity governing the cosmos with the subatomic forces into a single explainable phenomenon. At the same time, theorists are stymied by astrophysicists’ discovery of dark matter and dark energy, verbal tags for phenomena that baffle scientific explanation. This mystery stuff accounts for 95 percent of the mass-energy of the universe — our galaxies, stars, and planets add up to only 5 percent of existence. Explaining all these problems will require a new physics.

The 1920s saw technology too enter the modern age. Within that decade, radio went from a lab phenomenon to national and global news and entertainment networks and a receiver in every household. Lindbergh flew the Atlantic solo and commercial aviation took off. Movies came out with sound, and households replaced the icebox and gas lamp with electric refrigerators and light bulbs. The automobile industry also exploded with the adoption of the assembly line, providing good employment and cars affordable to workers.

Of course the story of technology progressed markedly in the following century. Today, we all walk around with satellite-linked computers in our pockets, but we also face the dangerous effluvia of progress. This first came to a crisis 50 years ago, when technology put a man on the moon and collective endeavor was fashionable. That inflection point produced the federal pollution statutes and legislation preserving species and natural resources. The story of technological innovation in response to this crisis is important, but we have recently grasped that there are only a few decades left to totally eliminate greenhouse gas emissions. That will require an overhaul of the energy foundations of civilization. Like science, technology too requires a new paradigm.

In the world of U.S. politics, 100 years ago the Progressive Era came to an end and the country withdrew into isolationism, rejecting both the League of Nations and continued immigration. Progressivism had seen the nascence of federal control of the economy, busting trusts and creating the Federal Trade Commission.

But the Supreme Court simultaneously restricted government regulation of the market, notably in the 1905 decision Lochner v. New York. Its reasoning has been rejected by modern jurists, including the current chief justice, but it and related cases held sway until 1937, when Franklin D. Roosevelt’s New Deal collided with the Court’s thinking in West Coast Hotel Co. v. Parrish.

Indeed, Roosevelt’s 1933 ascendancy to the White House marked the beginning of a watershed: a rejection of the status quo thinking which, even after stocks collapsed and the world entered the Depression, professed that the market would resolve unemployment and the destruction of businesses. Americans rejected that line of thinking in electing FDR four times.

Roosevelt unleashed a number of measures to control and nurture the economy, such as the Securities and Exchange Commission, and to protect the citizenry, like Social Security, all viewed as legitimate and needed functions of government. World War II even saw Washington temporarily seize control of whole industries to support the war effort. Republicans to follow went along with federal economic leadership — look at Eisenhower and the Interstate system plus federal funding of education following Sputnik, or Nixon and the beginnings of modern environmental regulation 50 years ago.

Indeed, one can mark the rollout of environmental law in the 1970s as the climax of the New Deal, because the election of Ronald Reagan in 1980 brought in a new storyline. “Government is not the solution to our problem,” he said at his inauguration; “government is the problem.” The philosophy of tax cuts and relaxed regulation he unleashed has held sway for 40 years, even constraining Democrats.

But recent events show that politics too is again at an inflection point. All nations can benefit from U.S. leadership at a time of world emergency marked by the pandemic, climate change, biodiversity collapse, and economic decline. Politics’ new paradigm will be informed by what new science tells us and what new technology offers to counter the problems we face. It will also reflect today’s understanding of political and social rights and responsibilities amid the search for security, equity, and justice, and the role of government as a necessary means for realizing communitarian desires and as an arena and impetus for progress. A new compact will engage all sectors of society in solving problems we are, together, well-equipped to address.

Notice & Comment is written by the editor and represents his views.

How a “Space Station” Turned the Tide

When it comes to the climate crisis, some say we can innovate our way out. In his 1989 book The End of Nature, the writer Bill McKibben mused: “We may well be able to create a world that can support our numbers and our habits, but it will be an artificial world — a space station.”

McKibben may not have been too far off. In 1991, eight brave souls, donning Star Trek-esque blue spacesuits, entered an air-locked glass dome to live in a “space station” version of Earth. Biosphere 2 (modeled after Biosphere 1, Earth) was a two-year experiment located in Oracle, Arizona, containing three thousand species of plants and animals, as well as an artificial rain forest, ocean, savanna, marsh, desert, and agricultural zone.

The idea was dreamed up by a theatrical performance group that wanted to change the world. Edward Bass, heir to a Texas family oil fortune, joined the team and funded the $200 million needed to construct the massive facility. Their goal? Replicate life on Earth from scratch, in case we need to colonize Mars.

If imitation is the sincerest form of flattery, Planet Earth must have felt both honored and incredibly amused. In an article for Dartmouth Alumni Magazine adapted from his book on the experience, Mark Nelson, one of the original inhabitants of Biosphere 2, noted the group faced “healthy starvation.” Struggling to raise enough food, Biosphere 2 members subsisted mostly off of sweet potatoes. As the orange hue in their complexions grew, so did the public’s alarm.

Oxygen supply became a problem, plummeting to levels found at 15,000 feet of elevation and inducing altitude sickness. Ecosystems careened off course, with too many carbon dioxide-producing microbes and too few pollinating hummingbirds and bees. Household cockroaches and ants ran rampant.

The environment wasn’t the only thing off-kilter. The team developed what Nelson describes as “irrational antagonism,” and reported instances of members spitting at each other.

The group survived for two years inside, though not without a bit of help. Oxygen was pumped in, supplies were delivered, and one member was shuttled to a hospital for a finger injury. The experiment eventually dissolved over disagreements about management.

Today, Biosphere 2 serves as a tourism destination and research center. The facility has enabled significant findings on the impacts of climate change on tropical forests and microbial development.

In fact, when it comes to major ocean policies and discoveries, it may be the “failed” Biosphere 2 experiment that we have to thank. Studies conducted in the complex’s artificial ocean helped scientists understand the process of ocean acidification, in which an excess of dissolved carbon dioxide reduces the availability of carbonate minerals such as calcium carbonate. Without these minerals, the shells and skeletons of calcifying organisms dissolve. Coral reefs disappear, along with the millions of species that live in them.
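The chemistry at work is the standard carbonate equilibrium, summarized here for reference (a textbook sketch, not drawn from the Biosphere 2 studies themselves):

```latex
% Dissolved CO2 forms carbonic acid, which releases hydrogen ions:
\[ \mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-} \]
% The extra hydrogen ions tie up carbonate ions as bicarbonate:
\[ \mathrm{H^+ + CO_3^{2-} \rightleftharpoons HCO_3^-} \]
% With less carbonate available, calcium carbonate shells tend to dissolve:
\[ \mathrm{CaCO_3 \rightleftharpoons Ca^{2+} + CO_3^{2-}} \]
```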

Skeptics initially disregarded the findings due to Biosphere 2’s poor reputation, but eventually the scientific — and policy — community came around. In 2009, Congress passed the Federal Ocean Acidification Research and Monitoring Act, which created an Interagency Working Group on Ocean Acidification. The Center for Biological Diversity has long campaigned for EPA to address the issue under the Clean Water Act, but policymakers still need to figure out how to measure acidification and how to reconcile local impacts caused by a global problem.

Nelson proclaims, “Problems humans cause can also be solved by humans.” Whether or not you agree may depend on a close look at the story of Biosphere 2. 

Society at Cluster of Inflection Points.