Moving Past Environmental Proceduralism

If the activists of fifty years ago had known what we know now, what might they have done differently?
April 22nd 2024

This article originally appeared in Asterisk Magazine, Issue 5: “Mistakes.”

1970 was a landmark year for environmental legislation. While the U.S. environmental movement had been building for decades, a series of industrial catastrophes in the late 1960s coupled with growing awareness of the impacts of pollution turned the natural world into a top policy priority. 

As it was not yet a politically polarized issue, Democrats and Republicans alike jockeyed to prove how much they cared. The result was, in the words of environmental law professor Zygmunt J.B. Plater, “a parade of regulatory statutes…the likes of which we will probably never see again.” Congress passed the National Environmental Policy Act in 1969, the (second) Clean Air Act in 1970, the Clean Water Act in 1972, and the Endangered Species Act in 1973. In 1970, Richard Nixon created the Environmental Protection Agency through an executive reorganization plan. This raft of policies profoundly reshaped our society, and specific pieces contributed to the restoration of the natural world.

But what activists and legislators at the time saw as the most important of these new laws contributed relatively little to the way America tackled the greatest environmental challenges of the last half century. In many of the most notable successes, like phasing out the pesticide DDT or fixing the hole in the ozone layer, what moved the needle were “substantive” standards, which mandated specific outcomes. By contrast, many of the landmark statutes of the era were “procedural” laws, requiring agencies to follow specific steps before authorizing activities.

Today, those procedural laws make it much harder to build the new infrastructure needed to avert climate change and decarbonize. The laws created by the environmental movement now harm the environment. So how did we get here? What could the environmentalists of the 60s and 70s have done differently? And what does that tell us about environmental regulation going forward?

How we got here

The turn-of-the-century conservationists largely accepted a distinction between “sacred” and “profane” lands. Inspired by figures like Henry David Thoreau, who wrote that “In Wildness is the preservation of the World,” the movement aimed to keep certain lands wild in perpetuity, while encouraging industrial development in the urban core.

But in the post-war world, heavy industry and conservation began to appear fundamentally opposed. Use of pesticides greatly expanded. Smog billowed out of cities and covered vast swathes of the country, both sacred and profane. Silent Spring, published by Rachel Carson in 1962, drew attention to the negative effects of DDT, showing that the impact of toxic chemicals could ripple across the entire food chain. Carson’s book became an unexpected hit, selling more than a million copies in just two years. 

Where the original conservationists were motivated to protect wilderness far from civilization, the new environmentalists saw even urban landscapes being tarnished. David Brower, the prominent environmentalist and executive director of the Sierra Club from 1952 to 1969, was radicalized by changes to the surroundings of the San Francisco Bay area, where he grew up. “We built here for the view of San Francisco Bay and its amazing setting,” he said in 1956. “But today there is no beautiful view; there is hideous smog, a sea of it around us. ‘It can’t happen here,’ we were saying just three years ago. Well, here it is.” John Hamilton Adams, the first director of the Natural Resources Defense Council (NRDC), left his role as Assistant U.S. Attorney for the Southern District of New York after watching raw sewage float down the Hudson River. 

In 1960, a Gallup poll showed that just 1% of Americans saw “pollution/ecology” as an important problem. By 1970, 25% did. Rising environmental concern merged with a national climate of protest and activism. But although Silent Spring drew massive attention to environmental issues, newly minted activists struggled to change laws. Five years after the book’s publication, DDT remained legal in every state in the U.S. — so activists tried a different approach.

In 1967, attorney Victor Yannacone and his wife sued the Suffolk County Mosquito Control Commission for polluting Lake Yaphank in New York with DDT. Although this kind of suit was considered “just this side of bomb throwing” in polite circles at the time, Yannacone called litigation “the only way to focus the attention of our legislators on the basic problems of human existence.” Yannacone ultimately lost in court, but he secured a temporary injunction stopping the commission from using DDT, and the commission later decided to end its use. Legal pressure could bring results. 

Yannacone’s strategy to “sue the bastards” became a rallying cry among activists. Yannacone would go on to co-found the Environmental Defense Fund (EDF), which was quickly joined by the NRDC and the Sierra Club as environmental organizations focused on winning battles in the courts. But winning cases proved difficult. It was often hard to find litigants who had suffered specific harms and had the evidence to prove it.

Meanwhile, major environmental incidents kept piling up. In 1969, the Cuyahoga River caught fire in Cleveland, and a blowout on an offshore platform caused a huge oil spill off the coast of Santa Barbara, galvanizing the public with images of burning rivers and oil-covered seabirds. Congress and President Nixon faced intense pressure to do something. When Senator Gaylord Nelson introduced a bill to outlaw the use of DDT in 1965, he had failed to find a single co-sponsor. By 1970, 8,000 environmental bills were introduced in a single congressional session. The most significant of these new laws was actually passed the previous year: the National Environmental Policy Act, often referred to today as the “Magna Carta” of environmental law.

NEPA’s stated goal was to “encourage productive and enjoyable harmony between man and his environment, to promote efforts which will prevent or eliminate damage to the environment and biosphere and stimulate the health and welfare of man; [and] to enrich the understanding of the ecological systems and natural resources important to the Nation.” The courts found this sweeping language too vague to enforce, and federal agencies ignored it.

But a last-minute addition to the bill gave environmental activists an unexpected boon. The “environmental impact statement” provision imposed a strict procedural requirement: before taking any major action, federal agencies had to consider its effects on the environment and produce a “detailed statement” of the likely impacts. The requirement wouldn’t itself stop agencies from acting, merely delay them until they completed their detailed statement. It was added to NEPA at the suggestion of Lynton Caldwell, an advisor to the bill’s sponsor, Henry “Scoop” Jackson. Caldwell believed that without some “action forcing mechanism,” the high-minded ideals that NEPA espoused would amount to little.

This requirement originally received little notice. It was not covered in any major media publication. In Congress, it received “neither debate, nor opposition, nor affirmative endorsement.” Caldwell would later state that “most [members] had never really understood the bill and only agreed to it because it was from Jackson; it was about the environment which was a very ‘hot’ issue at the time; and it was almost Christmas and they wanted to get home.” 

Not until several months after NEPA was passed did environmental groups realize what a potent weapon they’d been handed. By forcing disclosure of the potential negative environmental effects of an action, NEPA sparked public opposition to projects that might otherwise have gone unnoticed. Agencies increasingly found it politically difficult to take actions that might harm the environment. Activists were able to force agencies to consider ever more environmental effects, and to stall projects until they did so.

The activist courts of the 1970s expanded NEPA’s remit. In a landmark 1971 decision, Calvert Cliffs’ Coordinating Committee v. Atomic Energy Commission, the D.C. Circuit ruled that NEPA’s procedural requirements “established a strict standard of compliance.” A government agency couldn’t simply produce a statement of environmental effects and then stick it in a filing cabinet. In what became known as the “hard look” doctrine, the statement had to be sufficiently integrated into the decision-making process to satisfy the courts.

The co-founders of the Natural Resources Defense Council would later say that “the importance of NEPA cannot be overstated” and that NEPA was “the core of our work.” Procedural laws like NEPA remain central to modern environmental activism. However, many of the new environmental laws passed in the 1970s were not procedural but substantive: instead of establishing a process that must be followed, they regulate what people and organizations are allowed to do. These include the Clean Air Act, which set specific limits on the allowable levels of several airborne contaminants, and the Clean Water Act, which prevents the discharge of pollution into surface waters. Procedural and substantive laws represent very different approaches to environmental regulation — and in the following decades, they’ve had very different consequences.

The pros and cons of proceduralism

There are practical reasons for procedural laws to dominate environmental regulation. Proceduralism is a flexible tool that can be adapted to circumstances as needed. The uncertain nature of science, and the time it takes to observe potential long-term effects, mean that regulations can take years to author. Meanwhile, industry introduces thousands of new chemicals every year, and constantly changes the formulation of older ones.

By contrast, substantive environmental laws that regulate specific actions or pollutants may have a hard time keeping up as the world changes. When the EPA began to write regulations for 65 toxic water pollutants, the process took 11 years. Afterward, it was forced to write another round of regulations for toxic chemicals that had been introduced in the interim. And while substantive laws are often reactive, addressing problems as they are revealed, procedural laws force agencies to address environmental impacts that could happen in the future.

However, this very flexibility is the weakness of procedural regulation. Because it can be used against anything, laws like NEPA end up being used against everything. Every project faces some opposition, and laws like NEPA empower opponents at the expense of everyone else who would benefit. Because NEPA’s provisions are purely procedural, it can’t be used to stop projects outright. But it can slow them down (by requiring ever-more detailed environmental impact statements) to the point where it becomes infeasible to pursue them, a strategy on which activists have increasingly relied. The Forest Service estimates that some forms of review cost one million dollars each. 

The fact that NEPA could be leveraged to block almost any new construction suited the increasingly “anti-growth” wing of the 1960s environmental movement. The 1968 book The Population Bomb, which argued for forced sterilization to limit population growth, was written at the suggestion of David Brower. In the 1970s, activists began to oppose new housing development in California and other states on environmental grounds. The co-founder of Greenpeace International, Rex Weyler, would go on to become an advocate of “degrowth,” a movement dedicated to “reduction in production and consumption” in the name of environmental justice.

In some cases, the consequences of long reviews have been richly ironic. Delays in the NEPA review of prescribed burns in the Six Rivers National Forest resulted in the very wildfire the burns were meant to prevent. And as more procedural lawsuits take place, procedural requirements get ever stricter, increasing the permanent tax on new building and government actions.

This procedural burden weighs especially heavily on new players and novel technologies that haven’t had time to work the system. Oil and gas drilling companies, for instance, have been granted several NEPA exceptions that reduce the requirements for things such as drilling exploratory wells. Geothermal energy, on the other hand, does not receive such exceptions, even though it involves drilling wells like those for oil and gas. Procedural requirements, like most regulations, heavily favor the incumbents.

Stopping climate change requires building hundreds of billions of dollars of new infrastructure. Procedural regulation makes that task far more difficult. Going forward, we can draw on lessons from some of the biggest environmental wins of the last fifty years. A broad range of successes have been built on substantive regulations, from the elimination of leaded gasoline to the curbing of acid rain and the closing of the hole in the ozone layer.

Lead poisoning

Multiple studies show a relationship between lead exposure and higher crime rates. Lead poisoning also leads to birth defects and slowed growth and development. After the Flint water crisis, fertility rates in Flint decreased by 12% and birth weights fell by 5.4%. In Rhode Island, housing remediation to remove lead paint accounted for roughly half of the decline in the black-white test score gap.


One of the EPA’s first major actions, under the authority of the Clean Air Act of 1970, was to mandate the phase-out of lead in gasoline. In 1973, the EPA required refiners to begin producing unleaded gasoline, and new vehicles were required to run on unleaded fuel beginning with model year 1975.

The EPA also introduced a unique trading program to help smaller refineries manage the high costs of compliance. Traditional cap-and-trade systems allocate a fixed number of permits tied to emissions, incentivizing producers to find cleaner substitutes. Instead, the EPA allowed refineries to earn lead credits by producing gasoline with lower lead content than the upper threshold permitted. The credits could be sold or banked for future use. This approach incentivized refineries to reduce lead content sooner than required, facilitating a quicker phase-out of leaded gasoline. The program both achieved its environmental goals and proved economically efficient, encouraging the development of cost-saving technologies and demonstrating that a tradable emission rights system was viable.

Simultaneously, the Lead-Based Paint Poisoning Prevention Act was passed in 1971, prohibiting the use of lead-based paint in federal buildings and projects. By 1978, the Consumer Product Safety Commission banned the sale of lead-based paint for use in residences and in products marketed to children. As a result of these policy efforts, the average lead concentration in the blood of children in the U.S. fell by 95% between 1978 and 2016.

Lead concentrations measured in blood samples from children aged one to five years old. Children are particularly vulnerable to lead poisoning and can suffer permanent impacts on the development of the brain and nervous system. Source: Our World in Data

DDT and CFCs

DDT (dichlorodiphenyltrichloroethane) is a synthetic chemical compound that was widely used as an insecticide. It was first synthesized in 1874, but its effectiveness as an insecticide was not discovered until 1939. DDT was extensively used during World War II to control malaria and typhus among civilians and troops. Following the war, it became widely used in agriculture as a pesticide. The chemical compound proved so useful that its discoverer — a Swiss scientist named Paul Müller — was awarded a Nobel Prize in 1948.

But as scientific evidence of DDT’s toxicity to humans and to wildlife continued to mount, Nixon’s EPA began a review of DDT’s use. After a lengthy set of hearings and legal battles, EPA Administrator William Ruckelshaus announced in 1972 the cancellation of all DDT use in the United States, except for emergency health-related uses and certain other exceptions.

While DDT was a significant local pollutant, the world faced a truly global threat during the ozone depletion crisis in the late 20th century, caused primarily by the release of chlorofluorocarbons (CFCs) and other ozone-depleting substances (ODS). These chemicals, found in products like refrigerants and aerosol sprays, would rise to the stratosphere when released. There, they were broken down by ultraviolet (UV) radiation, releasing chlorine atoms that destroyed ozone molecules. The ozone layer is crucial for life on Earth as it absorbs most of the Sun’s harmful UV radiation.

In 1974, scientists Mario Molina and Sherwood Rowland demonstrated the damaging impact of CFCs on the ozone layer, leading to increased scientific investigation and public concern. The discovery of the Antarctic ozone hole in 1985 served as an alert to the scale of the problem, leading to an unprecedented international response.

The primary global effort to combat ozone depletion was the Montreal Protocol, adopted in 1987. This international treaty aimed to phase out the production and consumption of ODS, including CFCs. It has been signed by 197 countries and is the first treaty in the history of the United Nations to achieve universal ratification. The agreement has undergone several amendments to include new substances and accelerate phase-out schedules. Thanks to the Montreal Protocol, the levels of ODS in the atmosphere have significantly decreased, and the ozone layer is gradually recovering.

Acid rain

In 1980, Canada’s Environment Minister John Roberts called acid rain “the most serious environmental threat to face the North American continent.” Acid rain is a type of precipitation with high levels of sulfuric and nitric acids, resulting from the atmospheric reactions of sulfur dioxide (SO₂) and nitrogen oxides (NOx), which are emitted by burning fossil fuels. These pollutants react with water vapor and other substances in the atmosphere to form acids. When these acidic compounds fall to the ground with rain, snow, or fog, they can damage forests, aquatic ecosystems, and erode buildings and monuments.

Sulfur dioxide (SO₂) is an air pollutant formed from the burning of fuels that contain sulfur, such as coal. SO₂ is one of the main chemicals that form acid rain. Nitrogen oxides (NOx) are gases mainly formed during the burning of fossil fuels. Exposure to NOx gases can have negative effects on respiratory health. NOx gases can also lead to the formation of ozone — another air pollutant. Source: Our World in Data

In the U.S., the significant reduction of sulfur dioxide (SO₂) pollution can be largely attributed to the Clean Air Act Amendments of 1990 (CAAA), which introduced the Acid Rain Program, a pioneering market-based cap-and-trade system targeting SO₂ emissions from electric power plants. Title IV of the CAAA set an explicit substantive goal: “reduce the adverse effects of acid deposition through reductions in annual emissions of sulfur dioxide of ten million tons from 1980 emission levels, and… of nitrogen oxides emissions of approximately two million tons from 1980 emission levels.” Under this system, power plants were allocated a certain number of emission allowances. Plants that succeeded in reducing their emissions below their allowance levels could sell their excess allowances to others. This market-based approach created a financial incentive for emission reductions.
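The economic logic of allowance trading can be shown with a toy example. The abatement costs below are invented for illustration, not drawn from the actual program, but they capture why trading achieves the same cap more cheaply than uniform mandates:

```python
# Toy example of gains from allowance trading (hypothetical costs):
# two plants must jointly cut SO2 emissions by 100 tons under a cap.

LOW_COST = 200    # $/ton abated at plant A (e.g., can switch to low-sulfur coal)
HIGH_COST = 600   # $/ton abated at plant B (e.g., must retrofit a scrubber)
REQUIRED = 100    # tons of total reduction the cap demands

# Without trading: each plant must cut its own 50 tons, regardless of cost.
cost_no_trade = 50 * LOW_COST + 50 * HIGH_COST

# With trading: plant A over-complies and sells its surplus allowances to B
# at any price between $200 and $600/ton, so all 100 tons come from A.
cost_trade = REQUIRED * LOW_COST

print(cost_no_trade, cost_trade)  # 40000 20000
```

The cap fixes the environmental outcome; trading only determines who does the abating. Any allowance price between the two plants’ marginal costs leaves both better off than the no-trade split, which is why the program hit its target at far below projected cost.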

Low-cost technological advancements played a crucial role in enabling companies to meet the SO₂ emission cap. The widespread installation of flue-gas desulfurization units (commonly known as scrubbers) in power plant smokestacks effectively removed SO₂ from emissions before their release into the atmosphere. According to the EPA, the most common type of desulfurization units are “wet scrubbers,” which remove more than 90% of SO₂. This technology, alongside a shift toward the use of low-sulfur coal, significantly contributed to the reduction in emissions. Increasing reliance on natural gas and renewable energy sources also helped reduce sulfur dioxide emissions.

The reduction of nitrogen oxide emissions was also achieved through a combination of legislative actions and technological advancements. The Clean Air Act Amendments of 1970, 1977, and 1990 set stringent national air quality standards and required significant reductions in NOx emissions from power plants, industrial facilities, and vehicles. The NOx Budget Trading Program in the eastern U.S. began in 2003 and employed a cap-and-trade system similar to the one used for sulfur dioxide. This program set a cap on total NOx emissions and allowed entities to trade emission allowances, incentivizing reductions. Other factors also drove down NOx emissions, including the introduction of catalytic converters in cars and trucks in 1975, and the transition to cleaner fuels, like unleaded gasoline and ultra-low sulfur diesel.

Technological innovations to reduce emissions from stationary sources of air pollution have been pivotal in reducing NOx emissions. The adoption of selective catalytic reduction and selective non-catalytic reduction systems in power plants and industrial settings has effectively decreased NOx emissions by chemically transforming them into nitrogen and water. Additionally, the development and use of low-NOx burners and improved combustion techniques have minimized emissions at the source.

According to a 2005 cost-benefit analysis in the Journal of Environmental Management, the U.S. Acid Rain Program’s benefits were valued at $122 billion annually, far outweighing the costs — around $3 billion annually for 2010, less than half of the initial 1990 estimates.

Annual consumption of ozone-depleting substances. Emissions of each gas are given in ODP tons. Source: Our World in Data

Conclusion: reducing PM2.5 and greenhouse gas emissions

We need a new era of environmentalism that learns from the successes and failures of the past. Environmentalists rightly tout triumphs over acid rain, ozone depletion, DDT, and lead exposure. But these wins were not the result of preparing ever longer environmental impact statements for specific projects. They were the product of putting a price on pollution, via cap and trade programs, or outright banning a pollutant when necessary.

We’re already seeing some of these lessons applied to the fight against climate change. According to the World Bank, as of 2023, 73 carbon pricing initiatives around the world covered 11.66 gigatons of CO2 equivalent, or 23% of global greenhouse gas emissions. The U.S. already has two major cap and trade programs for greenhouse gases: one in California, and the Regional Greenhouse Gas Initiative, a cooperative effort among a dozen Eastern states. And even though carbon pricing is politically unpopular at the national level, policymakers in search of revenue-raising policies may come to see it as the least bad option in an era of high inflation and rising interest rates.

Source: Environmental Protection Agency

We need to build on the clean technology investments across the Inflation Reduction Act, the Infrastructure Investment and Jobs Act, and the CHIPS and Science Act. On permitting reform, policymakers can go further than the changes in the Fiscal Responsibility Act of 2023, which mandated shot clocks and page limits on reviews. A strict time limit on judicial injunctions would give project developers certainty that the review process will eventually come to an end. Bad-faith opponents seeking to kill a project through a “delay, delay, delay” tactic would be stripped of their power.

These reductions in environmental proceduralism could be paired with increases in substantive environmental standards. In recent years, economists and public health experts have raised their estimates of the costs of local air pollution on human health and productivity. A deal that implements a price on greenhouse gases and tightens the National Ambient Air Quality Standards for fine particulate matter (PM2.5), while streamlining the permitting process to build new infrastructure, would be a worthy environmental policy regime for the challenges we face today.

There’s no better illustration of the problems of proceduralism than an attempt to build a “green” paper mill (which produced recycled paper without using chlorine, bleach, or other toxic chemicals) in New York City in the 1990s. The project was led by the NRDC, the organization formed in 1970 specifically to defend the environment by filing lawsuits, particularly against violations of NEPA. Its goal was to “show industry that sustainability was compatible with the pursuit of profit.”

But the project ultimately collapsed due in part to the very forces that NRDC had helped create. The leader of the project, Allen Hershkowitz, found himself spending nearly all his time trying to obtain permits for the project, a situation which the NRDC found ironic:

“We had worked hard for years to create strict rules governing things like clean air and clean water, and now we had to jump through all the same regulatory hoops as someone who wanted to open a hazardous waste dump,” NRDC cofounder John Adams noted in his history of the NRDC.

NRDC also found itself opposed by local clean air activists who objected to the construction of a large factory on the grounds that it would increase neighborhood pollution, a charge NRDC found “baseless” and “especially painful.”

In the end, NRDC was unable to navigate the morass of getting funding, community support, permits, and political approval. The final straw was when the NRDC was sued by the NAB Construction Company for $600 million for breach of contract. The NRDC, which had spent its entire existence filing lawsuits to prevent environmentally harmful behavior, now found itself on the other end of a lawsuit, and the project was canceled. In response to the ordeal, the NRDC reflected that “the law alone was better at stopping bad things from happening than at making good things happen.”