Regenerative Agriculture: Good for Soil Health, but Limited Potential to Mitigate Climate Change
EDITOR'S NOTE: Some people also include under “regenerative agriculture” practices primarily aimed at boosting yields (e.g., agroforestry, which integrates trees and shrubs on farmland, and can sequester carbon in soils and vegetation as a co-benefit) and/or practices aimed at regenerating lands that no longer produce food (e.g., reforestation, peatland restoration, riparian buffer zones). For more on the many benefits of agroforestry, see Chapter 13 in Creating a Sustainable Food Future, and for more on natural ecosystem restoration, see Chapters 20-21. This blog post focuses on practices aimed primarily at boosting soil carbon on working agricultural lands. An additional explainer blog is available on the Potential Contribution of Soil Carbon Sequestration on Working Agricultural Lands to Climate Change Mitigation as well as a blog highlighting 6 Ways the US Can Curb Climate Change and Grow More Food.
Agriculture needs to close an 11-gigaton greenhouse gas (GHG) gap between expected emissions in 2050 and those needed to hold global warming below 2°C. Several noteworthy reports have proposed a range of mitigation options. Our World Resources Report: Creating a Sustainable Food Future, issued jointly with the World Bank and the UN, laid out 22 solutions to cut emissions by two-thirds, while still feeding a likely population of 10 billion in 2050. Yet much of the recent limelight for agricultural emissions reductions shines on one option that our report found had limited potential: increasing carbon sequestration in soils through practices broadly referred to as “regenerative agriculture.”
Regenerative agriculture has become the darling of many policymakers, food companies and farmers. Advocates claim a triple win: climate change mitigation, increased profit for farmers and greater resilience to a changing climate. Our view is that the practices grouped as regenerative agriculture can improve soil health and yield some valuable environmental benefits, but are unlikely to achieve large-scale emissions reductions.
Here, we explain the practices people are calling regenerative agriculture, examine their climate change mitigation potential, and evaluate their place among other agriculture mitigation options.
1. What is regenerative agriculture?
Although regenerative agriculture has no universal definition, the term is often used to describe practices aimed at promoting soil health by restoring soil’s organic carbon. The world’s soils store several times as much carbon as the atmosphere, acting as a natural “carbon sink.” But globally, soil carbon stocks have been declining as a result of factors such as the conversion of native landscapes to croplands and overgrazing. One goal of regenerative practices is to use some of the carbon that plants have absorbed from the atmosphere to help restore soil carbon.
Practices grouped under regenerative agriculture include no-till agriculture — where farmers avoid plowing soils and instead drill seeds into the soil — and use of cover crops, which are plants grown to cover the soil after farmers harvest the main crop. Other practices include diverse crop rotations, such as planting three or more crops in rotation over several years, and rotating crops with livestock grazing. Sometimes any practice that involves reduced fertilizer or pesticide use is considered regenerative agriculture.
2. Do regenerative agriculture practices generate environmental benefits?
There is broad agreement that most regenerative agriculture practices are good for soil health and have other environmental benefits. No-till reduces soil erosion and encourages water to infiltrate soils (although it can require greater use of herbicides). Cover crops do the same, and can also reduce water pollution. Diverse crop rotations can lower pesticide use. And good grazing practices — such as moving cattle around frequently, adding legumes or fertilizers, and avoiding overgrazing — can increase vegetation and protect water sources.
3. What about the potential of regenerative agriculture practices to mitigate climate change?
The thinking behind regenerative practices as a climate mitigation strategy is to remove carbon dioxide from the air and store it as organic carbon in soils. While practices like adding manure can increase soil carbon, the feasibility of scaling such practices over large areas to substantially increase soil carbon and mitigate climate change is much less clear. Our own report analyzing mitigation options in the food and land sector concluded that the practical potential was at best modest due to several challenges, including:
- Uncertain benefits: There’s limited scientific understanding of what keeps soil carbon sequestered, and, as a result, uncertainty about whether regenerative practices actually sequester additional carbon. For example, there is an active scientific debate about whether no-till, the primary practice relied upon by proponents of regenerative agriculture to generate climate benefits, actually increases soil carbon when properly measured. Studies on grazing land found that the effects of grazing on soil carbon are complex, site-specific and hard to predict, although grazing practices that increase the amount of grass growing generally sequester some carbon. Even putting aside these uncertainties, maintaining enhanced soil carbon levels is practically challenging. For example, in the United States, the vast majority of farmers who practice no-till also plow up their soils at least every few years, reversing most, if not all, of any short-term carbon storage benefit.
- Faulty carbon accounting: Carbon must be added to soils to increase soil carbon, and this carbon must ultimately come from plants that absorb carbon from the air. But if the direct sources of carbon would have otherwise been stored or used elsewhere, it simply moves carbon from one place to another, achieving no additional reduction in emissions. Calculations of carbon benefits from soil carbon sequestration on a specific farm often omit off-farm effects that produce emissions elsewhere, as illustrated in the graphic. For example, manure is filled with the carbon and nutrients absorbed originally by plants and eaten by animals. For that reason, adding manure to a field increases soil carbon where it is applied. But because there is a limited supply of manure in the world, using it in one place almost always means taking it from elsewhere, so no additional carbon is added to the world’s soils overall. The global supply of crop residues is also limited. If residues used as animal feed (which is common in Africa) are used to increase soil carbon on a farm, farmers may need to expand cropland into forests or grasslands to replace the animal feed, releasing carbon stored in these natural ecosystems’ soils and plants. Converting cropland to grazing can build soil carbon, and might be advisable where cropping is marginal. But if the crops replaced by grazing are ultimately grown elsewhere by cutting down forests or grasslands, it can result in a net increase in greenhouse gas emissions. This same need to replace food elsewhere exists if regenerative practices reduce the amount of livestock or crops produced on a given land area (and studies of many practices so far have shown mixed yield effects). The failure to count these off-farm effects especially matters if soil carbon benefits are claimed as carbon offsets.
- The need for large quantities of nitrogen: Another limitation on storing soil carbon is the need for nitrogen, which usually comes in the form of fertilizer. For carbon to remain in soils for more than a short time, scientists generally agree that it must be converted into microbial organic matter. This requires around one ton of nitrogen for every 12 tons of carbon sequestered (in addition to the nitrogen taken up and removed by the crop itself). Applying more nitrogen to agricultural lands to increase soil carbon would be problematic, whether added through fertilizer or nitrogen-fixing legumes. Only some of the added nitrogen would likely be captured and turned into soil carbon; much would escape into waterways, where it would fuel algal growth and water pollution. Some would be converted by soils into nitrous oxide, a powerful greenhouse gas. It’s true that in many parts of the world, farmers already apply more nitrogen than the crop actually uses, but they do so to compensate for the fact that some of the applied nitrogen escapes into the air and water. To use more of this nitrogen to build soil carbon, farmers must find ways to prevent that nitrogen from escaping. Planting cover crops is one way, since their roots capture nitrogen that would otherwise leach out, creating some potential to build stable soil carbon. Yet overall, the need for nitrogen poses a major but often overlooked limitation to soil carbon gains.
- Scaling across millions of acres: According to a recent study, the use of cover crops across 85% of annually planted U.S. cropland could sequester around 100 million tons of carbon dioxide per year. Such an unprecedented achievement would offset about 18% of U.S. agricultural production emissions and 1.5% of total U.S. emissions. However, while the use of cover crops has been expanding in the United States, they still occupy less than 4% of U.S. cropland and face barriers to wider adoption, such as costs and limited time to establish them before winter begins. Cover crops should be actively promoted given their potential to improve soil health, reduce nitrogen pollution and create climate benefits, but their realistic potential for soil carbon gains is uncertain at this time.
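A quick back-of-the-envelope check shows how the cover-crop percentages above fit together. This is a rough sketch, not figures from the cited study: the denominators (roughly 560 million tons CO2e for U.S. agricultural production emissions and 6.6 billion tons CO2e for total U.S. emissions) are approximate assumptions chosen to be consistent with the percentages quoted.

```python
# Back-of-envelope check of the cover-crop sequestration shares cited above.
# Denominators are approximate assumptions, not figures from the study.
US_AG_EMISSIONS_MT = 560.0       # Mt CO2e/yr, U.S. agricultural production (assumed)
US_TOTAL_EMISSIONS_MT = 6600.0   # Mt CO2e/yr, total U.S. GHG emissions (assumed)

cover_crop_sequestration_mt = 100.0  # Mt CO2/yr, the study's estimate at 85% adoption

share_of_ag = cover_crop_sequestration_mt / US_AG_EMISSIONS_MT
share_of_total = cover_crop_sequestration_mt / US_TOTAL_EMISSIONS_MT

print(f"~{share_of_ag:.0%} of U.S. agricultural production emissions")  # ~18%
print(f"~{share_of_total:.1%} of total U.S. emissions")                 # ~1.5%
```

Even under this best case (cover crops on 85% of annually planted cropland, versus under 4% today), the offset is a small slice of total U.S. emissions.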
Given these challenges (described in more detail in our report), we consider large estimates of climate change mitigation potential for agricultural soil carbon to be unrealistic. Many published claims of massive potential do not address these scientific and practical challenges. One recent paper, for example, estimated that if all the world’s agricultural land sequestered 0.5 tons of carbon per hectare per year, it would achieve about 2.5 gigatons of carbon storage per year, offsetting 20% of annual global greenhouse gas emissions. Another claims it is possible to draw down a trillion tons of carbon dioxide into agricultural soils – an amount that exceeds the entire amount of soil carbon lost since the dawn of agriculture. Based on current evidence, these levels of emissions-reduction potential aren’t plausible.
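The arithmetic behind such global claims, and the nitrogen constraint described earlier, can be sketched in a few lines. This is an illustration only: the global agricultural land area (~4.9 billion hectares), annual global emissions (~50 Gt CO2e) and current fertilizer nitrogen use (~110 Mt N/yr) are approximate assumptions, not figures taken from the papers cited.

```python
# Sketch of the global soil-sequestration arithmetic discussed above.
# All inputs marked "assumed" are approximate, for illustration only.
AG_LAND_HA = 4.9e9               # global agricultural land, hectares (assumed)
SEQ_RATE_TC_PER_HA = 0.5         # t C/ha/yr, the rate assumed in the cited paper
C_TO_CO2 = 44.0 / 12.0           # molecular-weight conversion, C -> CO2
GLOBAL_EMISSIONS_GT_CO2E = 50.0  # Gt CO2e/yr, approximate (assumed)
C_TO_N_RATIO = 12.0              # ~12 t C stabilized per 1 t N, per the report
GLOBAL_FERT_N_MT = 110.0         # Mt N/yr of fertilizer nitrogen use (assumed)

carbon_gt = AG_LAND_HA * SEQ_RATE_TC_PER_HA / 1e9  # Gt C/yr, ~2.45
co2_gt = carbon_gt * C_TO_CO2                      # Gt CO2/yr, ~9.0
offset_share = co2_gt / GLOBAL_EMISSIONS_GT_CO2E   # ~18%

# Nitrogen needed to stabilize that much carbon as soil organic matter:
nitrogen_mt = carbon_gt * 1000.0 / C_TO_N_RATIO    # Mt N/yr, ~204
multiple_of_fertilizer = nitrogen_mt / GLOBAL_FERT_N_MT

print(f"{carbon_gt:.2f} Gt C/yr = {co2_gt:.1f} Gt CO2/yr "
      f"(~{offset_share:.0%} of global emissions)")
print(f"Requires ~{nitrogen_mt:.0f} Mt N/yr "
      f"(~{multiple_of_fertilizer:.1f}x current fertilizer nitrogen use)")
```

Under these assumptions the sketch reproduces the roughly 2.5 Gt C per year and ~20% offset claimed, while also making the nitrogen constraint concrete: stabilizing that much carbon would demand on the order of twice today’s global fertilizer nitrogen use.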
However, soils naturally store massive amounts of carbon, and the scientific understanding behind this process is still emerging. If future research finds new ways to sequester carbon or dramatically changes our understanding of existing approaches, our conclusions would change. A WRI research paper details some actions policymakers can take now to accelerate such research.
4. What can we do now to mitigate climate change in the food and agriculture sector?
Fortunately, there are many other ways to rein in agricultural greenhouse gas emissions. WRI identified 22 solutions organized into a five-course menu: (1) reduce growth in demand for food and other agricultural products; (2) increase food production without expanding agricultural land; (3) protect and restore natural ecosystems; (4) sustainably increase fish supply; and (5) reduce greenhouse gas emissions from agricultural production (with a limited role for soil carbon sequestration and a much larger role for reducing emissions from cattle, manure, fertilizers, rice cultivation and energy use).
Many of these solutions are ready for scaling and come with co-benefits for farmers, consumers, food security and the environment. As governments seek to build back economies and food companies chart ambitious climate strategies, we recommend decision-makers select from this broader menu to close the agricultural emissions gap and contribute to a sustainable food future.