A great deal of recent public and media attention has been focused not on gas itself, but on the mechanism increasingly used to extract it. Hydraulic fracturing — better known as fracking — is a technique that uses high-pressure fluids to “fracture” and extract gas from low permeability rocks where it would otherwise be trapped. The technique itself has been around for a long time, but in the last decade, combined with innovations in drilling technology and the high cost of petroleum, it has become a profitable way to produce energy.
The somewhat surprising result of several recent studies (including one by an expert panel from the Council of Canadian Academies on which I served) is that, from a climate-change perspective, fracking probably isn’t much worse than conventional gas extraction. Life-cycle analyses of GHG emissions from the Marcellus and Bakken shales, for example, suggest that emissions are probably slightly but not significantly higher than from conventional gas drilling. A good proportion of these emissions come from well leakage.
It turns out to be surprisingly hard to seal a well tightly. This is widely acknowledged even by industry representatives and shale gas advocates, who call it the problem of “well integrity.” Wells may leak while they are being drilled, during production, and even after they have been abandoned. This is primarily because the cement used to seal the well may shrink, crack, or simply fail to fill in all the gaps.
Interestingly, there’s little evidence that fracked wells leak more than conventional wells. From a greenhouse gas perspective, the problem with fracking lies in the huge number of wells being drilled. According to the U.S. Energy Information Administration, there were 342,000 gas wells in the United States in 2000; by 2010, there were over 510,000, and nearly all of this increase was driven by shale-gas development — that is, by fracking. This represents a huge increase in the potential pathways for methane leakage directly into the atmosphere. (It also represents a huge increase in potential sources of groundwater contamination, but that’s a subject for another post.)
There have been enormous disagreements among scientists and industry representatives over methane leakage rates, but experts calculate that leakage must be kept below 3% for gas to represent an improvement over coal in electricity generation, and below 1% for gas to improve over diesel and gasoline in transportation. The Environmental Protection Agency (EPA) currently estimates average leakage rates at 1.4%, but quite a few experts dispute that figure. One study published in 2013, based on atmospheric measurements over gas fields in Utah, found leakage rates as high as 6-11%. The Environmental Defense Fund is currently sponsoring a large collaborative project involving scientists from industry, government, and academia. One part of the study, measuring emissions over Colorado’s most active oil and gas drilling region, found methane emissions almost three times higher than the EPA’s 2012 numbers, corresponding to a well-leakage rate of 2.6-5.6%.
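Why leakage rates matter so much can be sketched with a back-of-envelope calculation. The figures below are illustrative round numbers I am assuming for this sketch (methane's warming potential, fuel emission factors, and plant efficiencies), not values from the article or the EPA; the actual breakeven point depends heavily on the time horizon and plant assumptions used, which is part of why experts disagree.

```python
# Back-of-envelope comparison: CO2-equivalent emissions of gas-fired vs
# coal-fired electricity as methane leakage rises. All constants are
# assumed illustrative values, not figures from the article.

GWP_CH4 = 34            # assumed 100-year warming potential of methane
CO2_PER_KG_CH4 = 44/16  # kg CO2 per kg CH4 burned (combustion stoichiometry)
MJ_PER_KG_CH4 = 50.0    # approximate lower heating value of methane
CO2_COAL_PER_MJ = 0.095 # assumed kg CO2 per MJ of coal heat
EFF_GAS, EFF_COAL = 0.50, 0.35  # assumed power-plant efficiencies

def gas_co2e_per_mj_elec(leak_fraction):
    """kg CO2-equivalent per MJ of gas-fired electricity, given the
    fraction of produced methane that escapes unburned."""
    burned = CO2_PER_KG_CH4 / MJ_PER_KG_CH4                        # CO2 from combustion
    leaked = (leak_fraction / (1 - leak_fraction)) / MJ_PER_KG_CH4 # kg CH4 leaked per MJ burned
    return (burned + leaked * GWP_CH4) / EFF_GAS

coal = CO2_COAL_PER_MJ / EFF_COAL
for leak in (0.0, 0.014, 0.03, 0.06):
    gas = gas_co2e_per_mj_elec(leak)
    print(f"leakage {leak:5.1%}: gas {gas:.3f} vs coal {coal:.3f} kg CO2e per MJ elec")
```

Even under these rough assumptions, the pattern is clear: gas's advantage over coal shrinks steadily as leakage climbs, and shrinks much faster if a shorter time horizon (where methane's warming potential is far higher) is used instead.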
Some of the differences in leakage estimates reflect differing measurement techniques, some may involve measurement error, and some probably reflect real differences in gas fields and industrial practices. But the range of estimates indicates that the scientific jury is still out. If, in the end, leakage rates prove to be higher than the EPA currently calculates, the promised benefits of gas begin to vaporize. If leakage in storage and distribution is higher than currently estimated — as one ongoing study by my own colleagues at Harvard suggests — then the alleged benefits may evaporate entirely.
And we’re not done yet. There’s one more important pathway to consider when it comes to the release of greenhouse gases into the atmosphere: flaring. In this practice, gas is burned off at the wellhead, sending carbon dioxide into the atmosphere. It’s most commonly done in oil fields. There, natural gas is not a desirable product but a hazardous byproduct that companies flare to avoid gas explosions. (If you fly over the Persian Gulf at night and notice numerous points of light below, those are wellhead fires).
In our report for the Council of Canadian Academies, our panel relied on industry data that suggested flaring rates in gas fields were extremely low, typically less than 2% and “in all probability” less than 0.1%. This would make sense if gas producers were efficient, since they want to sell gas, not flare it. But recently the Wall Street Journal reported that state officials in North Dakota would be pressing for new regulations because flaring rates there are running around 30%. In the month of April alone, $50 million worth of natural gas was burned off, completely wasted. The article was discussing shale oil wells, not shale gas ones, but it suggests that, when it comes to controlling flaring, the store is not being adequately minded. (At present, there are no federal regulations at all on flaring.) As long as gas is cheap, the economic incentives to avoid waste are obviously insufficient.
Can’t the leakage and flaring problems be fixed? Our panel spent considerable time discussing this question. Industry representatives said, “Trust us, we’ve been drilling wells for 100 years.” But some of us wondered, “If they haven’t solved this problem in 100 years, why would they suddenly solve it now?”
— excerpted from an article by Naomi Oreskes writing for TomDispatch.com.
Naomi Oreskes is professor of the history of science and affiliated professor of earth and planetary sciences at Harvard University.