Global warming has achieved the status of a major threat. It inspires nightmares of a troubled future and propels apocalyptic dramas such as the summer 2004 movie The Day After Tomorrow. Even were the Kyoto treaty to be fully implemented, it wouldn’t make a dent in the warming trend, which seems to be inexorable. Doom is upon us.
Except that maybe it isn’t. You might not know it from ordinary media accounts, which report the judgments of alarmists as “settled science,” but there is a skeptical side to the argument. Scientists familiar with the issues involved have written critically about the theory of global warming. The puzzle is why these commentators, well-credentialed and experienced, have been swept aside to produce a false “consensus.” What is it that produces widespread agreement among both “experts” and the general public on a hypothesis which is quite likely wrong?
The consensus holds that we are experiencing unprecedented global warming and that human activity is the main culprit. The past century, we are told, has been the hottest on record, with temperatures steadily rising during the last decades. Since human population and industrial activity have risen at the same time, it stands to reason that human activity is, one way or another, the cause of this observed warming. Anything wrong with this reasoning?
Quite a lot, as it turns out. The phrase “on record” doesn’t mean very much, since most records date from the latter part of the nineteenth century. Even without accurate records, there are ways of discovering the temperatures of past centuries, and these methods do not confirm the theory of a steady rise. Reading tree rings helps (the rings are farther apart when the temperature is warmer and the trees grow faster). Core samples from drilling in ice fields can yield even older data. Some historical reconstruction can help, too—for example, we know that the Norsemen settled Greenland (and named it “green”) a millennium ago and grew crops there, in land which is today quite inhospitable to settlement, let alone to agriculture. Other evidence comes from coral growth, isotope data from sea floor sediment, and insects, all of which point to a very warm climate in medieval times. Abundant testimony tells us that the European climate then cooled dramatically from the thirteenth century until the eighteenth, when it began its slow rewarming.
In sum, what we learn from multiple sources is that the earth (and not just Europe) was warmer in the tenth century than it is now, that it cooled dramatically in the middle of our second millennium (this has been called the “little ice age”), and then began warming again. Temperatures were higher in medieval times (from about 800 to 1300) than they are now, and the twentieth century represented a recovery from the little ice age to something like normal. The false perception that the recent warming trend is out of the ordinary is heightened by its being measured from an extraordinarily cold starting point, without taking into account the earlier balmy medieval period, sometimes called the Medieval Climate Optimum. Data such as fossilized sea shells indicate that similar natural climate swings occurred in prehistoric times, well before the appearance of the human race.
Even the period for which we have records can be misread. While the average global surface temperature increased by about 0.5 degrees Celsius during the twentieth century, the major part of that warming occurred in the early part of the century, before the rapid rise in human population and before the consequent rise in emissions of polluting substances into the atmosphere. There was actually a noticeable cooling period after World War II, and this climate trend produced a rather different sort of alarmism—some predicted the return of an ice age. In 1974 the National Science Board, observing a thirty-year-long decline in world temperature, predicted the end of temperate times and the dawning of the next glacial age. Meteorologists, Newsweek reported, were “almost unanimous in the view that the trend will reduce agricultural productivity for the rest of the century.” But they were wrong, as we now know (another caution about supposedly “unanimous” scientific opinion), and after 1975 we began to experience our current warming trend. Notice that these fluctuations, over the centuries and within them, do not correlate with human numbers or activity. They are evidently caused by something else.
What, then, is the cause of the current warming trend? As everyone has heard, the emission of so-called “greenhouse gases,” mostly carbon dioxide from burning fossil fuels, is supposed to be the major culprit in global warming. This is the anthropogenic hypothesis, according to which humans have caused the trouble. But such emissions correlate with human numbers and industrial development, so they could not have been the cause of warming centuries ago, nor of the nineteenth-century rewarming trend, which began with a much smaller human population and before the industrial revolution. Nor is there a very good correlation between atmospheric carbon dioxide levels and past climate changes. Thus, to many scientists, the evidence that greenhouse gases produced by humans are causing any significant warming is sketchy.
The likeliest cause of current climate trends seems to be solar activity, perhaps in combination with galactic cosmic rays caused by supernovas, especially because there is some good observable correlation between solar magnetic activity and terrestrial climate change. But that kind of change is not predictable within any usable time frame, not yet anyway, and, of course, it is entirely beyond any human influence. The conclusion, then, is that the climate will change naturally; aside from altering obviously foolish behavior, such as releasing dangerous pollutants into our air and water, we can and should do little more than adapt to these natural changes, as all life has always done.
That is not a counsel of despair, however, for global warming is not necessarily a bad thing. Increasing warmth and higher levels of carbon dioxide help plants to grow (carbon dioxide is not a pollutant), and, indeed, mapping by satellite shows that the earth has become about six percent greener overall in the past two decades, with forests expanding into arid regions (though the effect is uneven). The Amazon rain forest was the biggest gainer, despite the much-advertised deforestation caused by human cutting along its edges. Certainly climate change does not help every region equally and will probably harm some. That has always been true. But there are careful studies that predict overall benefit to the earth with increasing warmth: fewer storms (not more), more rain, better crop yields over larger areas, longer growing seasons, milder winters, and lower heating costs in colder latitudes. The predictable change, though measurable, will not be catastrophic at all—maybe one degree Celsius during the twenty-first century. The news is certainly not all bad, and may on balance be rather good.
There is much more, in more detail, to the argument of those scientists who are skeptical about the threat of global warming. On the whole, their case is, I think, quite persuasive. The question, then, is why so few people believe it.
Part of the answer is that bad news is good news—for the news media. The media report arresting and frightening items, for that is what draws listeners, viewers, and readers. The purveyors of climate disaster theories have exploited this journalistic habit quite brilliantly, releasing steadily more frightening scenarios without much significant data to back them up. Consider the unguarded admission of Stephen Schneider of Stanford, a leading proponent of the global warming theory. In a now notorious comment, printed in Discover in 1989 and, surely to his discomfort, often cited by his opponents, Schneider admitted:
To capture the public imagination, we have to offer up scary scenarios, make simplified dramatic statements, and make little mention of any doubts we may have. Each of us has to decide what the right balance is between being effective and being honest.
This sort of willingness to place the cause above the truth has exasperated Richard Lindzen, Sloan Professor of Meteorology at MIT, who is one of the authors of the science sections of the report of the Intergovernmental Panel on Climate Change (IPCC), the body responsible for a crescendo of dire warnings. In testimony before the U.S. Senate’s Environment and Public Works Committee, he called the IPCC’s Summary for Policymakers, which loudly sounds the warming alarm, “very much a child’s exercise of what might possibly happen . . . [which] conjures up some scary scenarios for which there is no evidence.”
This brings us to the second part of the answer, which concerns the political and economic consequences of the policy argument. The IPCC is a UN body and reflects UN politics, which are consistently favorable to developing countries, the majority of its members. Those politics are very supportive of the Kyoto treaty, which not only exempts the developing countries from emissions standards but also requires compensatory treatment from the wealthier nations for any economic restraints that new climate management policies may impose on these developing countries. Were Kyoto to be implemented as written, the developing countries would gain lots of money and free technology. One need not be a cynic to grasp that a UN body will do obeisance to these political realities wherever possible.
The Kyoto treaty would not make a measurable difference in the climate—by 2050, a temperature reduction of maybe two-hundredths of a degree Celsius, or at most six-hundredths of a degree—but the sacrifices it would impose on the United States would be quite large. It would require us to reduce our projected 2012 energy use by 25 percent, a catastrophic economic hit. Small wonder that the Senate in 1997 passed the bipartisan Byrd-Hagel anti-Kyoto resolution by a vote of 95-0 (a fact rarely recalled by those who claim that America’s refusal to sign on to the treaty was the result of the Bush administration’s thralldom to corporate interests).
Most of the European countries that have ratified Kyoto are already falling behind on their targets, despite having stagnant economies and falling populations. It is highly unlikely they will meet the goals they have signed on for, and they know it. Neither will Japan, for that matter. The European Union has committed itself to an eight percent reduction in greenhouse gas emissions (from 1990 levels) by 2012, but the European Environment Agency admits that current trends project only a 4.7 percent reduction. When Kyoto signers lecture non-signers for not doing enough for the environment, they invite the charge of hypocrisy. There is also the obvious fact that adherence to the treaty will hurt the U.S. economy much more than the European, which suggests that old-fashioned economic competitiveness is in the mix of motives at play here. The absurdity of the treaty becomes obvious when we recognize that it does not impose emissions requirements on developing countries, including economic giants such as China, India, and Brazil. (China will become the world’s biggest source of carbon dioxide emissions in just a few years.)
A third reason why global warming fears seem to be carrying the day goes beyond these political interests; it involves intellectual pride. Academics are a touchy tribe (I’m one of them); they do not take kindly to having their theories, often the result of hard work, contradicted. And sure enough, the struggle for the truth in this matter is anything but polite. It is intellectual warfare, entangled with politics, reputations, and ideology; and most of the anger comes from the side of the alarmists. People lose their tempers and hurl insults—“junk science,” “willful ignorance,” “diatribe,” “arrogant,” “stupid,” “incompetent,” “bias,” “bad faith,” “deplorable misinformation,” and more. Consider the fiercely hateful reaction to Bjorn Lomborg’s 2001 book, The Skeptical Environmentalist. He challenged the entrenched and politically powerful orthodoxy and did so with maddeningly thorough data. His critics, unable to refute his statistics, seem to have been enraged by their own weakness—a familiar phenomenon, after all. Or perhaps, with their reputations and their fund-raising ability tied to the disaster scenarios, they felt their livelihoods threatened. In any case, the shrillness of their voices has helped to drown out the skeptics.
Finally, there is a fourth cause: a somewhat murky antipathy to modern technological civilization as the destroyer of a purer, cleaner, more “natural” life, a life where virtue dwelt before the great degeneration set in. The global warming campaign is the leading edge of an environmentalism which goes far beyond mere pollution control and indicts the global economy for its machines, its agribusiness, its massive movements of goods, and above all its growing population. Picking apart this argument to show the weakness of its pieces does not go to the heart of the fear and loathing that motivate it. The revulsion shows in the prescriptions advanced by the global warming alarmists: roll back emissions to earlier levels; reduce production and consumption of goods; lower birth rates. Our material ease and the freedoms it has spawned are dangerous illusions, bargains with the devil, and now comes the reckoning. A major apocalypse looms, either to destroy or, paradoxically, to save us—if we come to our senses in the nick of time.
It is clear, then, given the deep roots of the scare, that it is likely to be pretty durable. It has the added advantage of not being readily falsifiable in our lifetimes; only future humans, who will have the perspective of centuries, will know for certain whether the current warming trend is abnormal. In the meantime, the sanest course for us would be to gain what limited perspective we can (remembering the global cooling alarm of a generation ago) and to proceed cautiously. We are going through a scare with many causes, and we need to step back from it, take a long second look at the scientific evidence, and not do anything rash. Though the alarmists claim otherwise, the science concerning global warming is certainly not settled. It is probable that the case for anthropogenic warming will not hold up, and that the earth is behaving as it has for millennia, with natural climate swings that have little to do with human activity.
Thomas Sieger Derr has been writing on environmental ethics for many years. He is Professor of Religion and Ethics at Smith College and the author of Environmental Ethics and Christian Humanism.