The Cold We Caused

by Steven Stoll

Harper's Magazine Notebook (November 2009)


It's not whether or not we're going through a global warming period. We
were. We're not now. You know God's still up there. We're now going
through a cooling spell. And the whole issue there was, Is it man-made
gases, anthropogenic gases, carbon dioxide, methane. I don't think so.
-- Senator James Inhofe (Republican, Oklahoma), October 7, 2008


Rather than continue to reject four centuries of scientific thought as an
atheist conspiracy, skeptics of climate change now concede the effect and
attribute the cause to a remote but still vaguely engaged Creator. For the
nearly incoherent senator from Oklahoma - the ranking Republican on the
Committee on Environment and Public Works - a March snowstorm in Tulsa
proves that the Almighty has recently changed plans. Others of this cohort
consult the Gospels on the future extent of flooding and call any
suggestion of a human factor in global warming "arrogance." Senator
Inhofe's humility would be philosophical if it weren't a charade meant to
protect the economic system that is his real religion. Moreover, his
tirades would be irrelevant but for the many citizens who agree with him.
According to a poll by the Pew Research Center, 38 percent of respondents
- and 49 percent of white evangelicals - believe either that climate
change is a hoax or that humans are not responsible for it.

Inhofe might do well to ponder the recent discovery by one scientist at
the University of Virginia whose findings amount to a natural experiment:
What would happen to carbon dioxide and methane if humans disappeared? The
answer is a tale not of drought but of rain, not of warming but of
cooling. Its relevant technology consisted not of internal combustion and
the steam engine but of horse collars and the moldboard plow. Excavate the
Middle Ages, and one unearths a geological event with enormous
implications for how we think about and respond to climate.

Alpine people told of glaciers crushing villages. The growing season
throughout northern Europe suddenly shortened by two months. Torrential
rains and flooding at harvest repeatedly devastated crops during the
sixteenth and seventeenth centuries. Vikings
arrived in Greenland late in the tenth century, at a time when they could
plant wheat, but around 1350 the last residents ate their dogs before
disappearing into the ice that had engulfed the southern tip of the
island. As late as 1665, Norwegian wheat fields yielded just 70 percent
of what they had produced in 1300. Cattle died on snow-covered
pastures; wine and olive-oil production shifted south. A general sense of
scarcity impelled agrarian people outward. Thousands of Europeans migrated
to North America seeking relief, only to find the same impenetrable cold.
Travelers and naturalists suspected for a century what geologists can now
measure: the Northern Hemisphere fell into a frigid rut around 1350 that
lasted until the nineteenth century.

This so-called Little Ice Age was not an ice age. The periodicity of ice
ages follows long-term cycles in Earth's position relative to the sun -
the 100,000-year eccentricity of its orbit, the 41,000-year pendulum of
its axis, and the wobble-like rotation it makes every 22,000 years. (This
last to-and-fro, called precession, contributed to the most recent
glacial maximum, known to us as the Ice Age.) On a graph these phases
look like heartbeats on an EKG, except with a scale of hundreds of
thousands of years. Glaciers heave and evaporate with pulsing regularity.
Yet whereas the planetary physics of the Big Cold is fairly clear, that
of the Little Ice Age is much less so. The latter came and went in a
geological instant, too brief to have been caused by the wobble of Earth.
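
A back-of-the-envelope comparison makes the mismatch of timescales plain.
(The 500-year duration is a rough reading of the essay's own dates,
roughly 1350 to the nineteenth century, not a figure the author states.)
Over the whole span of the Little Ice Age, even precession, the fastest
of the three cycles, advances through only a sliver of its period:

\[
\frac{500\ \text{years}}{22{,}000\ \text{years per cycle}} \approx
2.3\ \text{percent of one precessional cycle}
\]

Orbital forcing cannot switch on and off that quickly.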

Although we think of carbon dioxide and methane as the exhalations of
industrial production, air bubbles trapped in Antarctic ice say otherwise. In
fact, the increase started 8,000 years ago, with the proliferation of
agriculture. Forests burned by humans, whether for fuel or to clear for
planting, released tons of stored carbon. Three thousand years later, rice
paddies and domesticated cattle began to let off methane, which traps heat
twenty-one times better than carbon dioxide does. And yet temperatures
barely changed during these eight millennia. But around 1400, gases and
temperatures both plunged, then recovered, then plunged again after 1500.
Atmospheric carbon declined by a statistically significant ten parts per
million (ppm) in a period only slightly longer than a single human
lifetime.

For decades, climate scientists have been unable to offer an adequate
explanation for this drop. One geochemist has theorized that deep ocean
currents flipped their distribution of warm and cold water, altering the
heat exchange between atmosphere and ocean, though on closer inspection
this looked more like an effect than a cause. The 1815 eruption of Mount
Tambora, in Indonesia, injected 150 million tons of sulfuric ash into the
atmosphere, blocking sunlight and creating "the year without a summer" -
months of gloom and shadow, snow flurries across New England during June
and July. But volcanoes cool for only a year or two. Sunspots, another
suspect, follow an eleven-year cycle, and some geologists have favored
them as a catalyst for the cold. But, like volcanoes, sunspots cannot
possibly explain century after century of prolonged winter, and neither
accounts for the missing ten ppm.


William Ruddiman, the environmental scientist at Virginia, has hit upon a
solution to the riddle that is biological instead of geological. Even the
best climate models, he realized, had ignored humans. Economic activity
until the eighteenth century consisted of farming, hunting, handicrafts,
and trading. To whatever degree these practices affected the global carbon
budget, they did nothing to lower it. But Ruddiman saw that humans did
something else during these centuries. They died in tremendous numbers.

The Little Ice Age coincided with a series of astonishing pandemics. The
best documented began in October of 1347. Twelve galleys, heavy with trade
from Caffa, sailed into the port of Messina, their Genoese crews groaning
from sickness in the holds. At least one sailor came ashore. The first Sicilian
he greeted received a bacterial infection from a sneeze or a vaulting
flea. A week later the Genoese sailor was dead, along with every Sicilian
who had stood close enough to smell his breath. The afflicted buckled with
fever and hard, bleeding ulcers: the first European victims of the Black
Death.

Over the next four years, as many as fifty million people perished -
representing half the inhabitants of Italy, the Balkans, Hungary,
Lithuania, Germany, France, Spain, Ireland, Scotland, England, Norway,
Sweden, Syria, and Palestine, as well as parts of Egypt, the Ottoman
Empire, and Azerbaijan. So many Italians died so quickly that rotting
corpses lay for days and weeks until the smell was unendurable. Entire
families needed to be dragged to doorsteps so that undertakers could carry
them away. "No tears or candles or mourners" ushered off the dead, wrote
Boccaccio; "in fact, no more respect was accorded to dead people than
would nowadays be shown toward dead goats." Bodies filled hasty trenches,
"stowed tier upon tier like ships' cargo."

The social order unraveled. People became scarce. Peasants abandoned the
manorial estates, leaving them without workers to sow or harvest. Wages
increased due to the shortage of hands, and lords felt compelled to
compete for workers by promising more money and greater freedom. One
region in southern France took hundreds of years to regain the population
it had counted in 1300. Lands reclaimed by proliferating farmers in the
eleventh century - mountainsides and wetlands - lost their human presence
altogether. Ruined drainage canals on France's Mediterranean coast hummed
with malarial mosquitoes. Wild game returned to the Massif Central (the
upland plateau bounded on its east by the Rhône Valley), including bears,
boars, wolves, and partridges as thick on the ground as chickens.
The aristocracy gave French peasants unrestricted hunting privileges in
order to control the animal invasion.

Historians have toured this shattered landscape for centuries, on foot or
in low-flying aircraft, from which buried fence lines and church
foundations appear like impact craters. Yet by the time scholars
discovered burial records and other evidence of the catastrophe, the
living landscape yielded few clues. The human and animal populations had
rebounded, leaving scholars no way of knowing that by 1400, woody growth had
occupied at least 25 percent - and perhaps as much as 45 percent - of
arable Europe. Birches and hazels squatted in the ecological real estate
left vacant by human loss.

And the Black Death was not the only pandemic of the late Middle Ages. The
same bacterium had arrived in China a decade before, killing perhaps as
many as fifty million. When Hernán Cortés invaded the Valley of Mexico in
1519, his armies brought smallpox, influenza, and mumps, setting off among
never-before-exposed people a series of devastating infections that, as
the diseases moved north and south, killed between fifty and sixty million
over the following two hundred years. The destruction of life cut so
deeply into Indian societies that many never recovered their earlier
populations. These New World pandemics rolled on for centuries, causing a
decline with greater implications for the atmosphere than even the
European mortality. Globally, an estimated 125 million people died of
pandemic disease between 1200 and 1750, representing 25 percent of the
total population in 1500.

According to Ruddiman's hypothesis, the deaths of so many in such a short
time, over terrain extending from the Po Valley to the Incan Empire, left
hundreds of millions of hectares abandoned to reforestation. The
rebounding woodland devoured 13.8 billion tons of carbon, accounting for
more than half the missing ten ppm. The oceans ate the rest, probably as
part of a feedback loop set off by the die-off. (Cold water stores more
carbon dioxide than warm water, so falling temperatures would have created
an ever more efficient carbon sink, drawing down still more carbon dioxide
and pushing temperatures lower yet, before something broke the loop.)
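
The arithmetic behind "more than half" is easy to check, granting one
standard conversion the essay does not state: each part per million of
atmospheric carbon dioxide corresponds to roughly 2.13 billion tons of
carbon.

\[
10\ \text{ppm} \times 2.13\ \frac{\text{billion tons C}}{\text{ppm}}
\approx 21\ \text{billion tons C},
\qquad
\frac{13.8}{21} \approx 65\ \text{percent}
\]

On those figures the returning forests account for about two thirds of
the drawdown, leaving roughly a third for the oceans.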

How the Little Ice Age ended is perhaps even more revealing than how it
began. As population lurched toward recovery, settler cultures felt the
tension between lands and hands, sending ax-wielding farmers into the
forests of Massachusetts, the Volga River Valley, and Manchuria. Between
1700 and 1920 the world's forests lost 537 million hectares, as agrarian
societies increased their land use more than threefold. The carbon in all
of those trees - together with carbon from the soil, the greatest
reservoir on the surface of Earth - wafted up to thicken the
eight-mile-high envelope that
distinguishes this planet from Mercury. The world counted few coal-burning
factories in 1850, but their numbers followed an accelerating curve as
petroleum joined coal to provide the hydrocarbons that would generate two
more centuries of economic growth. Under the new energy regime,
atmospheric carbon levels rose by 100 ppm between 1750 and the present.
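
A crude consistency check, using the essay's own figures, the same
2.13-billion-tons-per-ppm conversion as above, and one loudly flagged
guess - that the "hundreds of millions of hectares" of medieval regrowth
amounted to about 300 million, a number the essay never specifies -
suggests how little of the rise land clearing alone can explain:

\[
537\ \text{Mha} \times \frac{13.8\ \text{billion tons C}}{300\ \text{Mha}}
\approx 25\ \text{billion tons C} \approx 12\ \text{ppm}
\]

Deforestation, on this rough accounting, supplies only about a tenth of
the 100 ppm; the balance is the signature of coal and petroleum.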


The Little Ice Age might seem to provide ideological fuel to our Inhofes,
in that it shows an earlier and opposite shift in climate - an event that,
moreover, lay beyond human control, like an earthquake or meteorite. In
fact, however, this medieval tale reveals the enormous capacity of human
beings to shape their environment, whether unwittingly or deliberately. If
our crop-planting, animal-herding, forest-and-savannah-burning ancestors
could trigger the rapid cooling of the atmosphere through their sudden
absence, then we can achieve the same effect by abandoning other
practices. The cold we caused does more damage to Inhofe's position than
any finding by the Intergovernmental Panel on Climate Change.

But the significance of far-reaching events rarely lies where we please.
Evangelizing environmentalists, much like evangelical Christians, have too
often held humanity to an impossible standard, impossible for being
imaginary. Some see Earth before agrarian humans as an Arcadia against
which we can assess the sins of industrial society. The world without us,
in this view, would return to its true temperature, its steady state.
Growing up as a species, however, means accepting that we are neither
blessed by Heaven nor shatterers of the natural order. No such order
exists: no true or natural climate, no normal rate of extinction, no ideal
ecology. The only thing normal about climate is its propensity to slam
back and forth between maxima and minima, between infernal winters and
refrigerated summers. In the oscillating dance of the glaciers, species
die. Whether they die from meteorites or from billions of human decisions
makes no difference. Either way, they leave behind abandoned niches - the
ecological spaces organisms inhabit - resulting in evolutionary cascades
of new species. Everything alive is matter in one momentary form, soon to
take some other momentary form.

The Little Ice Age negates the notion of a pristine planet, insisting
instead on a more clearly defined place for humans in the grand narrative
of Earth's history. In 2000, the chemist Paul Crutzen named our current
era the Anthropocene, a period in which humans figure on a geological
scale. But whereas many date the Anthropocene to the advent of
coal-burning machines, Ruddiman's work lengthens the period to begin when
the first stick plow cut the first furrow. It was the invention of
agriculture that marked the beginning of the Anthropocene, because that
was when terrestrial environments and atmospheric chemistry became
artifacts of human culture; when our imprint on the climate became like
fossilized footprints in volcanic ash. The Anthropocene has been an era of
extinction to rival that of the Cretaceous, and will be defined by the
present moment, when our demonstrable power to foster or erode the
diversity of life can no longer be handed over to God or Nature.

During the nineteenth century, American naturalists and politicians denied
that any animal ever had or ever could become extinct. Confronted by
evidence of the eradication of the passenger pigeon and the near
destruction of the bison, the deniers developed absurd theories to explain
away the obvious: that decades of intense commercial hunting had wiped out
or endangered countless species. Today, at stake in the denial of climate
change is the extinction or immiseration of billions of the world's poor.
One billion now live in slums - a number increasing every day and soon to
surge as rising sea levels cause others to become environmental refugees.
The impoverished cannot afford the higher cost of food when coastal land
ceases production. They will have to move when wells and rivers dry up.
Once we accept the human capacity to reconfigure the climate, the rich
nations will become directly responsible for the suffering of the poor.
The litany of rationalizations regarding the global poor - including their
own low morality, a punishing God, and the Darwinian workings of the
market - might finally buckle and break once the very poor file
class-action suits against the wealthy nations for reckless carbon output.

Taking responsibility for the consequences of agriculture, capitalism, and
industrialism is not the same thing as believing that humans control the
tilt and wobble of the globe. It requires us to wield what control we have
through public policy. The recent decision by the Environmental Protection
Agency to regulate greenhouse gases as a public-health risk under the
Clean Air Act does more than rebuke the deniers. It represents an
extraordinary shift in American perception. It heralds the end, or so we
can hope, of an approach to our atmosphere - as an infinite sink - that
has financed industrial capitalism since soot turned the birds black in
Manchester. We can only hope that the people of the most polluting nation
will finally ask some meaningful questions. What is a just climate and
what an unjust one? Which climate represents the insatiable demands of
corporate growth rather than the health and stability of everyone else? By
confirming the human role in climate change, and by declaring a warming
world injurious to the public good, the EPA has swung a club against
perhaps the grandest capitalist conceit of the twentieth century: that
society forms part of the economy, not the other way around.

Ecology becomes policy when our responsibility becomes undeniable. So
completely cultivated, walked over, and settled is our planet that it
no longer makes sense to regard any part of it as lying beyond human
influence. To paraphrase Stewart Brand, we are the climate, and we might
as well get good at it.

_____

Steven Stoll is an associate professor of history at Fordham University
and the author of The Great Delusion: A Mad Inventor, Death in the
Tropics, and The Utopian Origins of Economic Growth (2008). His last
article for Harper's Magazine, "Fear of Fallowing," appeared in the March
2008 issue.
