More than two dozen scientists have spent at least six years debating whether humanity's wear and tear on the planet qualifies as a new geological epoch that deserves its own name.
The origins of coal date back to the Carboniferous Period, 350 million years ago. Dinosaurs roamed the earth until an asteroid impact ended their Cretaceous Period 66 million years ago. Civilization grew up in the Holocene, which began only 11,700 years ago. Now, these researchers argue, human industry and population growth have created the Anthropocene, or human epoch. Essentially, over the past 75 years or so, we've installed a new operating system for our 4.5 billion-year-old planet.
That's the subject of the authors' new comprehensive assessment in the journal Science. “Human activity is leaving a pervasive and persistent signature on Earth,” the authors declare in the introduction. Human-wrought change is detectable everywhere people live, and most places we don't. What's inadvertently shocking about the paper is that it assembles in one place the effects of population growth, industrial resource use, fossil-fuel burning, and agriculture.
It's not meant to read like an indictment—it's a research article, after all—yet the evidence of global change caused by humans speaks for itself. Here are the most striking examples.
1. Filling the world with waste
Humans have invented more new kinds of minerals—“pottery, glass, bricks, and copper alloys,” to name a few—than the earth has seen since oxygen-producing bacteria evolved 2.4 billion years ago. New chemicals line our landfills, mines, rivers, and cities. Industry produces about 300 million tons of plastic every year, according to the research paper, or about the same mass as all 7.3 billion humans put together. Tiny bits of plastic, synthetic fiber, and the “virtually ubiquitous” microbeads found in cosmetics have coated the surface of the earth in organic polymers. Geology doesn’t really discriminate between polymers that come from living things and those that come from factories. Toss a soda bottle cap into mud, and it may fossilize over time the way ancient leaves or critters do.
2. A scorched earth
More than half the earth’s land surface has been transformed into settlements and cities, agricultural land, mines, waste dumps, baseball diamonds, and beyond. Mineral mining moves three times more sediment every year than all the world's rivers combined. Change “is increasingly extending into the oceans,” the authors write, both directly, as trawlers rip up sea-floor ecosystems, and indirectly, where agricultural runoff produces oxygen-poor “dead zones.”
3. The long tail of nuclear bomb tests
Thermonuclear weapon tests—hydrogen bombs—produced radioactive fallout between the early 1950s and early 1960s that could serve as a geological time marker in sediment and ice for geologists conducting fieldwork a million years hence. Bomb blasts produce new radioactive particles. The spike in carbon-14, a radioactive carbon isotope, between 1954 and 1964 might provide a suitable peg; better still might be plutonium, which can linger in sediment and ice for 100,000 years before decaying to uranium and then lead.
4. Speeding up earth's chemical conveyor belt
CO2 is famously entering the atmosphere about 100 times faster than it did when the planet emerged from the most recent ice age, about 12,000 years ago. The concentration of CO2 in the atmosphere is 35 percent higher than its highest level at any point in the past 800,000 years. Sea levels are higher than they’ve been in 115,000 years, and the rise is accelerating. A century of synthetic-fertilizer production has disrupted the earth's nitrogen cycle more dramatically than any event in 2.5 billion years.
5. Get ready for the sixth mass extinction
Life on earth has survived five major extinctions over the past half-billion years, and many smaller die-offs. “Current trends of habitat loss and overexploitation, if maintained, would push Earth into the sixth mass extinction event”—meaning more than 75 percent of species gone within several centuries—“a process that is probably already underway.”
Having sifted through the published evidence, the Anthropocene Working Group faces a no less daunting task: choosing how best to define the Human Epoch. When did it start? Which of these developments best characterize it? Several possibilities present themselves, each with pluses and minuses. To formally name a new epoch, the geology community usually requires a specific physical reference point in rock somewhere in the world, known as a “golden spike.” That’s tricky in the case of the Anthropocene, because the changes likeliest to stand the test of time aren’t rock yet—they’re in goopy sedimentary muck or ice.
A likely definition, which the group may propose to an official geology body late this summer, centers on what they call the Great Acceleration: the period after World War II when human population, technology, and resource use all took off, when nuclear-test fallout settled out, and when industrial plastics became commonplace. “Defining when to begin the Anthropocene is controversial, as a lot rides on it,” said Simon Lewis of University College London, who co-wrote a paper in Nature in March that pegs the beginning of the Anthropocene to 1610, in the wake of European colonization of the Americas. “Any definition will inform the stories that we tell about human development.”
Whether or when geologists formally embrace the Anthropocene remains to be seen. Perhaps naming the world after humans’ collective impact will lead nations to more “pervasive and persistent”—and productive—correctives to all the rough news.