Antarctic ice-core climate data from the past 800,000 years supports the idea that human activity has had a warming effect that “cancelled most or all of a natural cooling that should have occurred,” William Ruddiman of the University of Virginia said in UVA Today. He explains that during past interglacial periods, carbon dioxide and methane levels decreased, cooling the climate enough to make the next glacial period possible. But during the past 12,000 years, these gas levels have risen. Ruddiman argues that carbon dioxide levels began rising 7,000 years ago with the burning of forests to clear land for agriculture, and methane levels began rising 5,000 years ago with the proliferation of livestock farming and early rice irrigation.
“After 12 years of debate about whether the climate of the last several thousand years has been entirely natural or in considerable part the result of early agriculture, converging evidence from several scientific disciplines points to a major anthropogenic influence,” Ruddiman said.
A dozen years ago, Ruddiman hypothesized that early humans altered the climate by burning massive areas of forests to clear the way for crops and livestock grazing. The resulting carbon dioxide and methane released into the atmosphere had a warming effect that “cancelled most or all of a natural cooling that should have occurred,” he said.
That idea, which came to be known as the “early anthropogenic hypothesis,” was hotly debated for years by climate scientists, and some still consider it debatable. But in the new paper, Ruddiman and his 11 co-authors from institutions in the United States and Europe say that evidence accumulated in the past few years, particularly from ice-core records reaching back 800,000 years, shows that an expected cooling period was halted after the advent of large-scale agriculture. Otherwise, they say, the Earth would have entered the early stages of a natural ice age, or glaciation period.
In 2003, Ruddiman developed his early anthropogenic hypothesis after examining 350,000 years of climate data from ice cores and other sources. He found that during interglacial periods, carbon dioxide and methane levels decreased, cooling the climate and making way for a succeeding glacial period. Only during the Holocene did these gas levels rise instead, coinciding, he said, with the beginning of large-scale agriculture. He attributed the rise to this human activity, which began millennia before the industrial era.
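To make the comparison concrete, here is a minimal sketch in Python of the kind of trend check Ruddiman describes: fit the CO2 slope over the Holocene and over an earlier interglacial, then compare their signs. The file name, column names, and age windows are hypothetical stand-ins; real records such as the EPICA Dome C composite come in their own formats, so this illustrates the idea rather than his actual analysis.

```python
# Sketch (not Ruddiman's actual method): compare CO2 trends across two
# interglacial windows in an ice-core record. File and column names below
# ("ice_core_co2.csv", "age_yr_bp", "co2_ppm") are hypothetical.
import numpy as np
import pandas as pd

core = pd.read_csv("ice_core_co2.csv")  # age in years before present vs CO2 in ppm

def co2_slope(df, start_bp, end_bp):
    """Least-squares CO2 trend (ppm per 1,000 years) over an age window."""
    win = df[(df["age_yr_bp"] >= start_bp) & (df["age_yr_bp"] <= end_bp)]
    # Convert age to time running forward, so a positive slope means rising CO2.
    t = -win["age_yr_bp"] / 1000.0
    slope, _intercept = np.polyfit(t, win["co2_ppm"], 1)
    return slope

holocene = co2_slope(core, 0, 11_000)       # current interglacial
previous = co2_slope(core, 116_000, 127_000)  # previous interglacial (approx. window)

print(f"Holocene CO2 trend: {holocene:+.2f} ppm/kyr")
print(f"Previous interglacial trend: {previous:+.2f} ppm/kyr")
# Under the hypothesis, earlier interglacials show falling CO2 at this stage,
# while the Holocene shows a rising trend.
```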
originally posted by: butcherguy
I can believe that 'slash and burn' farming might have had a tiny influence on the CO2 levels. But wouldn't wildfires caused by lightning have had the same effect, since there were no firefighters to put the fires out? During drought periods, areas the size of Midwestern states could have burned.
originally posted by: Mianeye
a reply to: butcherguy
It isn't just a small forest we are talking about: over the past 7,000 years humans have cut or cleared 80% of the world's forests, only 20% is left today, and we are still clearing woods big time.
"Natural" forest fires only burn a small percentage of forest, even out-of-control fires.
Prehistorically, it is estimated that from 5.5 to over 19 million acres burned on the average each year, and data are presented for each vegetation group. Wildfire records show a decreasing trend in wildfire acreage until the late 1960s but an increasing trend since then, with more structures destroyed in the first half of the 1990s than in the previous seven decades. The difference in fire occurrence between prehistoric and recent times means a change in the four aspects of fire regimes -- period between fires, severity, seasonality, and dimensionality. Today there is less pyrodiversity, leading to a potential decrease in biodiversity.
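For scale on the wildfire question raised above, here is a back-of-envelope estimate of the CO2 released by burning the upper prehistoric figure of 19 million acres per year. Every input except that acreage (fuel load, combustion fraction, carbon content) is an assumed round number, so treat the output as an order-of-magnitude sketch only.

```python
# Rough estimate (assumed inputs, not measurements) of annual CO2 from
# prehistoric wildfire, using the upper acreage figure quoted above.
ACRE_TO_HA = 0.4047                 # acres to hectares
burned_acres = 19e6                 # upper prehistoric estimate cited above
biomass_t_per_ha = 100.0            # assumed average fuel load (tonnes dry biomass/ha)
combustion_fraction = 0.3           # assumed fraction of biomass actually consumed
carbon_fraction = 0.5               # typical carbon share of dry biomass
CO2_PER_C = 44.0 / 12.0             # molar mass ratio, CO2 to carbon

burned_ha = burned_acres * ACRE_TO_HA
co2_tonnes = (burned_ha * biomass_t_per_ha * combustion_fraction
              * carbon_fraction * CO2_PER_C)

print(f"~{co2_tonnes / 1e9:.2f} Gt CO2 per year")  # roughly 0.4 Gt with these inputs
```

Whatever the exact numbers, the difference the thread is circling is that a burned forest regrows and reabsorbs much of that CO2 within decades, whereas land permanently cleared for crops or grazing does not regrow, so the carbon transfer to the atmosphere is one-way.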