posted on Oct, 14 2004 @ 03:53 PM
No, I was not referring to dating rock; those are two different subjects.
Read this:
This note touches on a few questions about radiocarbon dating, especially regarding its reliability, and how we can know whether, and by how much, the
dating may vary. Knowing where C14 comes from is a good start.
C-14 is produced in the atmosphere primarily by thermal neutron interaction with ordinary nitrogen-14 [specifically: N14 + n -> C14 + p]. The rate of
*natural* C-14 production can be increased either by an increase in cosmic radiation (leading to increased neutron production), such as happens with a
solar flare and the cycle from sunspot minimum to sunspot maximum; or by an increase in atmospheric nitrogen.
Increased cosmic radiation should result in increased amounts of *other* cosmogenic radionuclides such as beryllium-10 (half-life about 1.4 million years) and
chlorine-36 (half-life about 301,000 years). The impact of increasing atmospheric nitrogen I leave to the reader's imagination.
"Based on track etch studies of meteorites, the cosmic ray fluence rate has remained more or less constant for at least 2000 years. Studies based on
terrestrial cosmic ray induced and meteoritic radionuclides suggest that the fluence rate had not changed by more than a factor of 2 over the past
10^9 years (UNSCEAR 1977). Maximum levels occurred 700,000 years ago as a result of magnetic field reversals, but only represented a 10 percent
increase in fluence rate."
I referred to *natural* C14 production above. "Unnatural" C-14 levels rose dramatically in the decades after Hiroshima, thanks to fallout from weapons testing;
the excess peaked in 1965 at about 70% above the natural level, and the ratio has been falling off ever since.
The burning of fossil fuels has been *reducing* the C14/C12 ratio for as long as we've been burning them in quantity: it releases "old" carbon into the
atmosphere as essentially pure C12, the C14 having long since decayed away.
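To put a number on how "dead" fossil carbon is, here is a quick sketch (mine, not Kathren's) of the decay arithmetic, using the modern half-life of 5,730 years; this dilution of the atmospheric ratio is what's known as the Suess effect:

```python
import math

HALF_LIFE = 5730.0  # C-14 half-life in years (modern value)

def c14_fraction_remaining(age_years: float) -> float:
    """Fraction of the original C-14 left after age_years of decay."""
    return math.exp(-math.log(2) * age_years / HALF_LIFE)

# Fuel buried for even 100,000 years (~17.5 half-lives) retains a
# vanishingly small fraction of its C-14:
print(c14_fraction_remaining(100_000))  # ~5.6e-6 of the original
# After millions of years underground the fraction is effectively zero,
# so burning the fuel adds C12 to the air with essentially no C14.
```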
Incidentally, "[E]quilibrium is reached between [human] tissue and atmospheric CO2 after about 1.4 year," so if Eldridge's curious source is
right and C14 is increasing "28-37% over its decay rate", we should be seeing that same increase in human tissue -- with its attendant dose
concerns. Ordinarily you receive about one mrad per year from cosmogenic (non-fallout) C14.
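If that 1.4-year figure is read as an exponential time constant (my assumption; the quoted source doesn't spell out the model), tissue would track a step change in atmospheric C14 roughly like this:

```python
import math

TIME_CONSTANT = 1.4  # years; tissue/atmosphere equilibration (assumed exponential)

def tissue_equilibration(t_years: float) -> float:
    """Fraction of a step change in atmospheric C-14 reflected in tissue
    after t_years, under a simple first-order exchange model."""
    return 1.0 - math.exp(-t_years / TIME_CONSTANT)

print(round(tissue_equilibration(1.4), 3))  # 0.632 -- 63% after one time constant
print(round(tissue_equilibration(5.0), 3))  # 0.972 -- essentially equilibrated
```

So any sustained rise in atmospheric C14 would show up in tissue within a few years, along with the dose it implies.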
"In addition, comparison of radiocarbon dates with tree ring values has shown some fluctuation in the C14 concentration in the atmosphere between
1400 and 1700 BC. Comparison of radiocarbon determined ages with ages of archeological materials accurately established by other methods have revealed
that for the period from 100 BC to 1400, radiocarbon dating gives values that are too large; prior to 100 BC the radiocarbon values are too small.
At about 1600 BC, the radiocarbon values are about 175 years (5%) too small, increasing to about 300 years (6%) at 3000 BC. The discrepancy appears to
be a result of slight variations in the earth's magnetic field over the years, which would alter the cosmic ray intensities and hence C14 production
near the earth. Suitable corrections are available, however, and the useful range of radiocarbon dating is at least 1000 to 100,000 years; in the
range of 1000 to 50,000 years, the time frame of great archeological significance, the uncertainty in the method is less than 5 per cent (Aitken 1974,
Baxter and Walton 1971)." 
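The percentage offsets quoted above are easier to follow with the basic age equation in hand. A conventional radiocarbon age comes straight from the measured C14 activity relative to the modern standard (by convention using the Libby half-life of 5,568 years); the corrections Kathren mentions are then applied to that raw age via tree-ring calibration. A minimal sketch (the function names are mine):

```python
import math

LIBBY_HALF_LIFE = 5568.0  # conventional half-life used for radiocarbon ages

def radiocarbon_age(ratio_to_modern: float) -> float:
    """Conventional radiocarbon age (years BP) from the measured
    C14 activity expressed as a fraction of the modern standard."""
    return -LIBBY_HALF_LIFE / math.log(2) * math.log(ratio_to_modern)

# A sample with half its modern C-14 dates to one Libby half-life:
print(round(radiocarbon_age(0.5)))  # 5568
# The ~5% discrepancies in the quote are calibration corrections applied
# to ages like this one; they don't change the decay law itself.
```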
 Kathren, Ronald L., _Radioactivity in the Environment: Sources, Distribution, and Surveillance_, 1984, p. 23.
 Ibid., pp. 111-112.
 Ibid., p. 112.
 Ibid., p. 36.
 Ibid., pp. 366-367.