reply to post by Agree2Disagree
You're right, they are not certain what is causing the phenomena. But the only things they can really postulate that fit into any modern
theories would be either A) solar neutrinos or B) an unknown particle emitted from the sun.
You can feel free to speculate, but those results have yet to be replicated by another lab, so the jury is still out on whether they're really
seeing what they think they're seeing. Until then, it's just that -- speculation.
As far as the decay rates only varying by "minuscule" amounts...it doesn't matter how much they vary, only that they vary AT ALL. We
thought decay rates were constant...we've now been proven wrong.
Have you looked at the literature values for radioisotope half-lives? Hardly constant. The reported variations follow a 33-day cycle, matching the
rotation period of the solar core. What is being observed is not a net acceleration or deceleration of decay rates over long periods of time, just
day-to-day variation within that cycle.
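To make concrete what a cyclical variation like that would look like, here is a minimal sketch, assuming a sinusoidal 0.1% modulation of the decay constant (the amplitude and the P-32 half-life are my own illustrative choices, not figures from the paper):

```python
import math

# Illustrative sketch: decay with a small sinusoidal modulation of the
# decay constant over a 33-day cycle. Amplitude and isotope are assumed.
HALF_LIFE_DAYS = 14.28          # e.g. P-32, a short-lived isotope
LAMBDA = math.log(2) / HALF_LIFE_DAYS
AMPLITUDE = 0.001               # assumed 0.1% peak variation
CYCLE_DAYS = 33.0               # the reported solar-core cycle

def remaining_fraction(days, steps_per_day=24):
    """Numerically integrate dN/dt = -lambda(t) * N with a cyclic lambda."""
    n = 1.0
    dt = 1.0 / steps_per_day
    t = 0.0
    for _ in range(int(days * steps_per_day)):
        lam = LAMBDA * (1.0 + AMPLITUDE * math.sin(2 * math.pi * t / CYCLE_DAYS))
        n *= math.exp(-lam * dt)
        t += dt
    return n

# Over a whole number of cycles the modulation averages out, so the
# result matches the plain constant-rate exponential:
constant = math.exp(-LAMBDA * 66)   # two full 33-day cycles
modulated = remaining_fraction(66)
print(constant, modulated)
```

Over any whole number of cycles the modulation cancels, which is exactly why a day-to-day wobble does not translate into a net long-term drift.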
What kind of implication does that have?
The impact is nonexistent with regard to the long half-life radioisotopes used to date materials. The real impact is in medical use.
What if decay rates varied greatly in the past....What if the amount of solar neutrinos, or "particle X", bombarding the planet were
increased 100 fold....
Decay rates haven’t been observed to change significantly since we started measuring them. We can observe gamma-ray frequencies and fading
rates from multiple supernovae at distances ranging from hundreds of thousands of light-years to billions of light-years, and those frequencies are
accurately predicted by our current terrestrial decay rates. There are other methods for verifying that decay rates are stable, and the largest
deviation they've found is 0.000005% over the last two billion years.
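To put that 0.000005% bound in perspective, here is the back-of-the-envelope arithmetic (the two-billion-year age is taken from that bound and used purely for illustration):

```python
# Illustrative: how much would a 0.000005% shift in the decay constant
# move a two-billion-year radiometric age?
age_years = 2.0e9
deviation = 0.000005 / 100.0    # 0.000005% expressed as a fraction, 5e-8

# For N = N0 * exp(-lambda * t), the inferred age t = ln(N0/N) / lambda
# scales as 1/lambda, so a small fractional shift in lambda shifts the
# computed age by (approximately) the same fraction.
age_shift = age_years * deviation
print(age_shift)  # about 100 years out of two billion
```

A hundred years of uncertainty on a two-billion-year date is far below every other source of error in radiometric dating.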
No one knows for sure what exactly may have an impact on decay rates...but now that we know they are not "constant" by any sense of the
word...it opens the door for more research to be done to find out what exactly does [affect decay rates]....
I agree that it bears further investigation, but keep in mind that even under the wildest interpretation of the data that has actually been published,
significant changes to the currently published decay rates are unlikely.
I'm merely pointing out that once you think you got a handle on this Universe...it throws your *** a curveball...We don't know 1% of what
there is to know...and even what we think we know, may be wrong...
Agreed, but you seem to stop questioning at "it throws your ass a curveball". Here are the two questions you should be asking:
1. Is the variability something which we can easily model?
According to the authors of the paper that reported the findings, yes. It's ridiculously easy to model the changes, and they are cyclical.
2. What implications does this have for the applications of radioactive sources currently employed?
For the long half-life sources used to give us radiometric dating? The impact is effectively zero.
For shorter half-life sources used for medical purposes? The impact may be significant if the data in the original publication can be verified. It
may very well change how nuclear medicine is conducted.
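A rough sketch of why the short-lived sources are the sensitive case (the isotope choices and the 0.1% shift are my own assumptions for illustration, not figures from the paper):

```python
import math

# Compare how the same small shift in the decay constant propagates for
# a long-lived dating isotope versus a short-lived medical one.
shift = 0.001  # assumed illustrative 0.1% shift in lambda

def activity_error(half_life_days, elapsed_days, shift):
    """Relative error in predicted remaining activity if lambda is off by `shift`."""
    lam = math.log(2) / half_life_days
    nominal = math.exp(-lam * elapsed_days)
    actual = math.exp(-lam * (1 + shift) * elapsed_days)
    return abs(actual - nominal) / nominal

# U-238 (~4.5-billion-year half-life) over one week: essentially no error,
# because almost nothing decays over so short a span.
u238 = activity_error(4.468e9 * 365.25, 7, shift)

# Mo-99 (~66-hour half-life, the Tc-99m generator parent) over the same
# week: several half-lives elapse, so the same shift produces a much
# larger relative error in the predicted activity.
mo99 = activity_error(66 / 24, 7, shift)
print(u238, mo99)
```

The error grows with the number of elapsed half-lives, which is why dating isotopes shrug the effect off while dose calibration for short-lived medical sources could actually notice it.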