Many scientists rely on the assumption that radioactive elements decay at constant, undisturbed rates and can therefore serve as reliable clocks for measuring the ages of rocks and artifacts. Most estimates of the age of the earth are founded on this assumption. However, recent observations indicate that some nuclear decay rates fluctuate with solar activity.
In 2009, New Scientist summarized a mysterious and inadvertent discovery: Brookhaven National Laboratory physicist David Alburger found that the nuclear decay rate of silicon-32 changed with the seasons.1
In a separate but similar instance, Stanford University reported that Purdue physicist Ephraim Fischbach, while analyzing data from both Brookhaven and the Federal Physical and Technical Institute in Germany, accidentally found that nuclear decay rates sped up during the winter.2
The conclusion was that something from the sun must be affecting the decay rates, and researchers suspect that solar neutrinos may be the cause.
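To make the claimed effect concrete, here is a minimal sketch of what a seasonally modulated decay rate would look like numerically. It replaces the usual constant decay rate with one that varies sinusoidally over the year; the half-life is silicon-32's commonly cited ~157 years, but the 0.1% modulation amplitude and annual sine form are illustrative assumptions, not measured values from the studies above.

```python
import math

def decayed_fraction(t_days, half_life_days, amplitude=0.0, period_days=365.25):
    """Fraction of a sample remaining after t_days.

    The decay 'constant' is modeled as
        lam(t) = lam0 * (1 + amplitude * sin(2*pi*t/period)),
    and the surviving fraction is exp(-integral of lam(t) dt from 0 to t).
    With amplitude=0 this reduces to ordinary exponential decay.
    """
    lam0 = math.log(2) / half_life_days  # nominal decay constant
    phase = 2 * math.pi * t_days / period_days
    # Closed-form integral of the modulated rate over [0, t_days]
    integral = lam0 * (
        t_days + amplitude * period_days / (2 * math.pi) * (1 - math.cos(phase))
    )
    return math.exp(-integral)

# Silicon-32: half-life ~157 years (~57,000 days). The 0.001 amplitude
# (0.1% seasonal swing) is a made-up illustrative figure.
HALF_LIFE = 57_000.0
half_year = 365.25 / 2

constant = decayed_fraction(half_year, HALF_LIFE)
seasonal = decayed_fraction(half_year, HALF_LIFE, amplitude=0.001)
print(f"constant rate: {constant:.8f} remaining after half a year")
print(f"seasonal rate: {seasonal:.8f} remaining after half a year")
```

After exactly one full period the sine averages out and both models agree, which is why detecting such an effect requires comparing measurements taken at different points within the year, as the Brookhaven and German datasets did.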
The Sun Alters Radioactive Decay Rates