Does The Sun Change The Rate of Radioactive Decay?

Physicists are discovering that what was once thought impossible (altering the rate of radioactive decay) may in fact be possible…

In one experiment, a team at Purdue University detected correlations between fluctuations in nuclear decay rates and Earth-Sun distance. The difference was not slight: they recorded a clear decrease in the decay rate of manganese-54 (54Mn) during a solar flare in December 2006.
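
To make the idea of such a correlation concrete, here is a minimal, purely illustrative Python sketch (not the Purdue team's actual analysis): it builds a 1/r² neutrino-flux proxy from the approximate Earth-Sun distance over a year, pairs it with a synthetic decay-rate series carrying a hypothetical 0.1% annual modulation, and computes their correlation. Only the orbital eccentricity is a real figure here; everything else is made up for illustration.

```python
import numpy as np

days = np.arange(365)
ecc = 0.0167  # Earth's orbital eccentricity (real value)
# Approximate Earth-Sun distance in AU; perihelion falls near January 3.
r = 1.0 - ecc * np.cos(2 * np.pi * (days - 3) / 365.25)
flux_proxy = 1.0 / r**2  # solar neutrino flux scales as 1/r^2

rng = np.random.default_rng(0)
# Hypothetical decay-rate series: a 0.1% annual modulation plus counting noise.
rate = 1.0 + 0.001 * (flux_proxy - flux_proxy.mean()) / flux_proxy.std()
rate = rate + rng.normal(0.0, 0.0005, size=days.size)

corr = np.corrcoef(flux_proxy, rate)[0, 1]
print(f"Pearson correlation with 1/r^2: {corr:.3f}")
```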

Later in the same month, another observation was made, and the same thing happened: a decrease in the 54Mn decay rate, which supports the idea that solar flares do in fact affect the rate of nuclear decay! Nor is this the first time scientists have noticed such an effect.

According to Science News, this wasn’t the first time such variations had been observed; a similar incident had been recorded before…

“In a separate paper, also posted online in August, Fischbach, Jenkins and their collaborators compared puzzling and still unexplained results from two separate experiments from the 1980s—one on silicon-32 at the Brookhaven National Laboratory in Upton, N.Y., and the other on radium-226 done at the PTB, an institute that sets measurement standards for the German federal government. Both experiments had lasted several years, and both had seen seasonal variations of a few tenths of a percent in the decay rates of the respective isotopes.”

Even a change of less than half a percent would cause many scientists to rethink the whole concept of half-life. Some have even suggested that textbooks would have to be rewritten in light of the new data.
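
For readers who want “half-life” spelled out, below is a short sketch of the standard decay law N(t) = N₀e^(−λt), where λ = ln 2 / t½, using manganese-54’s roughly 312-day half-life. The 0.3% shift in λ is an illustrative figure in the few-tenths-of-a-percent range quoted above, not a measured value.

```python
import math

def remaining_fraction(t, half_life):
    """Fraction of a radioactive sample left after time t (same units as half_life)."""
    lam = math.log(2) / half_life  # decay constant: lambda = ln 2 / half-life
    return math.exp(-lam * t)

t_half = 312.0  # manganese-54 half-life, roughly 312 days
print(remaining_fraction(t_half, t_half))  # 0.5, by definition of half-life

# What an illustrative 0.3% change in the decay constant would do:
lam = math.log(2) / t_half
for shift in (0.0, 0.003):
    frac = math.exp(-lam * (1 + shift) * t_half)
    print(f"lambda shifted by {shift:.1%}: fraction left after 312 d = {frac:.5f}")
```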

“…The sun constantly emits neutrinos, subatomic particles produced in the nuclear reactions that power the sun. Neutrinos can move through the entire planet without being stopped, so the sun could affect radioactivity day and night.

The closer to the sun, the denser the shower of neutrinos. Or the sun may emit fewer neutrinos during a solar flare, which would explain the December 2006 event.”
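
The inverse-square argument in the quote is easy to check with back-of-the-envelope arithmetic: Earth sits at roughly 0.9833 AU from the sun at perihelion (early January) and 1.0167 AU at aphelion (early July), so a 1/r² flux should swing by about 7% peak to peak over the year, far larger than the few-tenths-of-a-percent decay variations reported.

```python
# Earth-Sun distances at the two orbital extremes, in astronomical units.
PERIHELION_AU = 0.9833  # early January
APHELION_AU = 1.0167    # early July

# An inverse-square flux is highest at perihelion and lowest at aphelion.
flux_ratio = (APHELION_AU / PERIHELION_AU) ** 2
print(f"Flux at perihelion / flux at aphelion: {flux_ratio:.4f}")  # ~1.069
```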

Scientists are now examining old data to see whether any effects are connected with solar flares. According to Science News, there is no theory that would have predicted a discovery like this.

One possible reason no theory predicted a change in radioactive decay is that decay rates were believed to be incapable of changing. The carbon-14 dating method, for one, is based on radioactive decay.

Generally, evolutionists argue that the carbon-14 dating method is precise and reliable based on the assumption that the radioactive decay rate cannot be altered. We now know that isn’t the case, and new evidence exists to back that up, which makes their assumption outdated rather than fact-based. I believe more studies should be done in this area to get a better grasp on the effect the sun has on radioactive decay.
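
To see how sensitive radiocarbon dates would be to such a change, here is a hedged sketch assuming the textbook age formula t = ln(N₀/N) / λ and the conventional 5,730-year half-life; the 0.5% shift in λ and the 25% remaining-carbon figure are illustrative numbers, not measurements. Since the age scales as 1/λ, a 0.5% change in the decay constant shifts an inferred age by roughly 0.5%.

```python
import math

C14_HALF_LIFE = 5730.0  # years, the conventional carbon-14 half-life
lam = math.log(2) / C14_HALF_LIFE  # decay constant

ratio = 0.25  # hypothetical sample retaining 25% of its original carbon-14
for shift in (0.0, 0.005):  # nominal decay constant vs. one raised by 0.5%
    age = math.log(1.0 / ratio) / (lam * (1.0 + shift))
    print(f"lambda shifted by {shift:.1%}: inferred age = {age:,.0f} years")
# Two half-lives is about 11,460 years nominally, about 11,403 with the shift.
```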


One thought on “Does The Sun Change The Rate of Radioactive Decay?”

  1. One thing to note is that altering the carbon-14 decay rate is not that special. This has been done before, though I can’t remember the reference. However, C14 is a beta decay process. The ones assumed to be more static are alpha decay processes. The “long-age” measurements are all done by alpha decay processes, which have been shown to be much more stable than beta decay. That’s what makes these measurements so interesting.
