Radioactive Dating

Geologists use radioactive dating techniques to estimate the age of rocks and fossils. These techniques are based on the concept that rocks contain a small concentration of a radioactive substance which decays at a fixed rate. As it decays, the concentration of radioactive atoms decreases, and the level of emitted radiation decreases in proportion. The level of radiation in a rock or fossil should therefore (supposedly) provide a measure of its age.
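For illustration, the decay described here follows the standard exponential law (a textbook formula, not tied to any particular dating method):

\[ N(t) = N_0 \, e^{-\lambda t} \]

where N_0 is the initial number of radioactive atoms, λ is the decay constant of the substance concerned, and N(t) is the number remaining after a time t. The emitted radiation (the activity) is proportional to N(t), which is why it falls as the substance decays.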

I remember being taught about radioactive dating by my physics teacher at school, long before I had ever considered questions of creation versus evolution. Even then I asked myself how one could possibly know how much radioactivity was in the substance to start with.

A living organism (plant or animal) constantly interacts with its environment, and one can reasonably assume that the level of radioactivity in the organism is dictated by the corresponding levels in its surroundings. On fossilisation, this interaction with the environment ceases and an observable decay process then sets in. If we know the levels of radiation in the environment at the outset, then theoretically we can determine the age of a rock or fossil.
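To sketch how such a determination would work in principle (using the same textbook decay law as above, with the activity A of a sample taken as proportional to its remaining radioactive content), comparing the measured activity A with the assumed initial activity A_0 gives the age as

\[ t = \frac{1}{\lambda} \ln\!\left(\frac{A_0}{A}\right) \]

Everything in this calculation hinges on the value assumed for A_0.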

It has to be stressed, however, that such techniques rest on the fundamental assumption that the levels of radioactivity in the environment have always been the same. Can we really be sure that this is the case?

If, for any reason, radiation levels were once lower than they are today, then rocks and fossils would appear to be older than they really are. This fits in with the Biblical account of creation and can be explained by the fact that major ecological changes would have taken place at the time of the Flood.
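The arithmetic behind this conditional claim can be sketched with the same formula as above: if the calculation assumes an initial activity A_assumed based on present-day levels, but the true initial activity A_true was lower, then

\[ t_{\text{inferred}} = \frac{1}{\lambda}\ln\!\left(\frac{A_{\text{assumed}}}{A}\right) \;>\; \frac{1}{\lambda}\ln\!\left(\frac{A_{\text{true}}}{A}\right) = t_{\text{actual}} \]

so the inferred age would overestimate the actual age.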