Radiometric dating is a method of determining the age of a rock or artifact by assuming that on average decay rates have been constant (see below for the flaws in that assumption) and measuring the amount of radioactive decay that has occurred. No geologist uses radioactive dating to do more than confirm dates established by the fossils using Darwin's theory of descent with modification.
One key assumption is that the initial quantity of the parent element (and of any daughter product already present) can be determined. Recognizing this problem, scientists try to focus on rocks that do not contain the decay product originally. For example, in uranium-lead dating, they use rocks containing zircon (ZrSiO4). Zircon and baddeleyite incorporate uranium atoms into their crystalline structure as substitutes for zirconium, but strongly reject lead.
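The age calculation described above can be sketched with the standard decay-law relation t = ln(1 + D/P) / λ, where D/P is the measured daughter-to-parent ratio and λ is the decay constant. This is a minimal illustration assuming a single parent-daughter pair with no initial daughter product; the function names are ours, and the uranium-238 half-life value is used only as an example.

```python
import math

def decay_constant(half_life):
    """Decay constant lambda from a half-life: lambda = ln(2) / t_half."""
    return math.log(2) / half_life

def radiometric_age(daughter_parent_ratio, half_life):
    """Age implied by a measured daughter/parent ratio,
    assuming constant decay and zero initial daughter product."""
    lam = decay_constant(half_life)
    return math.log(1.0 + daughter_parent_ratio) / lam

# Illustrative value: U-238 half-life is about 4.468 billion years.
U238_HALF_LIFE_YR = 4.468e9

# A sample with equal amounts of daughter and parent (D/P = 1)
# has passed exactly one half-life.
age = radiometric_age(1.0, U238_HALF_LIFE_YR)
```

Note that a sample with D/P = 1 yields an age of exactly one half-life, which is a useful sanity check on the formula.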
Another assumption is that the rate of decay is constant over long periods of time. Radiometric dating requires that the decay rates of the isotopes involved be accurately known, and that there is confidence that these decay rates are constant. The physical constants (nucleon masses, fine structure constant) involved in radioactive decay are well characterized, and the processes are well understood. The chemical environment can slightly alter the electron cloud around a nucleus: that is, electrons can move closer to or farther away from the nucleus depending on the chemical bonds. This affects the Coulomb barrier involved in alpha decay, and therefore changes the height and width of the barrier through which the alpha particle must tunnel. The effect of this on alpha decay, which is the most common decay mode in radiometric dating, is utterly insignificant.
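To see why a small perturbation of the decay rate barely matters, note that in the age equation t = ln(1 + D/P) / λ the computed age scales inversely with λ, so a fractional change in the decay constant produces the same fractional change in the age. The sketch below uses a hypothetical 0.1% perturbation, far larger than any chemically induced effect on alpha decay, purely for illustration.

```python
import math

def age(daughter_parent_ratio, lam):
    """Age from a daughter/parent ratio and decay constant lambda."""
    return math.log(1.0 + daughter_parent_ratio) / lam

# Illustrative decay constant from the U-238 half-life (~4.468e9 yr).
lam = math.log(2) / 4.468e9

base = age(1.0, lam)
# Hypothetical 0.1% increase in the decay rate, larger than any
# real chemical-environment effect on alpha decay.
perturbed = age(1.0, lam * 1.001)

# The relative shift in the computed age matches the 0.1% perturbation.
rel_change = abs(perturbed - base) / base
```

Since t is proportional to 1/λ, the relative change in age is about 0.1%, mirroring the perturbation: a vanishingly small shift in λ can only produce an equally vanishing shift in the date.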