Abstract
Low doses in the mGy range cause a dual effect on cellular DNA. One effect is DNA damage, which occurs with a relatively low probability per energy deposition event and increases in proportion to dose. At background exposure levels this damage to DNA is orders of magnitude lower than that from endogenous sources, such as reactive oxygen species. The other effect at comparable doses is adaptive protection against DNA damage from many, mainly endogenous, sources, depending on cell type, species and metabolism. Adaptive protection operates through DNA damage prevention and repair and through immune stimulation. It develops with a delay of hours, may last for days to months, decreases steadily at doses above about 100 mGy to 200 mGy and is no longer observed after acute exposures of more than about 500 mGy. Radiation-induced apoptosis and terminal cell differentiation also occur at higher doses and add to protection by reducing genomic instability and the number of mutated cells in tissues. At low doses, the reduction of damage from endogenous sources by adaptive protection may equal or outweigh radiogenic damage induction. Thus, the linear-no-threshold (LNT) hypothesis for cancer risk is scientifically unfounded and appears invalid; the evidence instead favours a threshold or hormesis. This is consistent with data from both animal studies and human epidemiological observations on low-dose-induced cancer. The LNT hypothesis should be abandoned and replaced by a hypothesis that is scientifically justified and causes neither unreasonable fear nor unnecessary expenditure.