Abstract
A century ago, at the birth of nutrition as a science, the prevailing medical paradigm held that all disease was caused by harmful external influences, principally bacteria or toxins. The idea that not eating something could make one sick was initially inconceivable. The recognition of the role of thiamin in preventing beriberi, and the discovery of niacin and vitamin A (among other micronutrients), forced a change in that view. But public health measures in the first half of the 20th century eradicated the most extreme of the vitamin deficiencies in the industrialized nations, and the physician's actual experience of deficiency disease dropped to near zero. Perhaps as a result, the medical profession's approach to nutrition today is still dominated by the external agent paradigm, as witnessed in the national campaigns against cholesterol, saturated fat, and salt. Those who take seriously the continuing importance of deficiency per se are often derogated or relegated to the quackery fringe (1). The result, at the very least, is inattention to the real deficiencies that may masquerade as other disorders, or that may simply go unrecognized altogether.
Keywords