The nutritional history of the mineral elements, unlike that of the vitamins, had little early focus on humans. Rather, it was domestic livestock eating forage from mineral-poor soils that exhibited deficiency symptoms (thought at first to be due to toxicity). Typical signs were the crimping of wool in sheep, aortic rupture in pigs and cattle, and loss of myelin in the brains of newborn lambs. Symptoms were sharply lessened by supplementing the feed with salts of metal ions such as CuSO4, Fe(NH4)2(SO4)2, and ZnCl2. Reversing symptoms and reestablishing optimal growth in livestock provided the first evidence for essential metals. In time, biochemical studies led to the isolation of enzymes that required metal ions for function, and soon after that specific enzymes could be linked to the deficiency symptoms. Metal ion interactions were viewed as valuable as well as detrimental to the system. An early study by Hart et al. showed that copper potentiated the effects of iron in alleviating anemia in laboratory rats fed milk-based diets. That observation was repeated in chicks and pigs and soon attracted the attention of clinicians, who adopted a similar bimetal protocol in the treatment of anemic humans. Coupled with the advent of semipurified diets in that same era, the science of nutrition stood on the threshold of major discoveries about the roles of the essential mineral elements.