The Contribution of Drinking Water to Mineral Nutrition in Humans
GENERAL CONSIDERATIONS OF MINERAL INTAKE FROM WATER
The initial undertaking of the first Safe Drinking Water Committee (SDWC) was the identification of substances and their concentrations in the nation's water supply that might pose risks to the public health and, therefore, require the setting of limits. The committee's report, Drinking Water and Health (National Academy of Sciences, 1977) contained gaps for which data were not available or were just emerging at the time the report was written. In other cases, the data were not reviewed in depth because the specific substances were not considered pertinent to the initial charge of the committee, i.e., identification of adverse consequences of various substances in water.
One such area was that of nutrients, known to be essential or strongly suspected as being necessary for optimal health of humans and animals. While a few of the nutrients, notably the trace elements, were reviewed in the first report, the coverage was generally toxicological. The committee examined them as sources of potential risk to human populations.
In view of these considerations, the second Safe Drinking Water Committee established a Subcommittee on Nutrition and charged it with the responsibility of reviewing this area by selecting elements of interest and evaluating the effects of their presence in water. In this report, the subcommittee has examined the concentrations of nutrients in drinking water and the contribution of these concentrations to the observed intake and optimal nutrient requirements of human populations. It studied the benefits of the presence of an element in water and, in cases in which symptoms of both deficiency and toxicity are known to occur, adverse effects. This is a departure from most of the studies of the SDWC conducted previously or in progress, which were or are limited to adverse effects. The subcommittee chose to title this review The Contribution of Drinking Water to Mineral Nutrition in Humans, focusing on the positive effects of suites of elements that are known or assumed to interact in the environment or in biological systems.
In Drinking Water and Health (National Academy of Sciences, 1977), the committee reviewed eight metals (chromium, cobalt, copper, magnesium, manganese, molybdenum, tin, and zinc) that are essential to human nutrition. The nutritional aspects of others, such as nickel, selenium, arsenic, and vanadium, were not considered. Rather, their toxicity was reviewed. In this study the subcommittee has reviewed potassium, chloride, iron, calcium, phosphorus, and silicon, and has extended the original review only where there was a need for updating or for examining a particular element as a nutrient as opposed to a potentially toxic substance. In the section on fluoride, the subcommittee decided against including an in-depth review because of its uncertainty concerning fluoride's essentiality to nutrition. However, in view of the contribution of fluoride to overall dental health and, through this, its effect on total health, some discussion of fluoride has been included.
Chromium has not been dealt with at great length because it is not certain that the nutritionally useful form of the element occurs in water. It is generally thought that cobalt has nutritional value only as a component of vitamin B12. Although some preliminary studies suggest that inorganic cobalt may have a physiological role independent of its function in vitamin B12 (Roginski and Mertz, 1977), cobalt has not been discussed in this chapter.
The subcommittee also examined the difference in water intake between young and adult humans. Infants (7 kg) consume approximately one-third as much water on the average as an adult, but their body weight is only approximately one-tenth of adult weight and their food intake is also obviously lower. For this reason, the water intake of an infant may contribute a significant quantity of a given element (National Academy of Sciences, 1974).
When people consume unusual diets, e.g., the diets of vegans, who consume no animal foods or dairy products, the intake of certain elements may be significantly different from the average. Athletes or people engaged in heavy labor and those living in a hot climate consume larger than normal amounts of water. In these instances, the contribution of water to the overall nutrient intake may be significantly different from the average.
The contributions from air have been considered only when amounts of possible significance were suspected. Where such contributions were negligible, no comment has been made. Only in rare instances, such as unusually high airborne levels, might air contribute to the nutrient needs of individuals. It is not always known whether elements taken in from such exposures are used for nutritional (metabolic) purposes.
Requirements for nutrients are generally discussed in terms of the recommended dietary allowances (RDA's) (National Academy of Sciences, 1974) or those intakes that have been judged adequate and safe (National Academy of Sciences, 1980), not minimal intakes necessary for survival.
The subcommittee examined the new literature on water hardness because it involves nutritionally essential elements. However, it found no significant conclusive data concerning the relationship of water hardness and the incidence of cardiovascular disease since Drinking Water and Health (National Academy of Sciences, 1977) was published. An extensive evaluation of the literature in this area has recently been published (National Academy of Sciences, 1979). Therefore, this topic is not covered in this report.
Clearly, some elements that have been reviewed are subject to changes in concentration in water because of the activities of humans. Elements in this category are zinc, copper, molybdenum, tin, manganese, nickel, and vanadium. These may require somewhat closer surveillance than elements such as magnesium whose concentrations in water appear to be little affected by human activity.
The subcommittee believes that a study of the contribution of drinking water to mineral nutrition in humans is essential in a balanced appraisal of drinking water. It also believes that the data in this review are up-to-date and accurate and that they should help those charged with evaluating the nutritional value of drinking water in the United States.
Most information on the mineral composition of water has been gathered from large water-supply systems. In 1975, approximately 35.7 million water consumers (16.7% of the population) were served by systems supplying fewer than 25 persons. The minerals in water from smaller systems and individual supplies, e.g., wells, may exceed the concentrations in large water supplies, which form the basis for most levels cited in this report. Therefore, the potential contributions of water to nutrient intake that are given below must not be taken as absolute limits.
The interplay between mineral elements and nutrition is exceedingly complex. In this report, it has been considered in light of the best available knowledge, but it should be remembered that this knowledge is still incomplete.
Presence in Food and Water
Dairy products provide the largest source of calcium in the American diet. Table V-1 lists calcium concentrations for some of these products and other foods (Davidson et al., 1975).
In a survey of U.S. surface waters from 1957 to 1969, calcium levels ranged from 11.0 to 173.0 mg/liter (mean, 57.1 mg/liter) for 510 determinations (National Academy of Sciences, 1977). Finished water that was sampled in public water supplies for the 100 largest cities in the United States contained almost as much calcium (range, 1-145 mg/liter). The calcium concentrations in 93% of the city supplies were less than 50 mg/liter (Durfor and Becker, 1964). Similar results were reported in a Canadian study (Neri et al., 1977). Zoeteman and Brinkmann (1977) reported that the public water supplies for 21 large European cities contained between 7 and 140 mg/liter (mean, 85 mg/liter).
The daily intake of calcium for most western adult populations averages between 500 and 1,000 mg (Walker, 1972). The U.S. Health and Nutrition Examination Survey estimated calcium intakes for 20,749 people from 1 to 74 years old, and concluded that the only population segment with an intake appreciably (30%-40%) below the recommended daily allowance was the adult black female. The allowance values used in this survey were 450 mg for children aged 1 to 9 years, 650 mg for ages 10 to 16 years, 550 mg for ages 17 to 19 years, 400 mg for men 20 years and older, 600 mg for women 20 years or older, 800 mg for pregnant women, and 1,100 mg for lactating women (Abraham et al., 1977).
The amount of calcium required by the body daily and the level of dietary calcium needed to meet this requirement are controversial issues. Healthy individuals accustomed to low-calcium diets appear to do as well as similar individuals accustomed to high calcium intakes. To some extent, the daily calcium allowances recommended by various international agencies reflect the calcium levels of normal local diets. In the United States, the Food and Nutrition Board of the National Research
TABLE V-1 Calcium Concentrations in Foods and Foodstuffsa
Food Item    Concentration, mg/100 g or mg/100 ml
a Data from Davidson et al., 1975.
Council (National Academy of Sciences, 1974) has recommended daily calcium intakes of 800 mg/day for adults on the basis that the daily excretion of calcium is 320 mg and that only 40% of dietary calcium is absorbed by the average American. However, the excretion rate and absorption percentage can vary with age and physiological state. The recommended dietary allowances (RDA) of calcium for Americans, then, are 360 mg for infants less than 6 months old, 540 mg for 6- to 12-month-old infants, 800 mg for children aged 1 to 10 years, 1,200 mg for 11- to 18-year-old children, and 800 mg for individuals 19 years and older. During pregnancy and lactation the RDA is increased to 1,200 mg/day.
Toxicity Versus Essential Levels
There is no clearly defined calcium deficiency syndrome in humans. This may be due, in part, to an adaptation in calcium absorption and utilization which varies with calcium intake. In a study of 26 male prisoners ranging in age from 20 to 69 years, Malm (1958) observed that 23 of them achieved calcium balance immediately or within several months after restricting their calcium intakes from 650 or 930 mg/day to approximately 450 mg/day.
The etiology of osteoporosis, a degenerative disease involving loss of bone calcium, is not clear, but prolonged inadequate intakes of calcium may play an integral role. Diets that were deficient in both calcium and vitamin D caused rickets and osteoporosis to develop in rats 6 weeks after they had been started on the diet at weaning. Osteoporosis was reversed when the rats were given a high-calcium diet that still lacked vitamin D (Gershon-Cohen and Jowsey, 1964). When the animals were 2 months old before receiving the low calcium, vitamin-D-deficient diet, osteoporosis resulted without rickets.
Osteoporosis affects a large portion of older people and is most prevalent in older women. Calcium supplements that were given to osteoporosis patients for 2 years did not appear to reverse the calcium loss from bone (Shapiro et al., 1975).
Hypocalcemia due to impaired alimentary absorption of calcium in newborn children can result in tetany, consisting of twitches and spasms (Davidson et al., 1975, p. 645).
Calcium is relatively nontoxic when administered orally. There have been no reports of acute toxicity from the consumption of calcium contained in various foods. Peach (1975a) indicated that calcium intakes in excess of 1,000 mg/day when coupled with high vitamin D intakes can raise blood levels of calcium. An excess of 1,000 mg/day (2.5 times the RDA) for long periods can depress serum magnesium levels. Diets that are high in calcium have also produced symptoms of zinc deficiency in rats, chickens, and pigs after prolonged feeding. Kidney stones in humans have been associated with high calcium intakes (Hegsted, 1957).
Low calcium intakes increase the rat's susceptibility to lead poisoning (Snowdon and Sanderson, 1974), while high intakes of calcium decrease lead absorption from the intestine (Kostial et al., 1971). Recent studies in young children have associated high blood levels of lead with low dietary intakes of calcium. Mahaffey and coworkers (1976) observed that 12- to 47-month-old children with normal concentrations of lead (<0.03 mg/100 ml) in their blood had higher levels of dietary calcium (and phosphorus) than did matched children with elevated (>0.04 mg/100 ml) lead levels in their blood. Dietary calcium intake was not reported. Sorrel and coworkers (1977) found concentrations of lead and calcium in blood inversely correlated in control and lead-burdened children aged 1 to 6 years. For children with high concentrations of lead (>0.06 mg/100 ml blood), average daily calcium intakes were 610 ± 20 mg, while children with blood lead concentrations <0.03 mg/100 ml had average daily calcium intakes of 770 ± 20 mg. Itokawa et al. (1974) suggested that the bone pain in itai-itai disease in Japan was causally related to diets low in calcium and protein coupled with cadmium poisoning. Low calcium intakes increase the intestinal absorption of cadmium and the deposition of cadmium in bone and soft tissue (Pond and Walter, 1975). Furthermore, cadmium inhibits the synthesis of 1,25-dihydroxycholecalciferol by renal tubules (Suda et al., 1973). This hormone facilitates intestinal absorption of calcium (Suda et al., 1974), an especially important function when calcium intake is low. The same or highly similar mechanisms may control the absorption of calcium and magnesium into the bloodstream and their deposition into tissues.
Contribution of Drinking Water to Calcium Nutrition
Using an average calcium concentration in public water supplies of 26 mg/liter and a maximum of 145 mg/liter (Durfor and Becker, 1964) and assuming that the average adult drinks 2 liters of this water daily, then the drinking water could contribute an average of 52 mg/day and a maximum of 290 mg/day. On an average basis this would represent 5% to 10% of the usual daily intake or approximately 6.5% of the adult RDA. For hard waters with high calcium levels, the water would contribute approximately 29% to 58% of the usual daily intake or approximately 36% of the adult RDA. Thus, public drinking water generally contributes a small amount to total calcium intake, but in some instances it can be a major contributor.
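The arithmetic above can be sketched as a short calculation. The function name and structure are ours; the concentrations, the 800-mg adult RDA, and the 2-liter daily intake assumption are taken from the text:

```python
def water_mineral_contribution(conc_mg_per_liter, rda_mg, liters_per_day=2.0):
    """Return (mg/day supplied by drinking water, percent of the RDA)."""
    mg_per_day = conc_mg_per_liter * liters_per_day
    return mg_per_day, 100.0 * mg_per_day / rda_mg

# Calcium: average (26 mg/liter) and maximum (145 mg/liter) public supplies,
# measured against the 800-mg adult RDA
avg_mg, avg_pct = water_mineral_contribution(26, rda_mg=800)   # 52 mg/day, 6.5%
max_mg, max_pct = water_mineral_contribution(145, rda_mg=800)  # 290 mg/day, 36.25%
```

The same sketch applies to any mineral discussed in this report, by substituting the appropriate concentration and RDA.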
Current levels of calcium in U.S. drinking water are well below levels that pose known risks to human health. No upper limit for calcium need be set to protect public health. In cases of dietary calcium deficiencies, the presence of this element in drinking water may provide nutritional benefit.
Presence in Food and Water
Schroeder and coworkers (1969) measured the magnesium contents of a variety of foods and foodstuffs using atomic absorption spectrophotometry. On a wet weight basis, spices, nuts, and whole grains had the highest magnesium contents, and refined sugars, human milk, oils, and fats had the lowest. The food data are summarized in Table V-2.
Magnesium and calcium are responsible for most of the hardness of drinking water. In a nationwide study in Canada, the mean concentration of magnesium in finished water before it entered the distribution systems was 10.99 mg/liter. This concentration changed little during distribution (Neri et al., 1977). In the United States, the mean concentration of magnesium in public water supplies in 100 cities was 6.25 mg/liter (range, 0-120 mg/liter). The concentration of magnesium in 96% of the water supplies was <20 mg/liter (Durfor and Becker, 1964). From 1957 to 1969, the average magnesium concentration in U.S. surface waters was 14.3 mg/liter (range, 8.5-137 mg/liter) for 1,143 determinations (National Academy of Sciences, 1977).
In the United States, the average adult ingests between 240 and 480 mg of magnesium daily (Wacker et al., 1977). Approximately 60% to 70% of this is excreted in the feces. The British diet is reported to provide 200 to 400 mg of magnesium daily (Davidson et al., 1975).
The daily need for dietary magnesium is a function of the amounts of calcium, potassium, phosphate, lactose, and protein consumed. For the average healthy American on an average diet, the daily magnesium intake recommended by the Food and Nutrition Board of the National Research Council (National Academy of Sciences, 1974) is 60 mg for infants less than 6 months old, 70 mg for 6- to 12-month-old infants, 150 mg for 1- to 3-year-old children, 200 mg for 4- to 6-year-old children, 250 mg for 7- to 10-year-old children, and 300 mg for females 11 years and older. For adolescent and adult males the recommended dietary allowances (RDA's) are 350 mg for ages 11 to 14, 400 mg for ages 15 to 18 years, and 350 mg for those 19 years of age and older. The RDA for pregnant and lactating women is 450 mg.
TABLE V-2 Magnesium Concentrations in Foods and Foodstuffsa
Food Item                      Range          Mean
Condiments, spices             230-4,225      2,598
Nuts                           1,078-3,175    1,970
Grains and cereal products     18-2,526       805
Fish and seafood               154-532        348
Meats                          195-402        267
Vegetables, fresh legumes      185-297        241
Fresh roots                    75-478         226
Fresh fleshy                   66-487         174
Fresh leafy                    85-321         170
Dairy products, eggs           102-270        183
Fruits and juices              102-270        78
Sugars and syrups              0.1-108        59
Milk, human                    28-29          29
Oils and fats                  1-27           7
Coffee                         48             NRb
Tea                            3-11           NR
Whisky, gin                    0.3-4.5        NR
Wine, white                    98             NR
Beer                           100            NR
Vermouth, Italian              135            NR
a Data from Schroeder et al., 1969.
b NR, not reported.
Toxicity Versus Essential Levels
Despite several studies, magnesium deficiency in humans is still not well defined, primarily because it has been studied in individuals also suffering from other metabolic and physiological disorders. Electrolyte imbalance, especially for calcium and potassium, is characteristic of magnesium deficiency.
Magnesium deficiency is most often observed in patients with gastrointestinal diseases that lead to malabsorption and in those with hyperparathyroidism, bone cancer, aldosteronism, diabetes mellitus, and
thyrotoxicosis (Wacker and Parisi, 1968). Alcohol can deplete magnesium levels in heavy drinkers, apparently by increasing renal loss. These heavy drinkers show extensive neuromuscular dysfunction such as tetany, generalized tonic-clonic and focal seizures, ataxia, vertigo, muscular weakness, tremors, depression, irritability, and psychotic behavior. By giving them magnesium, these dysfunctions can be reversed (Wacker and Parisi, 1968).
In the rat, prolonged magnesium deficiency retards growth and results in loss of hair, skin lesions, edema, and degeneration of the kidney (Kruse et al., 1932).
Because magnesium is rapidly excreted by the kidney, it is unlikely that magnesium in food and water is absorbed and accumulated in tissues in sufficient quantities to induce toxicity. Magnesium salts are used therapeutically as cathartics, e.g., magnesium sulfate (MgSO4), hydroxide [Mg(OH)2], and citrate [Mg3(C6H5O7)2]; as antacids, e.g., magnesium hydroxide, carbonate (MgCO3), and trisilicate (Mg2Si3O8); and as anticonvulsants to control seizures associated with acute nephritis and with eclampsia of pregnancy (magnesium sulfate). In patients with renal disease and impaired magnesium excretion, large excesses of magnesium can lead to severe toxicity resulting in muscle weakness, hypotension, sedation, confusion, decreased deep tendon reflexes, respiratory paralysis, coma, and death. At plasma concentrations exceeding 9.6 mg/100 ml (8 mEq/liter), central nervous system depression is evident. Anesthesia is reached near 12 mg/100 ml (10 mEq/liter), and paralysis of skeletal muscle can be produced at plasma concentrations of approximately 18 mg/100 ml (15 mEq/liter) (Peach, 1975a). Normal values are 1 to 3 mg/100 ml (0.8 to 2.5 mEq/liter). Calcium ameliorates magnesium toxicity.
The interactions of trace elements in nutrition were reviewed in Drinking Water and Health (National Academy of Sciences, 1977). The metabolism of magnesium is tied closely to that of calcium and potassium. Magnesium deficiency results in potassium loss, probably due to the interaction of magnesium and phosphate in the active transport of potassium and sodium across cell membranes. The release of parathyroid hormone, calcitonin, and 1,25-dihydroxycholecalciferol, which are hormones that govern calcium and phosphorus metabolism, is reduced
by lowered magnesium intakes. The mechanism for this reduction is not understood.
Contribution of Drinking Water to Magnesium Nutrition
Using the magnesium concentrations reported by Durfor and Becker (1964) for U.S. drinking waters (median, 6.25 mg/liter; maximum, 120 mg/liter), a daily intake of 2 liters of drinking water would supply an average of approximately 12 mg of magnesium and a maximum of up to 240 mg. For Canadian (Neri et al., 1977) and Western European (Zoeteman and Brinkmann, 1977) drinking waters the daily contribution would be approximately 20 and 24 mg of magnesium, respectively. Therefore, typical drinking water in the United States, Canada, or Europe provides approximately 3% to 7% of the RDA for magnesium intake by a healthy human. In areas where the magnesium concentration is high, over 50% of the RDA could come from 2 liters of water (see Table V-32). Thus, drinking water could provide a nutritionally significant amount of magnesium for individuals consuming a diet that is marginally deficient in magnesium, especially in areas where the magnesium concentration in water is high.
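The percentages above follow directly from the reported concentrations, the 2-liter intake assumption, and the adult RDA range of 300 to 400 mg; a minimal sketch (the function name is ours):

```python
LITERS_PER_DAY = 2.0

def pct_of_rda(conc_mg_per_liter, rda_mg):
    """Percent of an RDA supplied by 2 liters of drinking water daily."""
    return 100.0 * conc_mg_per_liter * LITERS_PER_DAY / rda_mg

# U.S. median of 6.25 mg/liter (Durfor and Becker, 1964)
us_low  = pct_of_rda(6.25, 400)   # ~3.1% of the 400-mg adult male RDA
us_high = pct_of_rda(6.25, 300)   # ~4.2% of the 300-mg adult female RDA
# A high-magnesium supply at the 120 mg/liter maximum
hard    = pct_of_rda(120, 400)    # 60%, well over half the RDA from water alone
```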
Current levels of magnesium in U.S. drinking water supplies appear to offer no threat to human health, and no upper limit for magnesium concentrations needs to be set to protect public health. For individuals consuming a magnesium-deficient diet, the presence of this element in drinking water may provide nutritional benefit.
Presence in Food and Water
Phosphorus, in the form of phosphate, is common to most foods and foodstuffs. In foods of plant origin, phosphorus concentrates in seeds. Nuts, beans, and whole grains contain high levels of phosphorus, whereas leafy vegetables contain low levels. Fruits contain little phosphorus, but meat and fish are relatively rich in the mineral. Table V-3 summarizes the phosphorus contents of some of the foods listed by Sherman (1952).
Data collected on the average daily consumption of soft drinks in the United States are summarized in Table V-4. The estimates for phosphorus intakes from soft drinks indicate that such products contribute little to the phosphorus intakes of the general public. Bell and coworkers (1977) indicated that high-phosphorus diets might include as much as 100 mg of phosphorus per day from soft drinks for adults.
The average daily intake of phosphorus in the United States and the United Kingdom is approximately 1,500 mg (Davidson et al., 1975, p.645; National Academy of Sciences, 1974). Approximately 70% of the ingested mineral is absorbed as the free phosphate (Hegsted, 1973).
Most municipal drinking waters contain little phosphorus. Using spectrographic analysis, Durfor and Becker (1964) determined that 92% of the public water supplies of the 100 largest U.S. cities had undetectable levels of phosphorus. Zoeteman and Brinkmann (1977) reported that the public water supplies of 12 large cities in Europe had a mean phosphate concentration of 0.32 mg/liter (0.10 mg of phosphorus) and a maximum of 3.0 mg/liter (1.0 mg of phosphorus).
A survey of U.S. rivers and lakes from 1962 to 1967 indicated that 747 of the 1,577 water samples that were analyzed contained phosphorus. The mean concentration of phosphorus was 0.12 mg/liter, and the maximum was 5.04 mg/liter (Kopp and Kroner, 1967).
TABLE V-3 Phosphorus Concentrations in Foods and Foodstuffs
Food Item        Concentration, mg/100 g
Cheese, hard     610
Beans, dry       128-586
Grains, whole    303-405
TABLE V-4 Estimates of Phosphorus Intakes from Soft Drink Consumption by Age Group
Age, yr   Average Soft Drink Consumption, ml/daya   Average Cola Consumption, ml/dayb   Intake from Cola Drinks, mg/dayc
1-2       68         42         3
3-5       111        69         5
6-8       144        89         7
9-11      165/167    102/104    8
12-14     187/229    116/142    9/11
15-17     240/285    149/177    11/13
18-19     246/314    153/195    12/15
20-34     191/229    118/142    9/11
35-54     99/115     61/71      5
55-64     65/75      40/47      3/4
65-74     41/46      25/29      2
75+       27/38      17/24      1/2
a Values for ages over 9 years are average values for females (lower figures) and males (higher figures). Data from U.S. Department of Agriculture, 1965.
b Assuming that approximately 62% of all soft drinks were colas. Based on a report by the National Soft Drink Association, 1978.
c Assuming the average concentration of phosphorus in colas is 0.076 mg/g drink. Based on a report by Houston and Levy, 1975.
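The estimates in Table V-4 follow from the two footnote assumptions (62% cola share; 0.076 mg of phosphorus per gram of cola). A minimal sketch of the calculation, assuming a cola density of about 1 g/ml (our assumption; the function name is also ours):

```python
COLA_FRACTION = 0.62   # footnote b: ~62% of soft drinks consumed are colas
P_MG_PER_G    = 0.076  # footnote c: phosphorus content of cola, mg per gram
G_PER_ML      = 1.0    # assumed density of cola, ~1 g/ml

def phosphorus_from_soft_drinks(soft_drink_ml_per_day):
    """Estimated daily phosphorus (mg) from the cola share of soft drink intake."""
    cola_g = soft_drink_ml_per_day * COLA_FRACTION * G_PER_ML
    return cola_g * P_MG_PER_G

round(phosphorus_from_soft_drinks(68))   # ages 1-2: 3 mg/day
round(phosphorus_from_soft_drinks(314))  # males aged 18-19: 15 mg/day
```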
Except for the infant, the daily allowance of phosphorus recommended by the National Research Council (National Academy of Sciences, 1974) is the same as that for calcium. As long as the diet contains sufficient vitamin D, the ratio of calcium to phosphorus can vary considerably. However, the ratio for the infant should be close to 1.5:1 to guard against the possible occurrence of hypocalcemic tetany during the first weeks of life (Mizrahi et al., 1968). The recommended dietary allowance (RDA) is 240 mg of phosphorus for infants less than 6 months old and 400 mg for infants between 6 and 12 months old; for children aged 1 to 10 and adults 19 years or more, the RDA is 800 mg. Children between the ages of 11 and 18 years and pregnant and lactating women should consume 1,200 mg phosphorus daily (National Academy of Sciences, 1974).
Toxicity Versus Essential Levels
Dietary deficiency of phosphorus is not known to occur in humans because of the widespread presence of the mineral in foods. Excessive use of nonabsorbable antacids can induce phosphorus depletion, which causes weakness, anorexia, and bone pain. Familial hypophosphatemia is attributed to defective absorption of the phosphate ion (PO4) from the intestine or to defective reabsorption from the renal tubules. It is characterized by rickets and dwarfism (Glorieux et al., 1972; Short et al., 1973). There may also be decreased concentrations of erythrocyte adenosine triphosphate (ATP) and 2,3-diphosphoglycerate. In severe hypophosphatemia acute hemolytic anemia can also occur (Jacob and Amsden, 1971; Lichtman et al., 1969).
Sodium orthophosphate (Na3PO4) is poorly absorbed and relatively nontoxic. Acute iatrogenic poisoning with inorganic pyrophosphate (Na4P2O7) or metaphosphate (Na4P4O12) salts can inhibit calcium utilization and produce nausea, diarrhea, gastrointestinal hemorrhages and ulcerations, and cellular damage in the kidney and the liver. Mazess and Mather (1974) suggested that the high phosphate content of the diet of Eskimos may contribute to the development of osteoporosis, but this has not been confirmed. Rats fed a 5% phosphorus (as NaH2PO4) diet for 20 to 30 days develop renal damage. This level is 10 times the level of dietary phosphorus thought to be necessary for adequate nutrition for the rat (Duguid, 1938).
Cations that form insoluble phosphates interfere with the absorption of phosphorus. For example, high intakes of aluminum decrease absorption of phosphorus (as phosphate) by forming insoluble aluminum phosphate (AlPO4) and increasing the excretory loss of phosphorus (Ondriecka et al., 1971).
Contribution of Drinking Water to Phosphorus Nutrition
Because public drinking waters contain little phosphorus (µg/liter concentrations) and because foods provide more than 1 g of phosphorus per day, it can be concluded that phosphorus levels in drinking water contribute only negligibly to human requirements for this mineral.
There is no nutritional basis for the regulation of phosphorus levels in U.S. drinking water supplies.
Scientific issues relating to fluoride in drinking water have been adequately defined in Drinking Water and Health (National Academy of Sciences, 1977). Fluoride is included in the present report only to provide complete coverage of the nutritional aspects of drinking water.
Presence in Food and Water
Fish (especially bones) and fish products are often high in fluoride. Tea is high in fluoride (a few hundred mg/kg), and two-thirds of the mineral is extracted into the infusion (Harrison, 1949). Cholak (1959) reported the fluoride concentrations in fresh foods (Table V-5).
In 1962, most public water supplies of the 100 largest U.S. cities contained fluoride, according to a survey reported by Durfor and Becker (1964). Ninety-two percent of these supplies contained less than 1 mg/liter (median, 0.4 mg/liter; maximum, 7.0 mg/liter). Of the 969 water supplies sampled in the Community Water Supply Survey of the Public Health Service (U.S. Department of Health, Education, and Welfare, 1969), the fluoride contents ranged from 0.2 to 4.40 mg/liter. Fleischer and colleagues (1974) reported the fluoride contents of a variety of natural waters: rivers contained 0.0 to 6.5 mg/liter; lakes contained up to 1,627 mg/liter; various groundwaters contained 0.0 to 35.1 mg/liter; and seawater had an average concentration of 1.2 mg/liter.
Osis et al. (1974) determined fluoride intakes for a variety of diets in Chicago with and without fluoridation of the drinking water supply. They reported that the average daily intake of fluoride was 1.6 to 1.9 mg when the drinking water was fluoridated and approximately half this when the water was not. These values do not include the contribution made by the consumption of drinking water directly but do include that added by water used for cooking. In other areas in the United States, dietary intakes of fluoride ranged from 1.73 to 3.44 mg/day, and intakes of fluoride from water ranged from 0.53 to 1.27 mg/day. In four
TABLE V-5 Fluoride Concentrations in Foods and Foodstuffsa
a Data from Cholak, 1959.
unfluoridated areas, the diet contributed 0.78 to 1.03 mg of fluoride per day and the drinking water added 0.08 to 0.44 mg/day (Kramer et al., 1974).
Wiatrowski et al. (1975) reported that the total daily fluoride intake was 0.32 mg for infants aged 1 to 4 weeks, 0.47 mg for ages 4 to 6 weeks,
0.57 mg for ages 6 to 8 weeks, 0.71 mg for ages 2 to 3 months, 1.02 mg for ages 3 to 4 months, and 1.23 mg for infants between the ages of 4 and 6 months.
The Food and Nutrition Board of the National Research Council has not previously recommended a daily intake of fluoride (National Academy of Sciences, 1974), but has recently estimated adequate and safe intakes of 0.1 to 0.5 mg fluoride for infants less than 6 months of age, 0.2 to 1.0 mg for infants between 6 and 12 months, 0.5 to 1.0 mg for children between the ages of 1 and 3 years, 1.0 to 2.5 mg for 4- to 6-year-old children, 1.5 to 2.5 mg for children from 7 years to adulthood, and 1.5 to 4.0 mg for adults (National Academy of Sciences, 1980). These levels are considered to be protective against dental caries and osteoporosis (Mertz, 1972).
Toxicity Versus Essential Levels
Fluoride has not been shown unequivocally to be an essential element for human nutrition, except for its effectiveness in reducing the incidence of dental caries. Reports of the depression of growth in rats (Schwarz and Milne, 1972) and progressive infertility in mice (Messer et al., 1972, 1973) as consistent responses to fluoride-deficient diets have not been confirmed (Tao and Suttie, 1976; Wegner et al., 1976). The role of fluoride in dental health has been demonstrated in humans (Dean et al., 1941; Hodge, 1950). The incidence of dental caries was associated with low-fluoride diets, and inhibition of caries was observed in subjects who drank water containing >1.3 mg/liter of fluoride. As water intake varies with ambient temperature, so does the ingestion of fluoride. Thus, in warm climates a lower concentration of fluoride in the drinking water may be sufficient to reduce caries.
The fluoride concentration in drinking water is not by itself critical for caries protection; rather, what matters is the amount of fluoride consumed during the tooth-forming years. Because the uptake and deposition of fluoride are greatest before eruption and calcification of the teeth, its anticariogenic effect is greatest in children, especially those less than 8 years old.
The acute and chronic toxicity of fluoride in humans was reviewed in Drinking Water and Health (National Academy of Sciences, 1977).
Acute poisoning by fluoride is rare in humans. Peach (1975b) estimated that a lethal dose for an adult human is approximately 5 g as sodium fluoride (NaF). The response to ingested fluoride is swift. It acts directly on the gastrointestinal mucosa, causing vomiting, abdominal pain, diarrhea, convulsions, excessive salivation, and paresthesia. It also disrupts calcium-dependent functions.
Ingestion of drinking water containing excessive fluoride can result in mottling of the teeth and dental fluorosis in children. Increased density and calcification of bone (osteosclerosis) has been associated with chronic ingestion of high-fluoride water (Hodge and Smith, 1965). At unusually high levels, chronic fluoride ingestion can result in crippling skeletal fluorosis. Several studies have been conducted to determine the exact levels of fluoride at which these adverse effects occur, but the results often conflict due to lack of control or failure to account for various parameters in the study populations. Dental mottling and
changes in tooth structure may develop in a few children when fluoride levels in water exceed approximately 0.7 to 1.3 mg/liter, depending on ambient temperature (Richards et al., 1967) and diet. Roholm (1937) estimated that a 10- to 20-year daily ingestion of 20 to 80 mg fluoride could result in crippling skeletal fluorosis.
Calcium and aluminum salts decrease the absorption of fluoride from the intestinal tract. In sheep and rats magnesium salts are somewhat less effective (Underwood, 1977; Weddle and Muhler, 1954). In studies of humans, Spencer and coworkers (1977) demonstrated that ingestion of antacids containing aluminum hydroxide [Al(OH)3] increased fecal excretion of fluoride by as much as 12 times, resulting in decreased absorption and lowered plasma levels of fluoride. On the other hand, increasing calcium and phosphorus intake did not affect fluoride balance, although these latter minerals, as well as magnesium, did increase fecal excretion of fluoride.
Contribution of Drinking Water to Fluoride Nutrition
Kramer and colleagues (1974) measured the fluoride content of meals and water samples from 12 cities with fluoridated drinking water and four cities without fluoridation. From these data (Table V-6), the contribution of drinking water to individual diets can be estimated if 2 liters/day consumption is assumed. These data demonstrate that drinking water, whether artificially fluoridated or not, can make an important contribution to the total daily fluoride intake. In fluoridated areas, the contribution ranges from 25.9% to 53.5% of the total intake. In unfluoridated areas, it ranges from 13.5% to 48.1%.
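The percentages in Table V-6 are simply the water contribution divided by the combined dietary and water intake. A minimal sketch of that arithmetic (the function name is ours; the 2 liters/day assumption and the sample values are from the table):

```python
# Contribution of drinking water to total daily fluoride intake,
# following the 2 liters/day consumption assumption used for Table V-6.
# Values (mg) are those tabulated from Kramer et al., 1974.

def water_contribution_pct(diet_mg_per_day, water_mg_per_2_liters):
    """Percent of total daily fluoride intake supplied by drinking water."""
    total = diet_mg_per_day + water_mg_per_2_liters
    return 100.0 * water_mg_per_2_liters / total

# Martinez, Calif.: diet 1.73 mg/day, water 1.62 mg/2 liters
print(round(water_contribution_pct(1.73, 1.62), 1))  # 48.4, matching the table

# Louisville, Ky.: diet 1.98 mg/day, water 2.28 mg/2 liters
print(round(water_contribution_pct(1.98, 2.28), 1))  # 53.5, matching the table
```

Applying the same calculation to each row reproduces the tabulated percentages to within rounding.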
Recommendations and Conclusions
Human populations should be studied in detail to determine more precisely the levels of fluoride intake (total and from drinking water) that may be causally related to dental fluorosis and osteosclerosis.
Concentrations of fluoride in drinking water that are recommended for anticariogenic effects appear to be below levels that have been associated with adverse effects in the general U.S. population. Until more precise measures of the margin of safety for the use of fluoride are available, the levels of fluoride in drinking water should not exceed the optimal levels for anticariogenic benefits.
TABLE V-6 Contribution of Drinking Water to Total Daily Fluoride Intakea

                          Fluoride in     Fluoride in      Contribution of Water to
                          Diet,           Water,           Total Daily Fluoride
City                      mg/day          mg/2 liters      Intake, %
Martinez, Calif.          1.73            1.62             48.4
Chicago, Ill.             1.97            1.90             49.1
Louisville, Ky.           1.98            2.28             53.5
St. Louis, Mo.            2.10            1.82             46.4
New York, N.Y.            2.55            1.76             40.8
Durham, N.C.              2.62            1.06             28.8
Lexington, Ky.            2.84            2.30             44.8
Madison, Wis.             2.88            2.22             43.5
Tuscaloosa, Ala.          2.94            1.52             34.1
Cleveland, Ohio           3.05            2.54             45.4
Milwaukee, Wis.           3.41            1.70             33.3
Corvallis, Ore.           3.44            1.20             25.9

Birmingham, Ala.          0.78            0.16             17.0
Chicago, Ill.             0.86            0.66             43.4
Houston, Tex.             0.95            0.88             48.1
Iron Mountain, Mich.      1.03            0.16             13.5

a Derived from Kramer et al., 1974.
SODIUM

Sodium is the most abundant cation of those found in the extracellular fluid. The sodium ion is essential to the regulation of the acid-base balance and is a very important contributor to extracellular osmolarity. It functions in the electrophysiology of cells and is required for the propagation of impulses in excitable tissues. Furthermore, sodium is essential for active nutrient transport, including the active transport of glucose across the intestinal mucosa (Harper et al., 1977).
Presence in Food and Water
The total intake of sodium is influenced mainly by the extent to which salt (sodium chloride) is used as an additive to food, the inherent salt content of the foods consumed, and the intake of other sodium salts in the diet
TABLE V-7 Average Sodium Content by Commodity Groups in Adult Market Basketsa

                                   1977 (25 Baskets)                1978 (8 Baskets)
Commodity Groups               mg/day   SEb       % of Total    mg/day   SE        % of Total
Dairy products                    704   (± 15.3)     10.5          792   (± 19.8)     11.4
Meat, fish, and poultry           952   (± 41.2)     14.2          921   (± 79.2)     13.3
Grain and cereal products       2,005   (± 84.5)     29.9        2,002   (±123.1)     28.9
Potatoes                           75   (±  6.2)      1.1           82   (± 10.6)      1.2
Vegetables, leafy                  22   (±  1.8)      -             22   (±  2.3)      -
Vegetables, legume                243   (±  8.5)      3.6          258   (± 13.8)      3.7
Vegetables, roots                  18   (±  1.0)      -             17   (±  3.0)      -
Garden fruits and
  vegetable products              285   (± 12.8)      4.2          284   (± 42.2)      4.1
Fruits                             66   (±  5.8)      1.0           75   (± 14.7)      1.1
Oils, fats, and shortenings       387   (± 17.1)      5.8          406   (± 30.9)      5.9
Sugar and adjuncts              1,923   (± 80.4)     28.7        2,042   (±123.8)     29.5
Beverages, including water         17   (±  2.4)      -             27   (±  6.6)      -
TOTAL                           6,697   (±130.6)                 6,928   (±206.8)

a Data from Shank, 1978.
b SE, standard error.
and in medications. Sodium is a natural constituent of both vegetable and animal products in varying concentrations.
In addition to salt, rich dietary sources of sodium are sodium-containing condiments such as monosodium glutamate, sauces, relishes, sweet and sour pickles, gherkins, olives, tomato ketchup, and a number of other foods including ham, bacon, sausages, dried beef, cold cuts, frankfurters, anchovies, canned crab, canned tuna, other canned fish, cheese, canned vegetables, sauerkraut, potato chips, other salted snack foods such as pretzels, saltines, soda crackers, breakfast cereals, and breads such as cornbread or biscuits. The average sodium content of foods analyzed in the Food and Drug Administration's Total Diet Study (market-basket survey) for 1977 and 1978 is shown in Table V-7 for typical adult intakes.
During the preservation and processing of foods, sodium and sodium chloride (NaCl) are added, as are numerous other chemical additives including sodium saccharin (C7H4NO3SNa), monosodium glutamate [HOOCCH(NH2)CH2CH2COONa], sodium nitrite (NaNO2), sodium nitrate (NaNO3), sodium benzoate (C6H5COONa), sodium ascorbate (C6H7NaO6), sodium propionate (CH3CH2COONa), sodium caseinate, etc. Sodium chloride or monosodium glutamate may be added to foods to suit individual tastes, not only during the commercial preparation of food but also in the home, either in the kitchen or at the table. Other sources of sodium are medications, drinking water, cooking water, soft drinks, and alcoholic beverages (Newborg, 1969; Weickart, 1976). Sodium intake from carbonated beverages may be more than 200 mg/day (Table V-8).
Intake of sodium chloride by American males averages an estimated 10 g/day (range, 4-24 g) (Dahl, 1960). On the basis of these estimates, which were obtained from excretion data, sodium intake would range from 1,600 to 9,600 mg/day. The average sodium intake per capita per day, which was estimated from analysis of hospital diets, has been given as 3,625 ± 971 (SD) mg. This figure is for diets "as selected" and not "as eaten" and does not include salt that may have been added at the table (California State Department of Public Health, 1970). Sodium intake by infants depends on milk source, formula composition, and the amount of salt added as seasoning by the preparer.
The sodium content of drinking water is extremely variable. Greathouse and Craun (1979) reported that levels of sodium in household tap waters were above detection limits in 99.79% of the areas sampled. The maximum sodium concentration was approximately 80 mg/liter, the minimum was approximately 4.0 mg/liter, and the mean concentration was approximately 28 mg/liter (Craun et al., 1977).
Raw water samples were analyzed by the U.S. Environmental Protection Agency Region V Laboratory and Indiana State University (U.S. Environmental Protection Agency, 1975) during 1971-1975. The sodium content was between 1.1 and 77.0 mg/liter. Finished drinking water samples, which were also analyzed by these same groups, had sodium contents of 1.0 to 91.0 mg/liter.
In a survey of community water systems undertaken as a cooperative study by the New York State Department of Health (1977) and the U.S. Department of the Interior, Geological Survey, approximately 12% of the water samples analyzed contained sodium concentrations in excess of 20 mg/liter. The highest concentration of sodium in drinking water recorded in this survey was 220 mg/liter, found in two systems, one in Chemung County and one in Wayne County. The report of this study pointed out that sodium is added during the treatment of public water supplies as follows:
1. Ion-exchange softening: sodium, in the forms of sodium chloride, sodium carbonate (Na2CO3), and sodium hexametaphosphate [(NaPO3)x], is exchanged for magnesium and calcium in the exchange medium.
2. Chemical precipitation softening: sodium carbonate (soda ash).
3. Disinfection (chlorination): sodium hypochlorite (NaOCl).
4. Fluoridation: sodium fluoride (NaF) or sodium silicofluoride (Na2SiF6).
5. Corrosion control: sodium carbonate or sodium hexametaphosphate.
6. Coagulation: sodium aluminate (NaAlO2) or sodium silicate (NaSiO3).
7. Dechlorination: sodium bisulfite (NaHSO3).
Many residences with public water or individual well water supplies also have water-softening units that can add sodium to the water. Most home water-softening units use the ion-exchange process, which increases the sodium concentration in the finished water because it adds two sodium ions for every calcium ion removed.
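The sodium added by softening can be approximated from this exchange stoichiometry. The sketch below is not from the report; it assumes hardness is expressed in the conventional units of mg/liter as CaCO3 and uses standard atomic weights:

```python
# Rough sketch (not from the report) of sodium added by a home ion-exchange
# softener: two Na+ ions replace each Ca2+ ion removed, and hardness is
# conventionally expressed as mg/liter of CaCO3.

NA_ATOMIC_WT = 22.99     # sodium, g/mol
CACO3_FORMULA_WT = 100.09  # calcium carbonate, g/mol

def sodium_added_mg_per_liter(hardness_removed_mg_caco3):
    """Sodium (mg/liter) added when the given hardness (as CaCO3) is removed."""
    return hardness_removed_mg_caco3 * 2 * NA_ATOMIC_WT / CACO3_FORMULA_WT

# Softening moderately hard water by 150 mg/liter (as CaCO3) adds roughly:
print(round(sodium_added_mg_per_liter(150), 1))  # 68.9 mg/liter of sodium
```

A fully softened hard-water supply can therefore add more sodium per liter than the 28 mg/liter mean tap-water concentration cited above, which is why the drinking line is best left unsoftened.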
To summarize the distribution of sodium in New York State water supplies, 265 community water systems had a sodium content between 5.01 and 20.00 mg/liter, 57 such systems contained between 20.01 and 50.00 mg/liter, and 32 systems, which served a total of 22,080 people, contained more than 50.01 mg/liter (New York State Department of Health, 1977).
TABLE V-8 Sodium Content of Soft Drinksa,b

                             Na+/     Na+/240-ml   Na+/360-ml   Per Capita from
Proprietary Name and Type    ml       Serving      Can          Soft Drinksc
7-Up, regular                0.09     22           32           31
7-Up, sugar-free             0.14     32           48           47
Coca-Cola                    0.08     20           30           29
Sprite, regular              0.18     42           62           61
Sprite, without sugar        0.18     42           62           61
Mr. PiBB, regular            0.10     23           35           34
Mr. PiBB, without sugar      0.16     37           55           54
Fanta, orange                0.09     21           31           30
Fanta, grape                 0.09     21           31           30
Fanta, root beer             0.10     23           35           34
Fanta, ginger ale            0.13     30           46           45
Fresca                       0.24     57           85           83
Tab                          0.13     30           46           45
Tab, black cherry            0.20     48           72           71
Tab, root beer               0.18     42           62           61
Tab, ginger ale              0.20     47           71           70
Tab, orange                  0.18     43           65           64
Tab, grape                   0.19     44           65           65
Tab, lemon-lime              0.18     42           62           61
Tab, strawberry              0.17     39           59           58

a Values for sodium content of soft drinks vary with the sodium content of the water that is used during manufacture.
b Information derived from the Consumer Information Center, Coca-Cola Company, Atlanta, Georgia, and the Seven-Up Company.
c Based on a 1977 estimate of per capita annual consumption of soft drinks at 359 12-oz units (98% of a 12-oz can/day). Personal communication from the National Soft Drink Association, Washington, D.C. Sales Survey of the Soft Drink Industry.
The estimated adequate and safe intakes for sodium range from 1,100 to 3,300 mg/day for normal adults, or 1 g of sodium per kilogram of fluid and food intake (Meneely and Battarbee, 1976; National Academy of Sciences, 1980). In infants, estimated adequate and safe intakes are approximately 115 to 750 mg/day (National Academy of Sciences, 1980).
Toxicity Versus Essential Levels
Salt poisoning or acute sodium toxicity is produced by massive overload, particularly in the very young. In Broome County, New York, 6 out of 14 infants exposed to a sodium concentration of 21,140 mg/liter died when salt was mistakenly used in place of sugar in their formulas (Finberg et al., 1963). Acute toxicity from sodium chloride in healthy adult males, accompanied by visible edema, may occur with an intake of 35 to 40 g of salt per day (Meneely and Battarbee, 1976).
Evidence suggests that chronic excessive intake of sodium may be associated with hypertension, which is defined on the basis of blood pressure readings, i.e., mild hypertension implies diastolic blood pressure above 90 mm Hg and systolic blood pressure above 140 mm Hg (Reader, 1978). In populations residing in different geographic areas, a positive correlation has been found between salt intake and the incidence of hypertension (Dahl, 1972; Sasaki, 1962).
Sodium toxicity leading to hypertension has been associated with intakes of salt (NaCl) greater than 30 g/day, but may occur at lower intakes in persons predisposed to hypertension or suffering from hypertension, congestive heart failure, cirrhosis, or renal disease. Toxicity of sodium salts may be influenced by the anion with which sodium is paired (Venugopal and Luckey, 1978). Until recently, the failure to demonstrate a causal relationship between salt intake and the development of hypertension discouraged scientists from recommending limitation of salt intake, according to the Advisory Panel of the British Committee on Medical Aspects of Food Policy (Nutrition) on Diet in Relation to Cardiovascular and Cerebral Vascular Disease.
The level of sodium in drinking water may influence blood pressure. The blood pressure distribution patterns for systolic and diastolic pressures of high school students living in a community with elevated levels of sodium in the drinking water (100 mg/liter) showed a significant upward shift as compared with the patterns for matched students in a
neighboring community with low sodium levels in the drinking water (8 mg/liter) (Calabrese and Tuthill, 1977).
Twenty percent of the adult U.S. population has hypertension (Intersociety Commission for Heart Disease Resources, 1971). In the treatment of essential hypertension, restriction of dietary sodium leads to a fall in blood pressure (Allen and Sherrill, 1922). Dahl et al. (1958) reported that weight reduction in obese individuals leads to a decrease in blood pressure only when sodium intake is restricted. On the other hand, sodium restriction in the obese significantly reduces blood pressure even when calories are not restricted.
In hypertensive subjects, a lowering of blood pressure may be effected either by reducing total sodium intake or by the use of thiazide diuretics, which promote sodium excretion. In 1960, the American Heart Association (AHA) reported that diuretics could reduce the need for a very restricted sodium diet and that they produced quick results, a desirable factor when there is an acute need for lowering blood pressure. The AHA advocated a sodium-restricted diet for the long-term management of hypertension (Pollack, 1960).
A maximum level of sodium in drinking water of 20 mg/liter has been suggested by the American Heart Association (1957).
Currently, antihypertensive medication, including diuretics, is considered a requirement in the management of established hypertension. Sodium restriction alone can control borderline hypertension, thereby reducing the need for diuretics (American Medical Association, 1973; Mayer, 1971).
According to Freis (1976), the consumption of less than 5 g of salt a day might reduce the incidence of hypertension and its related diseases by 80%, but the evidence available to support such specific figures is limited. Blackburn (1978) suggested a public health intervention trial "to test . . . reduced sodium consumption and culture-wide changes in salt-eating habits as a long-term public health approach to primary prevention of hypertension."
Sodium-restricted diets are required in the treatment of congestive cardiac failure, renal disease, cirrhosis of the liver, toxemia of pregnancy, and Meniere's disease. Sodium-restricted diets may also be required for patients on prolonged corticosteroid therapy. Moderate sodium restriction has been advocated for the management of premenstrual fluid retention (Wintrobe et al., 1970).
Clinical experience has shown that patients often do not adhere to prescribed sodium-restricted diets. However, it is difficult to check compliance from dietary records since patients either forget that certain foods contain appreciable amounts of sodium or are unaware of sodium
sources in foods, drugs, or water. Pietinen et al. (1976) proposed the use
of mean urinary sodium values (mean of three 9-hr overnight urinary sodium estimations) to check compliance with dietary suggestions.
In Australia, Morgan et al. (1978) treated patients with mild hypertension by moderately restricting their salt intake, which was monitored by checking urinary sodium values. Results were compared with those of a control group and a group maintained on antihypertensive medication. Salt restriction reduced the diastolic blood pressure by
7.3 ± 1.6 mm Hg, a result similar to that of patients treated with the antihypertensive drugs. In the untreated group, the diastolic blood pressure rose by 1.8 ± 1.1 mm Hg. The authors pointed out that most patients did not achieve the amount of salt restriction their physicians desired and inferred that stricter adherence to the diet could have caused further reductions in blood pressure. To effect restriction of sodium intake, they recommended the avoidance of both salted foods and the addition of salt to food at the table. However, they acknowledged that it would be difficult to reduce the intake of sodium below approximately 2,300 mg/day in Australia because of the widespread use of sodium salts in prepared foods (Morgan et al., 1978). This is also probably true in the United States. (See Table V-7.)
Elliott and Alexander (1961) reported adverse health effects in persons on sodium-restricted diets when they consumed water with a high sodium content. The authors observed recurrent episodes of heart failure which ceased when water with a low sodium content was substituted.
Sodium deficiency may result from renal disease, diuretic therapy, osmotic diuresis, adrenal insufficiency, vomiting, diarrhea, wound drainage, excessive sweating, burns, mucoviscidosis (cystic fibrosis), peritoneal drainage, and pleural, pancreatic, and biliary fistulae drainage. Salt restriction leads to sodium deficiency under conditions of renal impairment.
Sodium depletion can be either iatrogenic or noniatrogenic. Iatrogenic causes include administration of excessive amounts of free water, thiazides, other diuretics, such as furosemide and ethacrynic acid, barbiturates, and oral hypoglycemic drugs (de Bodo and Prescott, 1945; Fichman et al., 1971; Fuisz, 1963; Stormont and Waterhouse, 1961).
Noniatrogenic causes of hyponatremia are commonly related to the inability to excrete free water which is either administered or generated from endogenous sources. Hyponatremia can also be induced by poor
secretion of antidiuretic hormones, alcoholic cirrhosis, adrenocortical insufficiency, congestive heart failure, and cachexia (Bartter and Schwartz, 1967; Birkenfeld et al., 1958; Kleeman, 1971; Matter et al., 1964).
Drinkers of large quantities of beer who eat very little food can experience fatigue, dizziness, and muscle weakness. These patients are both hyponatremic and hypokalemic. The syndrome is rapidly resolved by abstaining from beer consumption and eating a normal diet. Hilden and Svendsen (1975) stated that beer is often low in sodium (20-50 mg/liter), that the patients did not obtain adequate sodium or potassium from their diets, and that water diuresis is inhibited and hyponatremia develops when animals or humans are kept on a sodium-deficient regimen.
Arieff et al. (1976) discussed the neurological symptoms of hyponatremia. In addition to nonneurological symptoms (anorexia, nausea, vomiting, and muscle weakness), 14 patients with acute hyponatremia had some depression of the sensorium and four of them had grand mal seizures. Seven of these patients were treated with hypertonic saline while four were treated only with fluid restriction. Of the seven patients that were treated with hypertonic saline, five survived. Three of the four patients treated with fluid restriction died. The authors emphasized that edema of the brain that may occur with hyponatremia may go undiagnosed.
Sweat is a major route for sodium losses. Consolazio et al. (1963) studied three normal young men who were made to sweat by daily exercise on a stationary bicycle at temperatures of 24°C or 37.8°C. Sweat was collected in arm bags made of polyethylene. The percentage of total sodium excretion that was excreted in sweat was 62.8% in 7.5 hr and 88.7% in 16.5 hr. The sodium content of sweat varies from approximately
500 to 1,200 mg/liter. Lower values result if subjects have been acclimatized to heat. Lee (1964) reported that excessive sweating, which can be brought on by exercise, heat, or fever, can result in sodium losses from the skin as high as 7 g/day. He suggested that sodium chloride be administered to prevent hyponatremia whenever more than a 4-liter intake of water is required to replace sweat losses. Furthermore, he recommended that 2 g of sodium chloride be given for each additional liter of water lost. This would amount to approximately 7 g/day for people doing heavy work in extreme heat (Lee, 1964). People who are exposed to high temperatures occupationally or during leisure and those performing heavy physical work may find it convenient to take salt pills or to add sodium chloride to their drinking water.
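Lee's replacement rule translates into a simple calculation. A minimal sketch (the function name is ours; the 4-liter threshold and the 2 g/liter increment are Lee's):

```python
# Sketch of Lee's (1964) rule of thumb: when sweat losses require more than
# 4 liters of replacement water per day, add 2 g of NaCl per additional liter.

def nacl_supplement_g(water_intake_liters):
    """Grams of NaCl suggested per day for a given water-replacement volume."""
    extra_liters = max(0.0, water_intake_liters - 4.0)
    return 2.0 * extra_liters

# Heavy work in extreme heat, about 7.5 liters/day of replacement water:
print(nacl_supplement_g(7.5))  # 7.0 g/day, the figure Lee cites
```

Below the 4-liter threshold the rule prescribes no supplement, consistent with Lee's view that added salt is needed only when sweat losses are large.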
Potassium chloride (KCl) counteracts the hypertensive effects of chronic excess of sodium chloride. Lithium from lithium carbonate (Li2CO3) accumulates in the body of sodium-depleted persons (Meneely and Battarbee, 1976).
Contribution of Water to Sodium Nutrition
Given an intake of 2 liters of drinking water per day with a mean sodium concentration of 28 mg/liter, a contribution of sodium in water to the estimated adequate and safe intake would be approximately 1.7% to 5.0%. The contribution of sodium at 28 mg/liter in water to the observed current dietary sodium intake is between 0.6% and 3.4%. (These figures do not take into account sodium sources from medication, and they refer to adults only.)
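These percentages follow from straightforward arithmetic on the figures given above; a short check (all values are from the text):

```python
# Reproducing the subcommittee's arithmetic: 2 liters/day of drinking water
# at the mean tap-water sodium concentration of 28 mg/liter, compared with
# the intake ranges quoted in the text.

WATER_NA_MG = 2 * 28  # 56 mg/day of sodium from drinking water

# Estimated adequate and safe adult intake: 1,100 to 3,300 mg/day
print(round(100 * WATER_NA_MG / 3300, 1))  # 1.7 %
print(round(100 * WATER_NA_MG / 1100, 1))  # 5.1 % (the text rounds to 5.0)

# Observed dietary intake (Dahl, 1960): 1,600 to 9,600 mg/day
print(round(100 * WATER_NA_MG / 9600, 1))  # 0.6 %
print(round(100 * WATER_NA_MG / 1600, 1))  # 3.5 % (the text gives 3.4)
```

The small discrepancies in the last decimal place presumably reflect rounding in the source figures.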
Table V-10 (see page 308) summarizes information on sodium in water and the diet.
Data suggest that whereas health benefits could accrue to certain segments of the population from reduction in sodium intakes, the amount of sodium contributed to the intake from drinking water is small except for persons on sodium-restricted diets (<2,000 mg/day). According to the National Center for Health Statistics, approximately 2.8% of Americans are on low sodium diets (National Academy of Sciences, 1977). The size of the population that is predisposed toward hypertension when exposed to elevated sodium intake is not known with any certainty.
Options available to reduce sodium intake are (in order of decreasing potential) reducing salt added to food as seasoning when eating or cooking, consuming foods with lower sodium levels, reducing sodium in drugs and additives, and reducing sodium levels in water.
Research is needed to define more exactly the contributions of sodium intake, sodium-potassium intake ratios, and other physiological factors to the development of hypertension.
The relationship of the level of sodium and the sodium: potassium ratio in drinking water to blood pressure should be investigated further.
The level of sodium in drinking water should be monitored and physicians informed of the level via local public health departments.
The sodium content of drinking water should not be increased purposefully. Safeguards should be taken against accidental increases, e.g., salting of roads in winter resulting in "deicing" runoff. In instances where water is to be softened (by ion exchange) domestically, a three-line system is recommended so that only the water used for bathing and laundry would be softened, not the water for drinking.
Total review of the sources of sodium intake is urgently required along with publication of the sodium content of drinking water, beverages, foods, and drugs.
POTASSIUM

Potassium has four major biological functions. It contributes to the maintenance of electrolyte balance, the transmission of nerve impulses to muscle fibers and the control of normal muscle contractility, and the control of heart rhythm, and it acts as an insulin antagonist in intermediary carbohydrate metabolism.
Presence in Food and Water
According to Greathouse and Craun (1979), the mean concentration of potassium in household tap water is 2.15 mg/liter (minimum, 0.721 mg/liter; maximum, 8.278 mg/liter). Concentrations of potassium in drinking water in Region V (defined by the U.S. Environmental Protection Agency) were 0.5 to 7.4 mg/liter in raw water and 0.5 to 7.7 mg/liter in finished water (U.S. Environmental Protection Agency, 1975).
Potassium is widely distributed in foods, both as a natural constituent and as an ingredient in food additives. In foods of plant origin, the commonest naturally occurring anions of potassium salts are nitrate (KNO3), sulfate (K2SO4), phosphates (K2HPO4, KH2PO4, or K3PO4), and chloride (KCl). Amounts of these potassium salts vary with the plant as well as with methods of cultivation and fertilization (Grunes, 1978, personal communication).
Potassium-containing food additives include potassium alginate (a stabilizer, thickener, and emulsifier), potassium chloride [a gelling agent and a substitute for sodium chloride (NaCl) (Sopko and Freeman, 1977)], potassium iodate (KIO3) and potassium bromate (KBrO3) (dough conditioners that are added to bread mixes), potassium nitrate used as a
TABLE V-9 FDA Total Diet Study ("Market-Basket") Estimates of Potassium Intakea

Age Classification        1977      1978
6 Months (Infants)        1,551     1,590
2 Years (Toddlers)        1,714     1,846
15-29 Years (Adults)      4,549     4,735

a Data from Shank, 1978. Values are based on an intake of 3,900
food preservative, monobasic and dibasic potassium phosphates (buffering agents and sequestrants), tribasic potassium phosphate (an emulsifier), potassium polymetaphosphate [(KPO3)x] (a fat emulsifier and a moisture-retaining agent), potassium pyrophosphate (K4P2O7) (an emulsifier and a texturizer), potassium sorbate (CH3CH=CHCH=CHCOOK) (a preservative), potassium sulfate (a water corrective agent), and potassium bitartrate or cream of tartar (KHC4H4O6) (an acidifying agent) (National Academy of Sciences, 1972).
Rich food sources of potassium are bran, dried brewer's yeast, cocoa, instant coffee, dried legumes, teas, spices, molasses, almonds, peanuts, raisins, peanut butter, avocados, pears, stewed prunes, parsley, bananas, potatoes, butter beans, dried, whole or nonfat milk, chocolate milk, oranges, orange juice, squash, and melon. Potassium is highly available in food.
The Food and Drug Administration's Total Diet Study ("market-basket" survey) estimates of potassium intakes in the United States for three age groups for 1977 and 1978 are shown in Table V-9.
Current dietary potassium intakes of adults are believed to range from 1,500 to 6,000 mg/day.
Dietary items with a very high potassium content may be consumed infrequently by young children and the elderly (Wilson et al., 1966). Younger men and women obtain enough potassium from their diets to satisfy nutritional requirements (Wilde, 1962).
The estimated adequate and safe intake of potassium for adults is between 1,875 and 5,600 mg/day. Intakes for infants and children are
given in Table V-31 at the end of this chapter (National Academy of Sciences, 1980).
Toxicity Versus Essential Levels
Older persons may have low potassium intakes. In a study of 46 men and
88 women aged 65 and over, who lived in their own homes in northern
Glasgow, Judge et al. (1974) observed that the mean dietary potassium
intake for men was 2,769 mg/day, and for women, 2,106 mg/day.
A gross reduction in dietary potassium intake can produce potassium depletion and a drop in serum potassium levels (Squires and Huth, 1959; Womersley and Darragh, 1955). Mohamed (1976) also observed low potassium intakes among elderly pensioners in southern Sweden. He analyzed actual food portions and beverages.
In an unpublished study of the diets of elderly housebound women and men in New York State during 1978, Roe (personal communication) discovered that the mean potassium intake was 2,071 mg/day (median, 1,893 mg/day; minimum, 703 mg/day; and maximum, 4,178 mg/day). Judge and Cowen (1971) showed that elderly people whose dietary intakes of potassium are less than 2,340 mg/day may have reduced handgrip strength. Overt potassium deficiency in adults is associated with intakes of 2,000 mg or less per day.
Major causes of potassium deficiency include prolonged vomiting or diarrhea, starvation, diabetic acidosis, surgery, use of diuretic drugs, use and abuse of cathartics, and intakes of corticosteroid hormones (Dargie et al., 1974; Food and Drug Administration, 1975; Krause and Hunscher, 1972; Robinson, 1967).
Nardone et al. (1978) estimated that approximately 98% of total body potassium is contained in the intracellular compartment of the body. Less than 2% is located in the serum, where it can be extracted for measurement. Low serum potassium levels usually reflect a total body deficit. However, alkalosis, insulin therapy, and hypoosmolality may decrease serum levels of potassium (without a concomitant decrease in cellular potassium) so that they do not reflect actual body stores (Nardone et al., 1978).
Potassium is excreted through the urine, gut, and skin. According to Nardone et al. (1978), the losses through the gastrointestinal tract and the skin are relatively minor under physiological conditions (Berliner, 1960; Suki, 1976).
Nardone et al. (1978) classified the origins of hypokalemia as follows:
Renal loss of potassium due to disease
Loss of potassium induced by drugs [e.g., diuretics, including organomercurials, thiazides, furosemide (Dargie et al., 1974), and ethacrynic acid; antibiotics, including carbenicillin and penicillin; laxatives; corticosteroids; and nephrotoxic drugs, e.g., outdated tetracycline] and by licorice and extracts of licorice
Maldistribution of potassium caused by periodic paralysis (familial, acquired), drugs (e.g., insulin), and toxins (e.g., barium)
Drug-induced hypokalemia is extremely common. Hypokalemia resulting from low intake is less common. Several disorders, e.g., cancer, gross neurological disease, psychiatric illness, and chronic gastrointestinal disease, reduce total food intake and, thus, potassium intake. They can also lead to hypokalemia because of catabolism.
Hypokalemia can result from self-administration of excessive quantities of diuretics, laxatives, or licorice, or from self-induced vomiting. Patients who induce hypokalemia by these means usually have an underlying psychiatric illness. In the young, this illness may be anorexia nervosa (Fleming et al., 1975; Katz et al., 1972; Wallace et al., 1968; Wrong and Richards, 1968).
Symptoms of potassium deficiency are weakness, anorexia, nausea, vomiting, listlessness, apprehension, and sometimes diffuse pain, drowsiness, stupor, and irrationality. Hypokalemia can exist without any abnormal clinical findings. When symptoms are present, the most common is profound muscle weakness. Changes in electrocardiograms are also found (Zintel, 1968).
In hypertensive patients who are maintained on diuretics, potassium chloride can be used as a salt substitute to reduce sodium intake while providing a source of potassium.
Potassium depletion sensitizes patients to intoxication by cardiac glycosides such as digitalis. Potassium deficiency causes both structural damage and functional impairment of the kidney.
Keith et al. (1942) investigated the effects of single, large doses of potassium salts in seven normal persons. These subjects ingested from 12.5 to 17.5 g of potassium chloride or bicarbonate (KHCO3). Their renal clearances were then determined. In two subjects, the potassium load disturbed normal renal excretion. The authors estimated that single doses of potassium salts containing 80 to 100 mg of potassium per kilogram of body weight may be nephrotoxic. Extracellular potassium, if rapidly raised by intravenous injection from 125 to 2,500 mg/liter, is toxic and may be lethal (Comar and Bronner, 1962).
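As a rough consistency check (not part of the original report), the Keith et al. doses of potassium chloride can be converted to elemental potassium per kilogram of body weight; the sketch below assumes a 70-kg subject and standard formula weights.

```python
# Sketch (assumed values): converting the Keith et al. (1942) KCl doses
# into elemental potassium per kilogram of body weight for a 70-kg adult.

K_ATOMIC_WEIGHT = 39.1      # g/mol
KCL_FORMULA_WEIGHT = 74.55  # g/mol
BODY_WEIGHT = 70            # kg, assumed adult subject

def k_from_kcl(grams_kcl):
    """Elemental potassium (g) supplied by a given mass of KCl."""
    return grams_kcl * K_ATOMIC_WEIGHT / KCL_FORMULA_WEIGHT

low_k = k_from_kcl(12.5)    # ~6.6 g potassium
high_k = k_from_kcl(17.5)   # ~9.2 g potassium

low_mg_per_kg = low_k * 1000 / BODY_WEIGHT    # ~94 mg/kg
high_mg_per_kg = high_k * 1000 / BODY_WEIGHT  # ~131 mg/kg

print(f"{low_k:.1f}-{high_k:.1f} g of potassium, "
      f"or {low_mg_per_kg:.0f}-{high_mg_per_kg:.0f} mg/kg body weight")
```

Even the lowest dose used in that study falls near the 80 to 100 mg/kg range the authors estimated may be nephrotoxic.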
It is not possible to produce hyperkalemia or potassium toxicity by dietary means in people with normal circulatory and renal function (Burton, 1965). Hyperkalemia is caused mainly by diseases such as Addison's disease or by renal failure with gross oliguria. Potassium toxicity can be caused by ingestion of enteric-coated potassium chloride tablets. Symptoms of this toxicity include gastric irritation, ulceration of the small intestine, perforation, and late strictures (Mudge and Welt, 1975).
Malabsorption of vitamin B12 has been identified in patients receiving slow-release potassium chloride supplements. Schilling test values of vitamin B12 absorption were normalized when potassium chloride was withdrawn from the regimens of these patients (Palva et al., 1972).
Symptoms of acute poisoning caused by eight 4-g potassium chloride tablets were cyanosis, shallow respiration (Keith et al., 1942), and life-threatening cardiac arrhythmia (Maxwell and Kleeman, 1972). According to Blum (1920), 25 g of potassium chloride per day can induce acute toxicity. Smaller doses can cause diarrhea. Intakes of potassium associated with acute toxicity are 7 to 10 g/day in adults and 2 g/day in young children.
The metabolism of proteins, amino acids, and glucose is affected by potassium status (Lehninger, 1970).
Focusing attention on the high-sodium, low-potassium environment in our society, Meneely and Ball (1958) and Meneely and Battarbee (1976) presented evidence that reduced sodium intake with a concurrent increase in potassium intake would benefit health, particularly for hypertensive or borderline hypertensive subjects. Potassium has a protective action against the hypertensive effect of high sodium intakes in humans and in laboratory animals, but the mechanism is unknown (National Academy of Sciences, 1970).
Contribution of Drinking Water to Potassium Nutrition
Because the levels of potassium in water are low in relation to those in foods, the contribution of water to the requirement or intake of potassium is negligible.
Table V-10 (see page 308) summarizes information on potassium in drinking water and the diet.
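The size of this contribution can be sketched numerically; the calculation below uses the mean water concentration and the illustrative 2,500-mg/day requirement given in Table V-10, and assumes consumption of 2 liters/day.

```python
# Sketch (values from Table V-10 of this report): potassium supplied by
# drinking water versus the illustrative daily requirement.

MEAN_K_WATER = 2.15   # mg/liter, mean concentration in water (Table V-10)
DAILY_WATER = 2       # liters/day, assumed consumption
K_REQUIREMENT = 2500  # mg/day, illustrative requirement (Table V-10)

intake_from_water = MEAN_K_WATER * DAILY_WATER           # 4.3 mg/day
pct_of_requirement = 100 * intake_from_water / K_REQUIREMENT

print(f"{intake_from_water:.1f} mg/day from water, "
      f"{pct_of_requirement:.2f}% of the requirement")
```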
Potassium is abundant in the food supply, whereas water contributes little to total potassium intake.
Potassium deficiency is common in certain subgroups of the population, notably the elderly whose deficiency is attributed to low intake of potassium-rich foods as well as to the use of laxatives and diuretics. This deficiency is a cause of geriatric disability including severe muscle weakness. Frequently, it also produces digitalis toxicity, which is life-threatening.
Potassium excess leading to toxicity is not common and is not incurred through the diet. Acute potassium toxicity can be induced by oral potassium preparations including enteric-coated pills.
Studies should be directed toward defining more closely the RDA for potassium in different age groups, particularly the elderly.
CHLORIDE

Chloride is the most important anion in the maintenance of fluid and electrolyte balance and is necessary for the formation of the hydrochloric acid (HCl) in the gastric juices.
Presence in Food and Water
Rich sources of chloride are salt, breakfast cereals, breads, dried skim milk, teas, eggs, margarine, salted butter, bacon, ham, salted beef (corned beef), canned meats, canned fish, canned vegetables, salted snack foods, and olives. In the diet, chloride occurs mainly as sodium chloride (NaCl) (Harper et al., 1977).
Chloride is found in practically all natural waters. Surface waters contain only a few milligrams per liter, whereas streams in arid or semiarid regions contain several hundred milligrams per liter, especially in drained areas where chlorides occur in natural deposits or are concentrated in soils through evaporation processes. Contamination with sewage increases the chloride content of river waters. Industrial wastes and drainage from oil wells or other deep wells and from salt springs
may add large quantities of chloride to streams. Most public water supplies contain less than 25 mg/liter. Groundwater usually contains larger quantities than surface water. Some public supply wells may contain as much as 100 mg/liter.
In the 1975 report on the Region V survey of the contents of selected drinking water supplies (U.S. Environmental Protection Agency, 1975b), the mean concentration of chloride in raw water was 18 mg/liter (SD ± 17) and the mean in finished water was 21 mg/liter (SD ± 21). Concentrations as high as 179 mg/liter were recorded, although 95% of the samples analyzed fell below 40 mg/liter. In a chemical analysis of interstate carrier water supply systems (U.S. Environmental Protection Agency, 1975a), 11 of 684 samples (1.6%) failed to meet the recommended drinking water limit for chloride, which was set in 1962 by the U.S. Public Health Service (1962) at 250 mg/liter.
The U.S. Environmental Protection Agency (1977) has similarly set the secondary maximum contaminant level for chloride content in drinking water at 250 mg/liter, based on findings that are described below.
The presence of particular concentrations of chloride ion (Cl-) in drinking water can produce a taste that is sometimes objectionable to the consumer. Water may be rejected on the basis of its chloride content. Whipple (1907) reported that subjects showed a differential ability to detect the chloride content of water, which varied with the type of chloride salt added. The chloride content of the water sampled by his subjects ranged from 96 to 560 mg/liter.
Lockhart et al. (1955) reported that taste thresholds for the chloride anion in water varied from 210 to 310 mg/liter, according to the type of chloride salt added. They noted that a high chloride content of water may cause an unpleasant taste in coffee. Richter and Maclean (1939) found that the chloride taste threshold was lower than that found by other authors.
Current dietary intakes of chloride vary largely with intake of salt. Estimates range from 2,400 to 14,400 mg/day from sodium chloride (Dahl, 1960).
Distribution in Tissues
According to Forbes (1962), the chloride concentration in humans is approximately 2,000 mg/kg of fat-free body mass in the newborn and 1,920 mg/kg in the adult. Ziegler and Fomon (1974) believe it is reasonable to assume that the concentration of chloride in fat-free body weight gain after birth is approximately 1,920 mg/kg.
The essentiality of chloride is generally recognized but no recommended dietary allowances (RDA's) have been determined.
Ziegler and Fomon (1974) suggested that the chloride requirement for growth (alone) is 28 mg/day from birth to 4 months of age, 21 mg/day from 4 months to 12 months, 12 mg/day from 12 months to 24 months, and 12 mg/day from 24 months to 36 months. Advisable total intakes of chloride for infants in these four age groups are 245 mg/day, 210 mg/day, 245 mg/day, and 350 mg/day, respectively. Daily chloride turnover in adults (intake/output) ranges from 3,018 to 8,875 mg. Cotlove and Hogben (1962) found that the loss of chloride generally parallels that of sodium.
Toxicity Versus Essential Levels
IMBALANCE AND DEPLETION
Electrolyte imbalances may disturb the absolute or relative amounts of chloride in the serum. Abnormalities of sodium metabolism are usually accompanied by abnormalities of chloride metabolism. When there are high losses of sodium, as in diarrhea, profuse sweating, or certain endocrine abnormalities, chloride deficit is also observed. However, when there is loss of gastric juice through vomiting, losses of chloride exceed sodium losses. This leads to a decrease in plasma chloride and a compensatory increase in bicarbonate (HCO3-). This results in hypochloremic alkalosis (Lennon, 1972). In Cushing's disease, or after the administration of an excess of corticotropin (ACTH), cortisone, or other corticosteroids, hypokalemia with an accompanying hypochloremic alkalosis may occur. Hypochloremia may also result when chloride is lost through profuse diarrhea, which impairs the reabsorption of chloride in the intestinal secretion (Harper et al., 1977).
The toxicity of salts containing the chloride ion depends mainly on the characteristics of the cation. The administration of hydrochloric acid (HCl), ammonium chloride (NH4Cl), lysine hydrochloride ([NH3(CH2)4CH(NH2)COOH]+[Cl]-), or arginine hydrochloride ([HN=C(NH2)NH(CH2)3CH(NH2)COOH·H]+[Cl]-) adds to the quantity of readily dissociated acid (HCl), which is buffered by the bicarbonate ion (HCO3-), leading to an increase in the plasma concentration of chloride and a decrease in plasma bicarbonate. This results in hyperchloremic metabolic acidosis.
EFFECTS OF CHLORIDE LOADING
Adaptation to a sodium chloride load may occur in human subjects. In experiments on isolated frog skin, Watlington et al. (1977) showed that extracts of the urine of humans on a high sodium chloride intake produced a net, actively mediated chloride ion efflux. On the other hand, such activity was not induced by the urine of either normal humans who had been deprived of sodium or humans with adrenal insufficiency who had been salt-loaded. These findings suggest the presence of an adrenal corticosteroid that may participate in adaptation to high salt intake.
Normally, the only halogen in the extracellular fluid is chloride. Chloride may be partially replaced by bromide when bromide is taken as a medication over a prolonged period. Each mEq of bromide retained displaces 1 mEq of chloride, but total halide, i.e., total chloride and bromide concentration, remains unchanged. Many of the chemical and biological properties of chloride and bromide are similar, but renal tubular transport differs. The renal clearance of bromide is slightly less than that of chloride, indicating that the tubular epithelium retains bromide preferentially. A progressive rise in bromide and falling chloride concentrations result from long-term ingestion of rather small doses of bromide. Chronic bromism is treated by increasing urinary bromide excretion. Any treatment that increases chloride losses will also result in increased bromide losses. Therefore, administration of a chloride source such as sodium chloride, and of such diuretics as furosemide (C12H11ClN2O5S) and ethacrynic acid (C13H12Cl2O4), has been used to treat bromism (Emmett and Narins, 1977).
Contribution of Water to Chloride Nutrition
Since no recommended dietary allowance (RDA) exists for chloride, it is not possible to assess the contribution of drinking water to the nutritional requirement for chloride.
A typical chloride concentration in drinking water of 21 mg/liter would contribute 42 mg/day (assuming 2 liters/day consumption). This would be just under 2% of the lower estimates of total chloride intake.
The highest chloride concentration observed (179 mg/liter) would contribute 15% to the lowest total intake.
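These percentages can be reproduced with a short calculation; the sketch below assumes 2 liters/day of water consumption and Dahl's lower estimate (2,400 mg/day) of total chloride intake.

```python
# Sketch of the chloride arithmetic above, assuming 2 liters/day consumption
# and the lower bound (2,400 mg/day) of Dahl's (1960) dietary estimates.

DAILY_WATER = 2          # liters/day, assumed consumption
TYPICAL_CL = 21          # mg/liter, mean finished-water value (EPA, 1975b)
HIGHEST_CL = 179         # mg/liter, highest concentration recorded
LOW_TOTAL_INTAKE = 2400  # mg/day, lower bound of dietary estimates

def pct_of_intake(conc_mg_per_liter):
    """Water-borne chloride as a percentage of the lowest total intake."""
    return 100 * conc_mg_per_liter * DAILY_WATER / LOW_TOTAL_INTAKE

print(f"typical water: {pct_of_intake(TYPICAL_CL):.1f}% of total intake")
print(f"highest water: {pct_of_intake(HIGHEST_CL):.1f}% of total intake")
```

The typical concentration works out to just under 2% and the highest to about 15%, matching the figures quoted above.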
The chloride content of waters varies with the geochemistry of the area and contamination from sewage, industrial, or other wastes.
Concentrations above 250 mg/liter chloride cause a salty taste in water which is objectionable to many people.
Consumption of chloride in reasonable concentrations is not harmful to most people. However, if the chloride is present as sodium chloride, the sodium ion may be undesirable to persons requiring salt restriction.
Typical chloride concentrations in drinking water contribute relatively little to total chloride intakes.
Representative chloride intakes from water and food should be determined by region, by locality, and by sex/age groups.
Implications of high chloride ingestion require further investigation. Means of minimizing the entry of excess chloride into drinking water
supplies should be studied.
IODINE

In this section the term iodine, when used in a general sense, denotes all iodine-containing compounds, e.g., iodate (IO3-), iodide (I-), etc.
Iodine is an essential micronutrient. It is an integral constituent of the thyroid hormones, thyroxine and triiodothyronine, which have important endocrine functions in metabolic regulation.
Presence in Food and Water
Sources of iodine include foods, water, internal and topical medications, and air (Underwood, 1971). In the United States, the major contributions to iodine intake come from iodized salt, bread, milk, marine fish, and seafood. Eggs, other animal protein foods, the food coloring erythrosine, water, human milk, kelp, vitamin-mineral supplements, and formula foods also contain iodine. These dietary sources of iodine are highly available, as indicated by relationships between intake and urinary excretion of iodine (Kidd et al., 1974).
The origins of iodine in foods are soils, water, commercial fertilizers, atmospheric iodine, iodine-containing antiseptics, food additives, and food or water pollutants. Major sources of iodine in milk are the iodophors, iodine-containing antiseptics that are used to cleanse cows' udders and to "sterilize" milking equipment or food preparation areas. The Wisconsin Alumni Research Foundation (WARF Institute, 1977) reported high iodine concentrations in milk shakes that were prepared in fast-food restaurants.
Iodine in seafoods is derived from ocean waters, which contain approximately 0.06 mg/liter, mainly as iodate, but also as iodine. Breads contain a variable amount of iodine, depending on the source and means of production. Breads made with dough conditioners consisting of calcium or potassium iodate [Ca(IO3)2, KIO3] contain much higher iodine concentrations than breads made without them. Breads made by a continuous mixing process with iodate have a higher iodine content than those produced by conventional mixing with iodate (Hemken et al., 1972; Kidd et al., 1974; National Academy of Sciences, 1974a).
Erythrosine (the disodium salt of tetraiodofluorescein), a food-coloring agent, contributes iodine to the diet when it is used in the manufacture of certain breakfast cereals and other foods such as fruit jellies (Vought et al., 1972).
Table salt is iodized to furnish 76 ug of iodine per gram of salt. As of 1968, 54.8% of the table salt sold in the United States was iodized. Use of iodized salt varies with region of the country. The amount of iodine added to the diet via table salt is extremely variable, not only because of differences in the use of iodized versus noniodized salt, but also because salt intake varies markedly. In a population containing both high and low salt users, it is difficult to use the average intake of salt at 10 g per day per person (National Academy of Sciences, 1974b) to calculate intakes of iodine from salt.
Iodine in marine fish and shellfish is presumably derived from sea water and, especially, from marine plants, which have the highest concentrations of iodine of any plant species (Chilean Iodine Education Bureau, 1950).
Drinking water contains a small and variable amount of iodine, which is determined by location, water treatment processes used, and the degree of pollution. Among water-processing methods, flocculation with alum and sedimentation appear to reduce iodine content. Chlorination, when used alone, results in only a small loss of iodine (<10% reduction in the iodine content of raw water).
Iodine enters drinking water from atmospheric iodine (via rain or
snow), soil, and, in the case of polluted drinking waters, from decaying plants, animal excretions, and commercial fertilizers. Water containing feces, urine, or plant debris contains more iodine than unpolluted water from the same area (Vought et al., 1970).
Freshwater contains 0 to 2.4 ug/liter in areas where goiter is endemic and 8 to 9 ug/liter in goiter-free areas (Fisher and Carr, 1974). Surface water, more often consumed by domestic animals, contains 4 to 336 ug/liter (National Academy of Sciences, 1974b).
Concentrations of iodine at 4 to 8 ug/liter in raw (untreated) water and 3.4 to 3.8 ug/liter in treated water in Potomac, Maryland, and up to 18 ug/liter in polluted wells in Virginia have been reported.
Average dietary iodine intake has been estimated both from dietary studies and from analyses of radioiodine uptake by the thyroid. Oddie et al. (1970) studied radioiodine uptakes that were reported by 133 observers from approximately 30,000 euthyroid subjects throughout the United States. These estimates indicated that daily iodine intakes in various sections of the country varied from approximately 240 to 740 ug/day.
The recommended dietary allowances (RDA's) for the intake of iodine by adults range from 0.080 to 0.140 mg/day, depending upon age and sex (National Academy of Sciences, 1974c). The full list of recommendations is shown in the overall summary section at the end of this chapter. An intake of less than 0.05 mg/day leads to endemic goiter (National Academy of Sciences, 1974a).
Toxicity Versus Essential Levels
Both iodine deficiency and excess can enlarge the thyroid, a condition termed goiter. Endemic goiter due to iodine deficiency, which was prevalent in the United States before salt was iodized, is now uncommon. A reduction in the incidence of endemic goiter may also be due in part to the use of breads containing iodate. Measurement of the urinary excretion of iodine suggests that moderate iodine deficiency still occurs in the United States. In the National Nutrition Survey conducted from 1968 to 1969, only a small percentage of persons sampled had visibly enlarged thyroid glands. For example, McGanity (1970) reported that 5.4% of the individuals examined in one study in Texas had palpable or visibly enlarged thyroid glands. Eleven individuals, or approximately 0.4% of this sample, had urinary iodine levels of less than 50 ug/g
creatinine, but none of their thyroids was enlarged. However, 9% of those whose iodine excretion was less than 100 ug did have enlarged thyroid glands (Matovinovic, 1970).
In the 1968-1969 Texas nutrition survey (McGanity, 1970), there was no evidence that the iodine content of drinking water was related to the incidence of enlarged thyroid. Furthermore, there was no relationship between the incidence of enlarged thyroid and the fluoride content or hardness of water.
In a study that was conducted in Virginia, Vought et al. (1967) reported that thyroid disease in children is not related to dietary iodine deficiency, but rather to contaminated water. They isolated cultures of microorganisms from contaminated waters and postulated that goitrogens known to be produced by these organisms might interfere with iodine uptake by the thyroids of the affected children.
Plant goitrogens have been implicated as a factor contributing to endemic goiter in many parts of the world, particularly in areas such as the Congo, where there is also dietary iodine deficiency (Delange and Ermans, 1971). While there is no evidence that plant goitrogens play a role in the production of enlarged thyroid or thyroid disease in the United States, too little is known about the possible effects of low doses of goitrogens on the availability of iodine to the neonate or to the developing fetus (Stanbury, 1970). During pregnancy iodine deficiency can impair the development of the fetal thyroid thereby producing cretinism. Endemic cretinism does not occur in the United States.
Goiters resulting from iodine overload have been well described in the United States and other countries. Although goiters can be produced by excessive dietary iodine intakes, the more common cause is ingestion of large quantities of iodine-containing medications. Wolff (1969) has divided iodine goiter into four groups: (1) adult iodine goiter, mostly in asthmatic subjects taking iodine-containing cough medicines; (2) iodine goiter of the neonate due to placental transfer of iodine from mothers who are being treated with iodine; (3) endemic iodine goiter, which is of dietary origin; and (4) hypothyroidism in patients with thyrotoxicosis (Graves' disease) who are being treated with potassium iodide (KI) or Lugol's solution (4.5-5.5 g of iodine and 9.5-10.5 g of potassium iodide per 100 ml of purified water). Most people who have developed iodine goiter have received very large amounts of iodine for prolonged periods. In the iodine goiter cases reviewed by Wolff, intakes of iodine ranged from 18 mg to more than 1 g/day over several months. When iodine goiter develops under these conditions, a secondary complication may be hypothyroidism with clinical signs of myxedema. Prenatal development of iodine goiter carries the risk of obstructed delivery or neonatal tracheal obstruction. According to Wolff, there is a danger of iodine goiter from prolonged intakes of iodine above 1 to 2 mg/day.
In Northern Tasmania, two waves of increased prevalence of thyrotoxicosis have been attributed to iodine excess. In 1964, the incidence of thyrotoxicosis in Northern Tasmania increased. This was attributed to the use of iodophor disinfectants on dairy farms, which, as previously stated, causes iodine residues to be present in milk. In 1966, another epidemic of iodine-induced thyrotoxicosis occurred in the same region. This time it was precipitated by the addition of iodate to bread to prevent endemic goiter (Stewart and Vidor, 1976).
Liewendahl and Gordin (1974) reported a case of iodine goiter in a woman who ingested seaweed for 2 years. Hyperthyroidism also occurred in this patient.
Stanbury (1970) cited an unpublished report of a study in Iceland, where the iodine intake is high (from 0.500 to 1.500 mg/day) because of the prevalence of fish in the diet. The investigator also reported that the incidence of papillary carcinoma of the thyroid was high in Iceland. In parts of Japan, where large intakes of iodine result from the local custom of eating seaweed, carcinoma of the thyroid is more prevalent than in any other country (Suzuki et al., 1965). It has been suggested that the persons at risk of thyroid carcinoma from high iodine intake are those with preexisting thyroid adenoma or goiter.
Furszyfer et al. (1970) called attention to a rise in the prevalence of subacute (granulomatous) thyroiditis in Olmsted County, Minnesota, between 1960 and 1967. In a subsequent study (Furszyfer et al., 1972), they reported that the prevalence of Hashimoto's disease (lymphomatoid thyroiditis) in Rochester, Minnesota, had increased from 6.5 per 100,000 females during 1935-1944 to 69.0 during 1965-1967. They suggested a relationship to excess iodine intake.
Iodine may produce acneiform skin eruptions. Sources of iodine cited as being responsible for production of iododermas are iodized salt, iodides in therapeutic vitamin-mineral preparations, and iodine in formula foods such as Metrecal.
Anaphylactic reactions as well as acneiform eruptions and furunculosis (boils) may follow intravenous administration of iodine preparations used as contrast substances for intravenous pyelograms and gall bladder or spinal X-rays (Baer and Witten, 1961).
Lead has an adverse effect on the uptake of iodine by the thyroid gland. Persons with lead poisoning from industrial exposure or from ingestion of lead-contaminated, illicitly distilled whiskey have developed impairment of iodine uptake by the thyroid (Sandstead, 1977).
Contribution of Drinking Water to Iodine Nutrition
CONTRIBUTION OF DRINKING WATER TO IODINE REQUIREMENTS
Assuming 2 liters/day consumption of drinking water and total iodine requirements in the range of 0.080 to 0.150 mg/day, low iodine waters (approximately 0.001 mg of iodine per liter) would provide 1% to 2% of the total requirement; medium iodine waters (0.004 mg/liter), 5% to 10%; and high (polluted) iodine waters (0.018 mg/liter), 24% to 44%.
CONTRIBUTION OF DRINKING WATER TO TOTAL INTAKE
Given the highest level of iodine in water at 0.018 mg/liter and a total intake of iodine equivalent to 0.240 mg/day, the contribution of water (2 liters/day) would be approximately 15%. Minimal contributions to total body burden would be made by low iodine waters if high iodine intakes from food were consumed (at a level of 0.740 mg/day). Under these conditions, iodine would contribute 0.3% to total intake. Where dietary intake of iodine is low and drinking water is obtained from polluted wells with high iodine content, the water could contribute to the prevention of iodine deficiency. However, in view of Vought's findings of bacterial goitrogens in polluted water (Vought et al., 1970), this seems unlikely. Iodine toxicity is unlikely to be related to water intake unless water was highly contaminated with iodine.
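The percentages in this and the preceding subsection follow from a simple calculation; the sketch below assumes 2 liters/day of water and a requirement range of 0.080 to 0.150 mg/day, with water classes taken from the text.

```python
# Sketch of the iodine contribution arithmetic; the requirement range and
# water concentrations are assumptions taken from the surrounding text.

DAILY_WATER = 2  # liters/day, assumed consumption

def pct_of_requirement(conc_mg_per_liter, low_req=0.080, high_req=0.150):
    """Percent of the iodine requirement supplied by water (low, high bound)."""
    intake = conc_mg_per_liter * DAILY_WATER
    return 100 * intake / high_req, 100 * intake / low_req

for label, conc in [("low", 0.001), ("medium", 0.004), ("high/polluted", 0.018)]:
    lo, hi = pct_of_requirement(conc)
    print(f"{label} iodine water ({conc} mg/liter): {lo:.0f}%-{hi:.0f}%")

# Share of a 0.240-mg/day total intake supplied at the highest water level:
share = 100 * 0.018 * DAILY_WATER / 0.240  # ~15%
print(f"share of total intake: {share:.0f}%")
```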
Table V-10 summarizes information on iodine in drinking water and the diet. The average intake of iodine from all sources appears to be at least twice the RDA. Reduction of iodine intake to approximate the RDA is desirable. In some cases, reduction of the level in water would contribute to reduction of intake, but reduction of intake from other sources may be more practical.
Further data should be obtained on the iodine content of water supplies. Microbiological examination of high iodine waters for fecal and
urinary contamination should be performed.
TABLE V-10 Contribution of Drinking Water to Requirements for Iodine, Sodium, and Potassium (values based on a 70-kg adult man and water intake of 2 liters/day)^a

                                                            Sodium    Potassium    Iodine
Requirement per day, mg^b                                    2,500        2,500     0.130
Ratio of toxic intake level to dietary requirement^c           5.5      Unknown       7.7
Mean concentration in water, mg/liter                         27.7         2.15     0.004
Intake based on mean concentration in water, mg/2 liters      55.4          4.3     0.008
Intake from water based on mean, % of requirement              2.0          2.0       6.0
Highest concentration in water, mg/liter                      79.7          8.3     0.018
Intake based on highest observed concentration in water,
  mg/2 liters                                                159.4         16.6     0.036
Intake from water based on highest observed concentration,
  % of requirement                                             6.0          7.0        28

^a References to the data in this table will be found in the text. Data available on chloride were not considered sufficient for compiling an assessment for that nutrient.
^b These are intermediate values, selected for illustrative purposes, from recommendations of the National Academy of Sciences (1974c, 1980).
^c Toxicity is expressed in terms of the ratio of intake to requirement which, over the long term, causes mild to severe signs of toxicity (Food and Drug Administration, 1975).
Methods for the reduction in the iodine content of high iodine waters should be investigated.
Changes in total iodine intake over time by the U.S. population should be studied by monitoring individual foods and total "market-basket" samples.
Extraneous sources of iodine due to air and water pollution, use of iodophors, use of erythrosine, and use in vitamin-mineral preparations should be reduced.
The relationships between acne and acneiform eruptions and iodine
intake from water as well as from food or other sources such as medication should be examined.
IRON

Throughout the world, including the United States, iron deficiency is one of the most commonly recognized signs of inadequate nutrition. This situation exists in spite of the fact that iron is among the most abundant elements in the earth's crust. There are several reasons for this anomaly. Man has developed an effective mechanism to prevent excess absorption of iron. This protective device is important because iron is poorly excreted and is highly toxic when tissue levels rise above the tolerance level. Iron compounds tend to be insoluble, and the iron of such compounds is inefficiently absorbed. Thus, while the quantity of iron consumed is important, the chemical form of the iron is also a highly significant factor in meeting the dietary requirement. In view of these considerations, it is understandably difficult to assess the dietary iron requirement of humans.
Presence in Food, Water, and Air
The concentration of iron in foods consumed by humans varies widely, ranging from less than 1 mg/kg in milk and related products to approximately 50 mg/kg in dry beans and cereals. See Table V-11 (page 318) for a tabulation of data on the amount of iron contained in food groups and the percentage of the mineral contributed by the food groups to the total food intake of persons in U.S. households (Consumer and Food Economics Institute, Science and Education Administration, U.S. Department of Agriculture, unpublished data). Approximately 35% of the dietary iron comes from meat, fish, and eggs, while 50% is supplied by cereals, root vegetables, and other foods of plant origin.
The median iron concentration in surface air layers at 38 U.S. nonurban sites was 0.255 ug/m3 (National Academy of Sciences, 1979). Twenty cubic meters of such air (the average volume inhaled per day) would contain approximately 5 ug of iron. Even if totally absorbed, this quantity would make a negligible contribution to the daily intake of iron.
An estimate of the iron content of drinking water and its contribution to the iron requirement of the U.S. population are given in Table V-12 (in the section on zinc). While the concentrations of iron in raw water and waste waters are highly variable and, in some cases, quite high, this report is concerned only with finished water, most particularly with tap
water. Of 672 water samples collected from interstate carriers (suppliers) of water and analyzed for the U.S. Environmental Protection Agency (1975), 62.5% contained iron concentrations that could be estimated quantitatively. The average concentration in these 420 water samples was 0.240 mg/liter. The samples were collected from 10 regions in the continental United States. The mean of the maximum values was 2.180 mg/liter. In another EPA study (Craun et al., 1977; Greathouse et al., 1978), tap water samples from 3,834 residences in 35 regions of the United States were analyzed. The mean and maximum concentrations of iron in these samples were 0.245 and 2.180 mg/liter, respectively.
The requirement for most nutrients, including iron, varies with the age and physiological state of the individual, but the difference between the male and female requirement for iron is greater than for most nutrients. This stems largely from the blood loss of females during the reproductive period and the increased demand during pregnancy.
Iron is inefficiently absorbed. Consequently, to meet the actual daily requirement for absorbed iron (approximately 1.0 mg for males and 1.5 mg for females), from 10 to 20 times that quantity must be ingested. The percentage of iron absorbed depends on the iron status of the individual, i.e., absorption is greater in persons with iron depletion. There are also differences in availabilities among the various iron compounds in the diet. To assure adequate intake for the majority of the population, the recommended dietary allowance (National Academy of Sciences, 1974) is 10 mg for adult males and 18 mg for females of reproductive age. See Table V-12 (in the section on zinc) for requirements and toxicity of iron compared to similar information for copper and zinc.
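The relationship between the absorbed requirement, the RDA, and the contribution of tap water can be sketched as follows; the calculation assumes the mean tap-water concentration of 0.245 mg/liter cited above and 2 liters/day consumption.

```python
# Sketch (assumed values from the text): absorbed iron requirements versus
# the RDA, and the share of the RDA that typical tap water could supply.

ABSORBED_REQ = {"adult male": 1.0, "female of reproductive age": 1.5}  # mg/day absorbed
RDA = {"adult male": 10, "female of reproductive age": 18}             # mg/day ingested

for group in ABSORBED_REQ:
    # Implied absorption efficiency if the RDA exactly covers the requirement
    efficiency = 100 * ABSORBED_REQ[group] / RDA[group]
    print(f"{group}: RDA of {RDA[group]} mg implies ~{efficiency:.0f}% absorption")

# Mean iron concentration in tap water (EPA studies cited above): ~0.245 mg/liter
water_iron = 0.245 * 2  # mg/day at 2 liters/day
pct_of_male_rda = 100 * water_iron / RDA["adult male"]
print(f"water supplies {water_iron:.2f} mg/day, ~{pct_of_male_rda:.1f}% of the male RDA")
```

The implied 8% to 10% absorption efficiency is consistent with the statement that 10 to 20 times the absorbed requirement must be ingested.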
Toxicity Versus Essential Levels
When administered parenterally, iron is a highly toxic element. Humans are generally well protected from oral overdose, but children from 1 to 2 years of age are particularly vulnerable to iron toxicity from ingestion of iron supplements that have been commercially prepared for adults (Fairbanks et al., 1971). In general, the long-term toxic levels of dietary iron for monogastric animals are 340 to 1,700 times greater than the requirement. Such continuous intake may give rise to signs of toxicity (Food and Drug Administration, 1975).
The bioavailability of iron in foods varies widely. For example, iron in the form of heme is absorbed nearly 10 times as efficiently as iron in food of plant origin. Practically nothing is known about the absorption of iron from water. As a matter of fact, little is known about the chemical species of iron in drinking water at the tap. In a well-aerated river the dominant form is ferric iron (Fe3+). Groundwater may contain appreciable ferrous iron (Fe2+). Surface waters and groundwaters also contain organic complexes of iron (National Academy of Sciences, 1979). The fractions of these forms in water that are absorbed by humans are unknown, but it is clear that reducing agents, such as ascorbic acid, increase the absorption of iron in food (Monsen et al., 1978). Ferrous iron appears to have better bioavailability than does ferric iron. The iron in certain chelates, such as ferric phytate, is poorly absorbed (Bowering et al., 1976). Although it is generally assumed that trace elements in water are readily absorbed, there are few, if any, data relative to the bioavailability of iron in water.
Iron interacts physiologically with several nutritionally essential and nonessential elements. All of these elements, including copper, zinc, manganese, and lead, tend to increase the requirement for iron. Signs of copper toxicosis are eliminated by the addition of extra iron and zinc to the diet, and signs of zinc toxicosis are prevented by the addition of extra copper and iron (Magee and Matrone, 1960). The signs of lead toxicity are exacerbated by iron deficiency. Perhaps the most significant interaction of any mineral with iron is that of manganese. Excess manganese impairs hemoglobin regeneration by decreasing the absorption of iron (Underwood, 1977). (See the section on manganese for further discussion of the iron-manganese interaction.)
Contribution of Drinking Water to Iron Nutrition
Assuming 2 liters/day consumption of water containing an iron concentration equal to the mean value shown in Table V-12, water would contribute approximately 0.5 mg of iron, which is about 5% of the male requirement and less than 3% of the female requirement. For those persons consuming water containing the highest observed value, water would contribute from 17% to 44% of the daily requirement, depending on sex.
In the continental United States, most tap water probably supplies less than 5% of the dietary requirement for iron. This may be considered a
negligible contribution unless the iron in water has an appreciably higher bioavailability than iron in food. However, iron deficiency is common in the United States. Under severely limiting conditions, 0.5 mg of highly available iron from water would make a significant contribution to the daily dietary intake. If a local water supply contained unusually high concentrations of iron, it could contribute substantially to the total intake. The iron content of drinking water should not be reduced since there is little or no likelihood of toxicity from iron in natural foods and water. It should be noted that the present recommended limit for iron in water, 0.3 mg/liter, was based on taste and appearance rather than on any detrimental physiological effect from iron in water.
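The contribution estimates above can be reproduced with simple arithmetic. The 2 liters/day consumption figure, the mean tap-water concentration, and the recommended allowances are taken from the text; the function name is ours.

```python
# Percent of the iron RDA supplied by drinking water at a given concentration.

LITERS_PER_DAY = 2.0
RDA_MG = {"male": 10.0, "female": 18.0}  # National Academy of Sciences, 1974

def pct_of_rda(conc_mg_per_liter, rda_mg, liters=LITERS_PER_DAY):
    """Percent of the iron RDA supplied by daily water consumption."""
    return 100.0 * conc_mg_per_liter * liters / rda_mg

mean_conc = 0.245  # mg/liter, mean tap-water iron (Craun et al., 1977)
# ~0.49 mg/day from water: about 5% of the male RDA, under 3% of the female RDA.
male_pct = pct_of_rda(mean_conc, RDA_MG["male"])      # 4.9
female_pct = pct_of_rda(mean_conc, RDA_MG["female"])  # ~2.7
```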
The chemical species of iron in drinking water and their bioavailability should be determined.
COPPER

While there is no evidence of copper deficiency in the U.S. population, except for isolated cases in patients maintained by total parenteral nutrition, copper is clearly an essential nutrient. There is some evidence that the intake is lower than required for optimal human nutrition. Klevay (1975) suggested that borderline deficiencies may occur among portions of the population. The concentration of copper in the earth's crust is estimated to be 50 mg/kg. It forms organic complexes readily and tends to concentrate in clay minerals, particularly in clays that are rich in organic matter. Copper in rocks is mobilized more readily under acidic than under alkaline conditions (National Academy of Sciences, 1977). The species of copper in drinking water at the tap have not been determined, but copper presumably occurs in the oxidized, Cu(II) state complexed with various ligands. The reaction of soft water with the copper pipes that are used in some household plumbing systems contributes to the copper levels in water at the tap (Schroeder et al., 1966).
Occurrence of Copper in Food and Water
The concentrations of copper in foods are highly variable. They are extremely low in dairy products and relatively high in cereals and roots.
Table V-11 (see page 318) shows the distribution of copper among food groups. These data suggest that the average copper intake is less than 2 mg/day. Klevay (1975) has presented evidence that many U.S. diets contain much less copper than required. For example, he quotes studies showing that the dietary copper intake of high-school girls and college women may be as low as 0.34 to 0.58 mg/day and that the diets of other individuals may supply less than 1 mg/day.
The estimated contribution of drinking water to an adult's copper requirement is shown in Table V-12 (in the section on zinc). Because the concentration of copper in drinking water is highly variable, means are of limited significance. Approximately 55% of the 604 water samples analyzed by the U.S. Environmental Protection Agency (1975) contained measurable levels of copper. The mean of these samples was 60 µg/liter. The mean of another study (Craun et al., 1977; Greathouse et al., 1978) was 150 µg/liter.
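At the 2 liters/day consumption assumed throughout this report, the two reported means translate into daily copper intakes as follows; the function name is ours.

```python
# Daily copper (mg) obtained from drinking water at a given concentration.

LITERS_PER_DAY = 2.0

def copper_from_water_mg(conc_ug_per_liter, liters=LITERS_PER_DAY):
    """Daily copper intake (mg) from water at `conc_ug_per_liter` ug/liter."""
    return conc_ug_per_liter * liters / 1000.0

epa_mean = copper_from_water_mg(60)      # 0.12 mg/day (EPA, 1975 mean)
survey_mean = copper_from_water_mg(150)  # 0.30 mg/day (Craun et al., 1977 mean)
```

Both figures are small relative to a food intake of roughly 2 mg/day, but not negligible when dietary copper is marginal.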
Signs of copper deficiency have been observed in patients maintained totally by intravenous alimentation, but there is only one report of copper deficiency in children fed natural food by mouth (Meng, 1977). Since signs of dietary copper deficiency in the United States have not been observed among persons consuming commonly available foods, it has been assumed that the usual intake meets the requirement. The National Academy of Sciences Food and Nutrition Board did not previously set an RDA for copper but has recently estimated an adequate and safe intake of 2 to 3 mg/day (see Table V-31; National Academy of Sciences, 1980). Copper intakes between 1.3 and 2 mg/day have been shown to maintain nutritional balance in preadolescent girls and adults of both sexes (National Academy of Sciences, 1974).
Toxicity Versus Essential Levels
Copper is toxic to monogastric animals when ingested in quantities that are 40 to 135 times greater than their respective requirements (Food and Drug Administration, 1975). Except for sheep, all animals absorb copper poorly because their gastrointestinal tracts provide an excellent barrier against oral toxicity. The greatest danger of toxicity arises when children consume acidic beverages that have been in contact with copper
containers or valves (Food and Drug Administration, 1975). The interim drinking water standard (U.S. Environmental Protection Agency, 1977) of 1 mg/liter is based on taste rather than toxicity and affords adequate protection to the general public. However, a few patients with Wilson's disease (hepatolenticular degeneration) are adversely affected by the estimated average intake of copper (Scheinberg and Sternlieb, 1965).
Interactions and Bioavailability
Copper probably occurs in drinking water in the form of the cupric ion (Cu2+) complexed with organic ligands, but this has not been determined. It is reasonable to assume that it is as biologically available as the copper in food, if not more so. High levels of ascorbic acid adversely affect the absorption and metabolism of copper, but few other organic dietary constituents are known to affect its bioavailability (Carlton and Henderson, 1965; Hill and Starcher, 1965; Hunt et al., 1970).
The interaction of two essential trace elements, zinc and molybdenum, with copper increases the human requirement for copper. For example, high levels of zinc exacerbate the signs of copper deficiency in mammals. This effect can be reversed by feeding extra copper to the subject (O'Dell et al., 1976). The antagonism of molybdenum to copper is augmented by sulfate (SO4). This interaction is particularly significant in ruminant animals but may be of little importance in humans. Copper, sulfur, and molybdenum form an insoluble copper thiomolybdate complex (Dick et al., 1975). Silver and cadmium, both nonessential elements, also interact with copper to exacerbate signs of deficiency (Underwood, 1977).
Contribution of Drinking Water to Copper Nutrition
If one assumes a typical concentration of copper in drinking water of 0.1 mg/liter, a human would obtain 0.2 mg of copper from 2 liters of water. This constitutes between 6.7% and 10% of the estimated adequate and safe intake. In view of the data assembled in Table V-11 (see page 318) indicating that the typical copper intake from food is less than 2 mg/day, and other observations (Klevay, 1975) suggesting even lower consumption of copper from food, the contribution of water to total copper intake becomes even more significant. Furthermore, some drinking water contains considerably higher levels of copper than 0.1 mg/liter and would contribute a correspondingly greater proportion of the total intake. Waters containing the average reported maximum copper