Written by: Lauren Milligan Newmark, Ph.D. | Issue #70 | 2018
- Human breast milk contains only a small amount of iron, but full-term human infants are born with a substantial supply of iron stored in their liver.
- There is debate among nutritionists and clinicians as to whether exclusively breastfed infants require iron supplementation and when to begin supplementation with iron-fortified foods such as cereal.
- Unlike water-soluble vitamins, excess iron cannot simply be excreted, so policy makers must weigh the risks of potential iron deficiency against the risks of providing more iron than needed, particularly in iron-replete populations.
Breast milk is considered the gold standard for human infant nutrition. But at some point, even “white gold” cannot suffice as the only source of nutrition, and infants must begin taking in complementary foods to support their growth and development. Exactly when that point arrives, however, remains a matter of debate centered largely on the availability of one particular micronutrient: iron.
Human milk is very low in iron, usually containing between 0.3–0.4 mg/L [1-4]. To compensate for this low dietary intake, human infants are born with significant liver stores of iron that accumulate mainly during the third trimester of gestation. Many researchers propose that the combination of stored liver iron and highly bioavailable milk iron is sufficient to support optimal growth and development until infants reach six months of age [1, 2, 5-9], whereas others believe the scientific evidence supports an earlier introduction of iron-fortified foods or supplements at four months of age [3, 4, 10, 11]. Although it seems like a simple solution to just err on the side of avoiding iron deficiency in early infancy, more is not necessarily better; the risks of not consuming enough iron must be carefully weighed against the risks of providing infants with too much when deciding if and when breastfed infants need extra iron.
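To put those concentrations in perspective, a back-of-envelope calculation helps. The minimal sketch below assumes an average daily milk intake of about 0.78 L, a figure commonly used in dietary reference calculations but not taken from this article:

```python
# Rough estimate of daily iron ingested by an exclusively breastfed infant.
# The milk intake value is an assumption (a commonly used average), not a
# figure from this article; the iron concentrations are from the article [1-4].

MILK_IRON_MG_PER_L = (0.3, 0.4)  # reported range of iron in human milk
MILK_INTAKE_L_PER_DAY = 0.78     # assumed average daily milk intake

for concentration in MILK_IRON_MG_PER_L:
    ingested_mg = concentration * MILK_INTAKE_L_PER_DAY
    print(f"{concentration} mg/L -> {ingested_mg:.2f} mg iron ingested/day")
```

Roughly a quarter of a milligram per day is far too little on its own to cover the demands of rapid infant growth, which is exactly why those third-trimester liver stores matter.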
Iron in Our Diet and Our Bodies
Iron deficiency (ID) is the most prevalent nutritional deficiency in the world and is especially common in children and women because of their higher iron requirements (the former because of growth, the latter because of reproduction and menstruation) [1, 2, 12]. Because infancy is also a period of rapid growth, ID in infants has become a public health concern.
Iron is the fourth most abundant element in Earth’s crust, so why are so many people deficient? Iron is a tricky mineral to absorb; humans are estimated to absorb only 5–35% of the iron available in a food [13]. Heme iron, the type found in animal tissues, is absorbed more readily than non-heme iron from plant sources. In addition, many dietary components, such as calcium (which is itself a tricky mineral to absorb), can inhibit the body’s ability to absorb iron. Iron in breast milk, however, is highly bioavailable, and infants are able to absorb up to 50% of the iron they ingest from it [1].
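Milligram for milligram, these absorption differences compound quickly. The short sketch below compares the iron actually absorbed from a fixed 1 mg ingested dose, treating each source as a single representative absorption fraction (a simplification of the ranges cited above):

```python
# Absorbed iron from a fixed 1 mg ingested dose, using the absorption
# figures cited in the article: 5-35% for food iron in general [13]
# and up to ~50% for breast-milk iron [1].

DOSE_MG = 1.0
absorption_fractions = [
    ("food iron, low end of range", 0.05),
    ("food iron, high end of range", 0.35),
    ("breast-milk iron", 0.50),
]

for label, fraction in absorption_fractions:
    print(f"{label}: {DOSE_MG * fraction:.2f} mg absorbed")
```

By these figures, a milligram of breast-milk iron can yield up to ten times as much absorbed iron as a milligram from the least bioavailable food sources.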
Iron plays multiple roles within the body but is probably best known for its association with red blood cells. Red blood cells are packed with a protein called hemoglobin, which is composed of four polypeptide chains, each carrying an iron-containing heme group responsible for binding oxygen. As red blood cells move throughout the body, this heme-bound iron (the reason iron from animal tissue is called heme iron) delivers oxygen to cells throughout the body and helps bind carbon dioxide and return it to the lungs. Moreover, iron is required for the manufacture of red blood cells themselves. Oxygen may be essential for life, but oxygen delivery is completely dependent on the availability of iron.
Most of the body’s iron (approximately 65%) is bound to hemoglobin, and hemoglobin iron is conserved to the detriment of other cells in the body [13]. Thus, during periods of iron depletion and deficiency, it is possible to have normal hemoglobin levels yet be iron deficient. Indeed, by the time hemoglobin is affected (a condition referred to as iron deficiency anemia, or IDA), an individual is likely to have been iron deficient for some time.
Because of iron’s role in growth and development, ID in infants and children is associated with impaired development of cognition, gross and fine motor skills, and even social and emotional functioning [1, 2]. These risks are even greater when ID has progressed to IDA, and both conditions are associated with increased morbidity [13].
On the other side of the coin are the potential risks of consuming too much iron. Some micronutrients, such as vitamin C, are water soluble; you can consume 300% of your daily value of vitamin C without concern because your body will simply excrete the excess in urine. This is not the case for iron: the body has no pathway to excrete excess iron, and even a moderate excess has the potential to interfere with normal physiological processes [1, 2, 13]. Excess iron can affect the body’s stores of other essential minerals (including copper and zinc), increase oxidative stress (because iron is a pro-oxidative element), and slow growth in both weight and height in infants and children [1, 2]. Moreover, excess iron may increase the frequency or severity of infections, particularly those that affect the gastrointestinal tract [2].
In adults, iron levels are regulated in the intestines at the time of absorption [1, 13]: individuals who are iron-replete (i.e., have adequate iron stores) absorb less iron than those with low iron status [1, 13]. However, Lönnerdal [1] reports that these mechanisms of iron homeostasis are not fully functional in infants. Several studies of iron supplementation in infants less than 12 months of age suggest that infants may be better at increasing iron absorption when iron status is low than at decreasing absorption when iron status is sufficient [1]. From an evolutionary perspective, this is a very interesting finding: that infants cope better with too little iron than with sufficient or too much suggests low iron availability during early infancy may have been the rule rather than the exception (more on this later!). Thus, Lönnerdal [1, 2] warns that giving iron to infants who are already iron-replete, or over-supplementing iron-deficient infants, can result in poor health and developmental outcomes.
Iron Supplementation in Breastfed Infants
It seems paradoxical to provide nutritional supplements to exclusively breastfed infants. Doesn’t human milk provide infants with all of the required macro- and micronutrients in exactly the right concentrations? When it comes to iron, human milk on its own is definitely not sufficient, providing only approximately 0.3–0.4 mg/L. But human milk is not the only source of iron for newborns and young infants. During the final trimester of gestation, fetuses accumulate large liver stores of iron. The combination of milk iron (which is highly bioavailable) and iron stored in the liver is sufficient to support the growth and developmental needs of infants, but only for a finite amount of time. When do these stores run out?
Unfortunately, there is no consensus among nutritionists and clinicians on when exclusively breastfed infants require external sources of iron. Currently, the American Academy of Pediatrics (AAP) Committee on Nutrition [11] recommends supplementing breastfed infants with iron beginning at four months of age, whereas the AAP Committee on Breastfeeding [9], along with the World Health Organization (WHO) [8], does not recommend iron supplementation (or the introduction of solid foods) until six months of age. The Committee on Nutrition [11] bases its recommendation on a small number of papers [e.g., 4] demonstrating that the proportion of infants with ID and IDA increases between four and six months of age, as well as on a positive association between iron supplementation of breastfed infants at four months and improved iron status and psychomotor scores. However, Lönnerdal and colleagues [1, 2, 5], along with the AAP Committee on Breastfeeding, argue that there is still insufficient evidence of a need to provide external iron sources (i.e., iron drops or earlier introduction of iron-fortified cereal) before infants reach six months of age. They emphasize that we currently do not know whether iron supplements lead to improvements in growth, decreases in growth, or no effect at all [5]. Whereas Friel and colleagues [4] found improved growth with iron supplementation, Dewey, Lönnerdal, and colleagues found less growth in length and head circumference in iron-supplemented infants [5].
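For a sense of scale, the Committee on Nutrition’s report [11] specifies a dose of 1 mg of elemental iron per kilogram of body weight per day for exclusively breastfed infants. A minimal sketch of that arithmetic, using a hypothetical infant weight (the weight is an assumed example, not a figure from this article):

```python
# Illustrative dose under the AAP Committee on Nutrition guideline of
# 1 mg of elemental iron per kg per day for breastfed infants [11].
# The infant weight below is a hypothetical example.

DOSE_MG_PER_KG_PER_DAY = 1.0  # per Baker & Greer 2010 [11]
infant_weight_kg = 6.5        # assumed weight of a roughly 4-month-old

daily_dose_mg = DOSE_MG_PER_KG_PER_DAY * infant_weight_kg
print(f"Recommended supplement: {daily_dose_mg:.1f} mg iron per day")
```

That works out to more than twenty times the quarter-milligram or so an exclusively breastfed infant ingests from milk each day, which helps explain why the timing of supplementation is so contested.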
It is critical to recognize that exclusively breastfed infants are not a homogeneous group. Although on somewhat opposite sides of the debate, nutritional scientists Friel [3] and Lönnerdal [1] both agree that some infants would greatly benefit from iron supplementation before six months of age, including infants born prematurely, multiples, and infants whose mothers have severe anemia [1–4, 10]. Blanket policy guidelines for all infants across diverse ecological and economic settings, however, could put some infants at risk for ID (and IDA) and others at risk for excess iron.
Not Enough or Just Right?
Adding another perspective to the debate on iron supplementation is evolutionary anthropologist EA Quinn [6]. Quinn argues against the view that low milk iron and low infant iron stores are pathological and in need of intervention. Instead, Quinn proposes that low milk iron and declining infant iron stores may actually provide an adaptive advantage to infants who are transitioning to solid foods. Infants are immunologically naïve, and the transition to non-milk foods means the first introduction of food-borne pathogens, including bacteria and parasites. Pathogens, like people, need iron to grow and replicate. If infant iron stores are low at the time complementary foods are introduced, the iron available to pathogens is also low, which could decrease the frequency and severity of infections [6].
Moreover, Quinn [6] argues that by supplementing infants with iron too early, or by adding more iron to infant formulas than is available from human milk, we may inadvertently be promoting the growth of pathogenic, iron-requiring bacteria in infant gut microbiomes. The types of bacteria that initially colonize the infant gut may act as seeds, selecting for the future generations of bacteria involved in immune function and metabolic regulation. Thus, excess iron during early infancy has the potential to negatively impact health both during infancy and throughout the lifespan [1].
Indeed, many of the differences in health outcomes between formula-fed and breastfed infants that are usually attributed to the lack of immune factors in formula may be explained, at least in part, by different intakes of dietary iron [1, 6].
Taken together, Quinn’s [6] evolutionary perspective on low milk iron and declining iron stores and Lönnerdal’s [2] findings on the potential risks of consuming too much iron suggest that policies about when to supplement with iron-fortified food (and how much iron to use) should weigh both the potential benefits and the risks in light of particular infant attributes. When it comes to iron, more is not always better.
References
1. Lönnerdal B. Development of Iron Homeostasis in Infants and Young Children. Am J Clin Nutr. 2017;106(Supplement 6):1575S-80S.
2. Lönnerdal B. Excess Iron Intake As A Factor In Growth, Infections, and Development of Infants and Young Children. Am J Clin Nutr. 2017;106(Supplement 6):1681S-7S.
3. Friel JK. There Is No Iron in Human Milk. J Pediatr Gastroenterol Nutr. 2017;64:339-40.
4. Friel JK, Aziz K, Andrews WL, Harding SV, Courage ML, Adams RJ. A Double-Masked, Randomized Control Trial of Iron Supplementation in Early Infancy in Healthy Term Breast-Fed Infants. J Pediatr. 2003;143:582-6.
5. Dewey KG, Domellöf M, Cohen RJ, Rivera LL, Hernell O, Lönnerdal B. Iron Supplementation Affects Growth and Morbidity of Breast-Fed Infants: Results of A Randomized Trial in Sweden and Honduras. J Nutr. 2002;132:3249-55.
6. Quinn EA. Too Much Of A Good Thing: Evolutionary Perspectives on Infant Formula Fortification in The United States and Its Effects on Infant Health. Am J Hum Biol. 2014;26:10-7.
7. Uyoga MA, Karanja S, Paganini D, Cercamondi CI, Zimmermann SA, Ngugi B, Holding P, Moretti D, Zimmermann MB. Duration of Exclusive Breastfeeding Is a Positive Predictor of Iron Status in 6- to 10-Month-Old Infants in Rural Kenya. Matern Child Nutr. 2017; 13(4).
8. World Health Organization. Exclusive Breastfeeding for Six Months Best for Babies Everywhere [statement]. 2011. http://www.who.int/mediacentre/news/statements/2011/breastfeeding_20110115/en/
9. Schanler RJ, Feldman-Winter L, Landers S, Noble L, Szucs KA, Viehmann L. Concerns with Early Universal Iron Supplementation of Breastfeeding Infants. Pediatrics. 2011;127(4):e1097-10.
10. Krishnaswamy S, Bhattarai D, Bharti B, Bhatia P, Das R, Bansal D. Iron Deficiency and Iron Deficiency Anemia in 3-5-Month-Old, Breastfed Healthy Infants. Indian J Pediatr. 2017;84(7):505-8.
11. Baker RD, Greer FR. Diagnosis and Prevention of Iron Deficiency and Iron-deficiency Anemia in Infants and Young Children (0–3 years of age). Pediatrics. 2010;126(5):1040-50.
12. Clark KM, Li M, Zhu B, Liang F, Shao J, Zhang Y, Ji C, Zhao Z, Kaciroti N, Lozoff B. Breastfeeding, Mixed, or Formula Feeding at 9 Months of Age and the Prevalence of Iron Deficiency and Iron Deficiency Anemia in Two Cohorts of Infants in China. J Pediatr. 2017; 181:56-61.
13. Abbaspour N, Hurrell R, Kelishadi R. Review on Iron and Its Importance for Human Health. J Res Med Sci. 2014;19(2):164.