
    Issue 112

    Issue Date: January 2023

    Farm Exposures Influence Human Milk Composition and Might Reduce Allergies

    • Farm-based lifestyles are associated with lower rates of asthma and other allergies in humans.
    • Living on farms and reduced antibiotic use alter the composition of human milk.
    • Differences in modifications of milk proteins may play a role in the development of allergies.

    Across the world, shifting from a traditional to an industrialized lifestyle changes the gut microbiome of adults. This same shift has also been correlated with an uptick in asthma and other allergies in children—conditions that are often preceded by changes in the gut microbiome. At present, food allergies affect approximately 1 in 13 children (about 8 percent) in the U.S., according to the Centers for Disease Control and Prevention. But these conditions are rarer in children born to families that follow traditional farm-based lifestyles. For instance, fewer than 1 percent of children born into Old Order Mennonite (OOM) families are diagnosed with these conditions.

    The farm-based lifestyles of OOM families benefit children, in part, because of their effects on mothers. Maternal exposure to microbes and to farm environments builds the mother’s immune response, which in turn protects breastfed children from allergies. One popular explanation for the effects of industrialization on immune health is the “hygiene hypothesis,” which proposes that being exposed to certain microbes in childhood helps the immune system mature.

    Still, many gaps exist in researchers’ understanding of how the environment influences the gut microbiome and immune system. Early exposure to microbes and the development of the gut microbiome could each play a part in the immune health of children [1]. To elucidate the many ways in which different environments might spur—or limit—the development of immune disorders and allergies, immunologist Kirsi Jarvinen-Seppo of the University of Rochester and her colleagues embarked on a multi-year project studying mothers and children in two environments that are geographically close but vastly different from one another.

    The researchers recruited one group of participants from suburban and urban families in the Rochester, NY, area. The second group consisted of OOM families from the Finger Lakes region near Rochester. These communities live on single-family farms, have large families, and are in constant contact with pets and farm animals. They eschew cars and rely on horses and buggies for transportation. Babies are usually born at home rather than in hospitals and have minimal exposure to antibiotics. They are also breastfed for the first several months of life.

    Gut microbiome differences

    In a pilot study [2], the team compared the gut microbiome of infants born in Rochester and OOM families. They only included healthy babies who were born after 36 weeks of pregnancy and whose mothers did not have infections or known immune diseases. The researchers compared 65 OOM and 39 Rochester mother-baby pairs. They gathered stool samples from the age of 2 weeks to 6 months, as well as breast milk samples from the mothers.

    By 2 months of age, differences between the gut microbiomes of Rochester and OOM infants were evident. The latter had significantly higher levels of Bifidobacterium species, particularly B. infantis, a microbe that consumes complex sugar molecules known as human milk oligosaccharides (HMOs). Previous research has found that B. infantis levels in infant microbiomes are higher in developing countries such as The Gambia and Bangladesh, where autoimmune and allergic diseases are less frequent, and lower in Western countries, where rates of these conditions are higher. The researchers detected B. infantis in 31 of 44 OOM infants but in only 5 of 24 Rochester babies.
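    The reported detection counts lend themselves to a quick back-of-the-envelope comparison. The sketch below uses only the figures quoted above (31 of 44 OOM infants versus 5 of 24 Rochester infants); it is an illustrative calculation, not an analysis taken from the study itself.

```python
# Back-of-the-envelope comparison of B. infantis detection rates,
# using only the counts reported in the pilot study [2].
oom_positive, oom_total = 31, 44   # OOM infants with B. infantis detected
roc_positive, roc_total = 5, 24    # Rochester infants with B. infantis detected

p_oom = oom_positive / oom_total   # detection rate in the OOM group
p_roc = roc_positive / roc_total   # detection rate in the Rochester group

# Odds ratio: how much higher the odds of detection are in the OOM group
odds_ratio = (oom_positive * (roc_total - roc_positive)) / (
    (oom_total - oom_positive) * roc_positive
)

print(f"OOM detection rate:       {p_oom:.0%}")            # 70%
print(f"Rochester detection rate: {p_roc:.0%}")            # 21%
print(f"Odds ratio (OOM vs Rochester): {odds_ratio:.1f}")  # 9.1
```

    On these counts alone, detection was roughly three times as frequent in the OOM group, in line with the significant difference the authors report.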

    When the researchers followed up with mothers by phone when the children were three years old, they found that 4 OOM children and 12 Rochester children had experienced signs of atopic disease, which includes eczema, asthma, food allergy, and hay fever. But “there was no significant association between atopic disease and presence of B. infantis,” they wrote in their paper [2].

    Farming lifestyles and breastmilk

    Breast milk is rich in HMOs, which feed specific microbial species in the infant gut microbiome. For example, they lead to the proliferation of various Bifidobacterium species. Previously, Jarvinen-Seppo and her colleagues found that the cocktail of HMOs in a mother’s milk can depend on her exposure to microbes. But whether a farming lifestyle—and the related microbial contact—could alter human milk composition was a mystery.

    The researchers collected breast milk samples from mothers who had been lactating for about two months and measured the content of HMOs, antibodies, immune-modulating chemicals called cytokines, and fatty acids. They found strong correlations between maternal use of antibiotics and the proportions of different HMOs in milk [3]. When studying correlations with allergies, the team found that the kinds of antibodies present in breast milk differed based on the mothers’ lifestyles. For example, levels of antibodies against allergens from dust mites, egg, and peanut were significantly higher among OOM mothers than among Rochester mothers. Atopic reactions were more common among children born to the urban families, and their occurrence was associated with breast milk that had lower levels of antibodies against dust mites.

    The data suggest that maternal lifestyle and antibiotic use do influence the composition of breast milk [3]. This may have “downstream implications” for the development of a baby’s gut microbiome and immune system after birth, the authors wrote in their study [3]. But they add that larger, long-running studies are necessary to know whether the composition of breast milk in OOM families protects children from allergic disease.

    Modified milk proteins

    Approximately 70 percent of proteins in human milk bear sugary side chains critical to protein function. Although many studies have examined the roles of milk proteins, few have homed in on the variations in glycoproteins [4].

    Continuing their analysis of breast milk, Jarvinen-Seppo and her colleagues analyzed glycoproteins present in 54 human milk samples from the Rochester and OOM populations. In a 2022 study, they reported that differences in the sugar chains on proteins correlated with a mother’s lifestyle as well as with whether she or her children had allergies. Using 20 Rochester and 34 OOM samples, the researchers found 38 glycoproteins in breast milk that differed markedly between the two groups, including immunoglobulin A1 [4].

    But the researchers did not stop at comparing samples by lifestyle alone. They also grouped the samples in two other ways. In one analysis, they compared samples from children with allergies against those from children without allergies. In another, they compared samples from mothers with and without allergies. Each time, distinct patterns emerged. Across all three analyses, they found a set of three glycoproteins that were elevated in the OOM community, in mothers without allergies, and in children without allergies. Further studies are needed to understand the importance of these three glycoproteins, but they could affect immune development in infants and may also hold clues to the origins of allergies.

    References

    1. Järvinen KM, Davis EC, Bevec E, Jackson CM, Pizzarello C, Catlin E, Klein M, Sunkara A, Diaz N, Miller J, Martina CA. Biomarkers of development of immunity and allergic diseases in farming and non-farming lifestyle infants: Design, methods and 1 year outcomes in the “Zooming in to Old Order Mennonites” birth cohort study. Frontiers in Pediatrics. 2022;10.
    2. Seppo AE, Bu K, Jumabaeva M, Thakar J, Choudhury RA, Yonemitsu C, Bode L, Martina CA, Allen M, Tamburini S, Piras E. Infant gut microbiome is enriched with Bifidobacterium longum ssp. infantis in Old Order Mennonites with traditional farming lifestyle. Allergy. 2021 Nov;76(11):3489-503.
    3. Seppo AE, Choudhury R, Pizzarello C, Palli R, Fridy S, Rajani PS, Stern J, Martina C, Yonemitsu C, Bode L, Bu K. Traditional farming lifestyle in Old Order Mennonites modulates human milk composition. Frontiers in Immunology. 2021 Oct 11;12:741513.
    4. Holm M, Saraswat M, Joenväärä S, Seppo A, Looney RJ, Tohmola T, Renkonen J, Renkonen R, Järvinen KM. Quantitative glycoproteomics of human milk and association with atopic disease. PLOS One. 2022 May 13;17(5):e0267967.

    Can Prehistoric Disease and Famine Explain the Evolution of Lactase Persistence in Europe?

    • All humans are born with the ability to digest the milk sugar lactose, but only 35% of the global population has a genetic variant that keeps the production of the lactase enzyme turned on through adulthood.
    • The continued production of lactase, called lactase persistence (LP), has long been linked to milk use, but a new study found no evidence to support this claim among prehistoric European dairy farmers.
    • The study proposes that famine and pathogen exposure better explain the evolution of LP in Europe and could potentially explain the evolution of LP in other parts of the world as well.

    Lactase persistence (LP) has been the textbook example for a genetic adaptation to the human diet for decades. But despite its renown, the evolutionary advantage of the LP phenotype—the ability of humans to digest the milk sugar lactose throughout the lifespan—is still under debate.

    “LP is the Everest of natural selection over the last 10,000 years,” explains Mark Thomas, Professor of Evolutionary Genetics at University College London. “There are a lot of different theories on why natural selection was so strong on LP, and they all relate to milk use.” The most widely discussed evolutionary scenario argues that as milk use increased among prehistoric peoples, LP individuals were more likely to survive and reproduce than lactase non-persistent (LNP, or lactose intolerant) individuals because they could reap the nutritional benefits of milk and milk-derived foods without digestive issues.

    Although this narrative sounds good on paper, a new study [1] from Thomas, his colleagues Richard Evershed and George Davey Smith from the University of Bristol, and a team of over 100 researchers found no supportive evidence among prehistoric Europeans. “Milk use doesn’t explain LP selection, and that’s a real shocker,” says Thomas.

    Thomas and his team came to this startling conclusion by creating novel statistical models that incorporated the most up-to-date genetic and archaeological data from Europe, spanning the period from 7000 BC to AD 1500 [1]. Ancient DNA (aDNA) data from more than 1,700 skeletal remains were used to estimate the frequency of the LP genetic variants (or alleles) in prehistoric European populations across time and space. If the frequency of LP alleles increased faster than expected by chance, the alleles were under selection. The frequency of milk use was determined from evidence of milk fat residues on potsherds (ceramic fragments). The model included nearly 7,000 animal fat residues from over 13,000 potsherds from 554 sites across Europe. From these, the researchers created time series showing how frequently people used milk over the nearly 9,000 years surveyed.

    Going against the prevailing narrative, Thomas and his team found that patterns of milk usage did not explain changes in LP allele frequencies any better than did uniform selection since the start of the Neolithic (ca. 10,000 years ago) [1]. But if milk use doesn’t account for intense selection on the LP allele, what does? Famine and pathogens, says Thomas. “Healthy people that are LNP and drink milk are unlikely to die from diarrhea,” he explains. “But if you have severe malnutrition and you drink fresh milk and get diarrhea, you have a high chance of dying.” Diarrhea caused by an inability to digest lactose would also be an issue for anyone fighting an infection, as dehydration from diarrhea increases the risk of dying from many pathogens.

    Thomas and his team predicted that selection on LP alleles would increase with greater pathogen exposure and would be greatest during times of subsistence instability. But without written records detailing crop failures or plagues, they needed proxies to test their hypothesis with the data at hand. For famine, they used archaeological data to determine changes in population size; if the population size increased over time and then dropped, the drop was likely the result of famine. For pathogen exposure, the researchers looked at changes in the density of population settlements; as people live in denser settlements, the risk of exposure to pathogens increases.

    Using data from over 110,000 radiocarbon dates from 27,000 European sites, their models found that pathogen exposure was 284 times more likely, and famine 689 times more likely, than constant selection to explain the selection of LP. “We absolutely could have better proxies for famine and disease,” says Thomas. “But there is no reason to think they would give numbers like these if they weren’t involved in some way.”

    This study only looked at data from prehistoric Europe, but LP independently evolved in other parts of the world as well. Is it possible that similar selective pressures of famine and disease drove the evolution of LP in Africa and Asia? “Absolutely,” says Thomas. “All of these populations were exposed to pathogens and famine. What we have is a panacea explanation that works just as well in Africa and the Middle East.” At present, there is not nearly the same amount of data on milk fats from pottery or aDNA from Africa or Asia as was available from Europe to allow for a rigorous test of their hypothesis. But Thomas is hopeful that in the future they’ll have enough data to be able to address their questions among non-European populations. In the meantime, anthropology textbooks may want to revisit their discussion of LP and milk use. “These long-held ideas about milk use and LP just don’t hold up,” says Thomas.

     

    References

    1. Evershed RP, Smith GD, Roffet-Salque M, Timpson A, Diekmann Y, Lyon M, Cramp L, Casanova EJ, Smyth J, Whelton HL, Dunne JB, et al. Dairying, diseases and the evolution of lactase persistence in Europe. Nature. 2022 Feb 16.

    Consuming Dairy Products Helps Guard against Vitamin B12 Deficiency in Older Adults

    • Researchers in Quebec, Canada, assessed the diets of elderly people in the province and found a link between how much vitamin B12 people consumed and their risk of becoming deficient in this nutrient.
    • The link was present for dairy—the more vitamin B12 from dairy that people consumed, the lower this risk—but the same could not be said for meat and poultry, which also contain vitamin B12.
    • How well vitamin B12 survives dairy pasteurization compared with meat-cooking processes and the simultaneous presence of calcium in dairy may explain the results.

    Vitamin B12 is crucial for health. If your body is not absorbing enough of it, various health problems eventually emerge, among them anemia, and neurological and psychological difficulties, such as struggling to walk or to see properly, and depression. For this reason, studies that guide people’s dietary choices towards getting enough vitamin B12 can make valuable contributions to public health. And because deficiency in this nutrient is most common among older adults, research that focuses on elderly people is of special relevance.

    Deficiency in vitamin B12 has been labelled a worldwide problem [2]. It tends to be more common among vegans and people who suffer from certain medical conditions, especially those of the digestive system [3]. Older people are thought to struggle more to provide their bodies with enough B12 because of greater difficulty absorbing it from the gut (specifically, a condition called atrophic gastritis-associated food-cobalamin malabsorption is more common among older adults [4]). Although only 6% of people under 60 years of age in the United States and United Kingdom are deficient in this vitamin, the proportion is almost 20% among people over 60 in the same populations [3].

    The need to better understand how older adults can improve their vitamin B12 levels motivated the recent study by He Helen Huang of the University of Sherbrooke in Sherbrooke, Quebec, and her colleagues. The study is unusual not only because it focuses on the age group of greatest concern, but also because it isolated with unusual certainty the contribution of different kinds of foodstuffs to elderly people’s vitamin B12 levels. This was possible because of Canada’s restrictions on vitamin B12 fortification of common foods, rules that do not exist in the United States, for example [1]. In this sense, the study is the first of its kind, and the associations it identified can be treated with greater confidence than those in many earlier studies. Québécois who said they took vitamin B12 supplements were excluded from the study.

    The research drew on data from the Quebec Longitudinal Study on Nutrition and Successful Aging, which, for four years, tracked adults between 67 and 84 years of age living in the Montreal, Laval, and Sherbrooke areas of Quebec. The amounts of vitamin B12 that the study participants consumed were assessed by selecting a day each year, and, in a structured interview, asking them to recall all of the different kinds and amounts of food and drink they had consumed in the past 24 hours. From this, the research team could calculate how much vitamin B12 each individual typically consumed, and how much of the vitamin they received from different food groups. The researchers reported that 17.6% of participants had a total vitamin B12 intake below the estimated required amount, according to the Institute of Medicine in Washington D.C.

    The levels of vitamin B12 that participants had actually absorbed were measured in two ways, given that researchers who study vitamin B12 have yet to agree upon a standard measure. First, the research team collected blood samples to assess the vitamin B12 in serum. Although this is a widely used measure, only about a fifth of the total amount of vitamin B12 in blood serum is available for cellular uptake [1]. Therefore, from measurements in urine samples, they also calculated the ratio of methylmalonic acid (MMA) to creatinine (a waste product from muscle metabolism). MMA builds up (and is excreted as it does so) in people who are deficient in vitamin B12. For robustness, the team used both of these measures to identify individuals with low levels of the nutrient.
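    The two-marker approach described above can be sketched as a small classification rule. The cutoff values below are hypothetical placeholders chosen purely for illustration; they are not the thresholds used by Huang et al. [1].

```python
# Illustrative two-marker rule for flagging low vitamin B12 status.
# Both cutoffs are HYPOTHETICAL placeholders for illustration only;
# they are not the thresholds used in the Quebec study.
SERUM_B12_CUTOFF_PMOL_L = 150.0  # hypothetical lower bound for serum B12
MMA_CREATININE_CUTOFF = 2.0      # hypothetical upper bound for urinary MMA:creatinine

def low_b12_status(serum_b12_pmol_l: float,
                   urine_mma: float,
                   urine_creatinine: float) -> bool:
    """Flag a participant as low-B12 if either marker is abnormal:
    low serum B12, or an elevated MMA:creatinine ratio (MMA builds up,
    and is excreted, when cells are short of vitamin B12)."""
    mma_creatinine_ratio = urine_mma / urine_creatinine
    return (serum_b12_pmol_l < SERUM_B12_CUTOFF_PMOL_L
            or mma_creatinine_ratio > MMA_CREATININE_CUTOFF)

print(low_b12_status(120.0, urine_mma=1.0, urine_creatinine=1.0))  # True: low serum B12
print(low_b12_status(300.0, urine_mma=5.0, urine_creatinine=1.0))  # True: high MMA:creatinine
print(low_b12_status(300.0, urine_mma=1.0, urine_creatinine=1.0))  # False: both markers normal
```

    Flagging on either abnormal marker is one plausible reading of the “robustness” rationale above; the study’s exact decision rule may differ.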

    The results of their study make a clear case that it is not only the total amount of vitamin B12 you consume that matters for avoiding deficiency, but also the sources from which you get it.

    Dividing the possible vitamin B12 dietary sources into the categories “dairy products,” “meat, poultry, and organ meats,” and “fish and shellfish” led to simple public health insights. The more vitamin B12 that people consumed from dairy—in a classic dose-dependent response—the lower their likelihood of having both low levels of the vitamin in their blood serum and an elevated indicator of vitamin deficiency (i.e., the MMA:creatinine ratio of their urine). But the same could not be said for the other food categories. Whether study participants consumed a lot or a little vitamin B12 from “meat, poultry, and organ meats” showed no relationship with their vitamin B12 status in any of the statistical models that the research team ran.

    These findings are largely in tune with other public health studies of vitamin B12, though, as previously mentioned, this research presents a uniquely clean assessment of the relationship because the participants did not eat vitamin B12-fortified foods. One study from Norway, using data from around 18,000 men and women in the county of Hordaland in the west of the country, found the same result for meat consumption [5]. Two further studies from the United States also support this point [6,7]. Only one study, drawing on a cohort in The Netherlands, suggests that eating meat can raise vitamin B12 levels [8], which Huang et al. [1] argue may reflect the much lower levels of meat consumed in that country. All of these studies associated dairy intake with healthier vitamin B12 absorption. And research from Finland has found that vitamin B12 deficiency is 2.3 times as likely among older adults who say they avoid milk products, compared with those who do not [9].

    The reason the body seems especially adept at using the vitamin B12 in dairy products may largely be because of the heat treatment applied as these foods are produced and prepared. Dairy products are typically pasteurized, which involves warming to kill potential germs, but the temperatures involved in that process are far below those of a hot oven, or the flames of a grill, on which meats and fish are often cooked. Hence the vitamin B12 that you eat or drink from dairy products is far more likely to be in the form of complete chemical structures, rather than molecules partially degraded by the cooking process. The water content of dairy products may help absorption, too. And a further but potentially quite important reason the vitamin B12 in dairy is so bioavailable, relative to other sources, is that it is delivered alongside calcium ions, which are themselves necessary for the nutrient’s absorption process [10].

    For such a widespread public health concern, with potentially profound effects, the findings of the Quebec study may give policymakers pause for thought. Fortification of various kinds of foods with vitamin B12 may be generally in the service of public health (even though debate exists around appropriate levels of the nutrient [1]), but the extent of fortification is not a straightforward guide to how much consumers will gain from eating the products, since bioavailability varies so widely across foodstuffs. That said, the core public health message from the Quebec study is straightforward. Building upon the evidence from earlier research, the results clearly show that older adults benefit from ensuring that they have dairy in their diet.

     

    References

    1. Huang H.H., Cohen A.A., Gaudreau P., Auray-Blais C., Allard D., Boutin M., Reid I., Turcot V., Presse N. 2022. Vitamin B-12 Intake from Dairy but Not Meat Is Associated with Decreased Risk of Low Vitamin B-12 Status and Deficiency in Older Adults from Quebec, Canada. J. Nutr. 152(11): 2483–92.
    2. Stabler S.P., Allen R.H. 2004. Vitamin B12 Deficiency as a Worldwide Problem. Annu. Rev. Nutr. 24: 299–326.
    3. Langan R.C., Goodbred A.J. 2017. Vitamin B12 Deficiency: Recognition and Management. Am. Fam. Physician. 96(6): 384–89.
    4. Wong C.W. 2015. Vitamin B12 Deficiency in the Elderly: Is It Worth Screening? Hong Kong Med. J. 21(2): 155–64.
    5. Vogiatzoglou A., Smith A.D., Nurk E., Berstad P., Drevon C.A., Ueland P.M., Gjesdal C.G., Bjelland I., Tverdal A., Tell G.S., Nygård O., Vollset S.E. 2009. Dietary Sources of Vitamin B-12 and their Association with Plasma Vitamin B-12 Concentrations in the General Population: the Hordaland Homocysteine Study. Am. J. Clin. Nutr. 89(4): 1078–87.
    6. Kwan L.L., Bermudez O.I., Tucker K.L. 2002. Low Vitamin B-12 Intake and Status are More Prevalent in Hispanic Older Adults of Caribbean Origin than in Neighborhood-matched Non-Hispanic Whites. J. Nutr. 132(7): 2059–64.
    7. Tucker K.L., Rich S., Rosenberg I., Jacques P., Dallal G., Wilson P.W., Selhub J. 2000. Plasma Vitamin B-12 Concentrations Relate to Intake Source in the Framingham Offspring Study. Am. J. Clin. Nutr. 71(2): 514–22.
    8. Brouwer-Brolsma E.M., Dhonukshe-Rutten R.A., van Wijngaarden J.P., Zwaluw N.L., Velde N., de Groot L.C. 2015. Dietary Sources of Vitamin B-12 and their Association with Vitamin B-12 Status Markers in Healthy Older Adults in the B-PROOF Study. Nutrients. 7(9): 7781–97.
    9. Lindenbaum J., Healton E.B., Savage D.G., Brust J.C., Garrett T.J., Podell E.R., Marcell P.D., Stabler S.P., Allen R.H. 1988. Neuropsychiatric Disorders Caused by Cobalamin Deficiency in the Absence of Anemia or Macrocytosis. N. Engl. J. Med. 318(26): 1720–8.
    10. Kozyraki R., Cases O. 2013. Vitamin B12 Absorption: Mammalian Physiology and Acquired and Inherited Disorders. Biochimie. 95(5): 1002–7.

    Gut Microbiome-targeting IgG Antibodies in Maternal Milk Protect Newborn Mice

    • IgG antibodies against commensal gut bacteria are present in mother’s milk and help newborns combat infections.
    • Boosting levels of IgG antibodies in breastmilk confers additional protection to offspring.
    • A study in mice suggests that the transfer of IgG antibodies in milk is even more important than transfer via the placenta.
    • Future studies could lead to new ways to reduce infant mortality from gastrointestinal diseases such as diarrhea.

    Maternal milk is a potent cocktail of food and medicine: in addition to nutrients such as sugars and fats, it also carries antibodies that protect infants from various infections.

    Some of those antibodies target members of the gut microbiome. These bacteria are harmless and even beneficial, but—if unleashed into the bloodstream—can cause serious infections. Immunologist Melody Zeng of Weill Cornell Medicine in New York was one of the first researchers to discover that healthy animals carry antibodies belonging to the IgG class to protect against such disease [1]. In a new study, Zeng and her colleagues report that these antibodies are also transferred via maternal milk to newborn mice and play an important role in the developing gut microbiome of neonates [2]. “We knew the gut microbiome can induce IgG antibodies,” Zeng said in an interview. “But we didn’t know they could be protective in newborns.”

    Healthy adult animals typically develop these protective IgG antibodies when intestinal infections, such as food poisoning, damage the intestinal lining and allow gut bacterial antigens to slip into the bloodstream. Newborn animals lack these exposures and thus need another source of protective antibodies.

    Zeng and her team began the new study by genetically engineering mice so that they lacked a cell surface protein named FcRn that transports IgG antibodies from mother to offspring.

    The researchers infected 18-day-old mice with a pathogen named Citrobacter rodentium, which causes an infection that mimics human infections caused by pathogenic strains of E. coli. About a week later, they found lower amounts of bacteria in control animals that had the FcRn protein. But the pathogen was still abundant in the engineered mice that didn’t receive maternal IgG antibodies because they lacked the FcRn protein. Two weeks after infection, all the mutant mice died, in contrast to almost complete protection in the IgG-rich control animals, the authors wrote in their paper.

    IgG antibodies are passed down from mother to offspring in two ways: by transfer across the placenta before birth and through ingestion of maternal milk after birth. To confirm that the effects they were seeing were due to maternal milk—and not antibodies received in the womb—the team cross-fostered newborn pups. Mice born to control animals were placed with mothers lacking the FcRn gene and vice versa. This way, engineered pups that had not received IgG antibodies in utero were placed with mothers that provided them the antibodies in milk. On the other hand, pups that received antibodies via the placenta would no longer get them in maternal milk.

    When the researchers infected the newborns with the same pathogen, all the engineered pups nursed by wild-type animals survived the infection, but 70% of wild-type pups that did not receive protective IgG antibodies in milk succumbed to the infection. The data suggest that IgG transfer after birth “accounts for most of the immune protection” against intestinal bacteria in newborns, the authors stated, and that supplying IgG antibodies through maternal milk can protect newborns against enteric infections.

    The team also found that immunizing female mice with a common bacterial protein increased the protective effects of maternal milk. When pups were infected with a high dose of C. rodentium, all the animals nursed by non-immunized mothers died, compared with only about 20 percent of those nursed by immunized mothers.

    Gastrointestinal infections among adult humans can cause mild or severe illness, but such infections are a leading cause of mortality among infants globally and pose a greater threat to infants born preterm [3]. Finding ways to boost immunity so as to reduce the risk of enteric infections among newborns could help address this problem, Zeng said in an interview. If further studies establish that these protective antibodies work in a similar way in humans, researchers could use that “to further protect newborns, especially those at risk of bacterial infections or sepsis,” she added.

     

    References

    1. Caballero-Flores G, Sakamoto K, Zeng MY, Wang Y, Hakim J, Matus-Acuña V, Inohara N, Núñez G. Maternal immunization confers protection to the offspring against an attaching and effacing pathogen through delivery of IgG in breast milk. Cell Host & Microbe. 2019 Feb 13;25(2):313-23.
    2. Sanidad KZ, Amir M, Ananthanarayanan A, Singaraju A, Shiland NB, Hong HS, Kamada N, Inohara N, Núñez G, Zeng MY. Maternal gut microbiome–induced IgG regulates neonatal gut microbiome and immunity. Science Immunology. 2022 Jun 10;7(72):eabh3816.
    3. Liu L, Hill K, Oza S, Hogan D, Chu Y, Cousens S, Mathers C, Stanton C, Lawn J, Black RE. Levels and causes of mortality under age five years. Reproductive, Maternal, Newborn, and Child Health. 2016 May 27;11:71.
