
    Issue Date: July 2019

    High-Fat Dairy Linked to Lower Diabetes Rates in Native Americans

    • The Strong Heart Family Study followed diagnoses of diabetes in 12 native communities in the United States over more than a decade.
    • The study found that participants who consumed the most high-fat milk and cheese were less likely to develop diabetes than those who consumed little of these products.
    • By contrast, there was no association between consumption of low-fat dairy products and diabetes rates.

    Native Americans are about twice as likely as white people in the United States to develop diabetes, and more likely to do so than any other ethnic group in the country. The reasons for this are complex, but post-reservation lifestyles and diets packed with processed sugar and saturated fats are big contributors. Given the extent of the problem, any research that identifies cheap interventions that many people are likely to accept has the potential to yield substantial public health benefits. In a recent issue of the Journal of Nutrition, Kim Kummer of the University of Washington in Seattle and colleagues report that encouraging the consumption of full-fat dairy products, such as full-fat milk and cheese, could be a useful tool in efforts to cut the disease burden [1].

    The high rates of Type 2 diabetes found in native communities are not merely a problem among older people. The Centers for Disease Control and Prevention (CDC) reports that 10–19-year-olds among groups that it refers to as American Indians and Alaska Natives are nine times as likely to be diagnosed as non-Hispanic whites [2]. Publications from a civic organization called the First Nations Development Institute detail the spread of diabetes across adults in these communities [2]. Some of the lowest rates are found among Alaska Natives, where 5.5% of adults have the disease. Meanwhile, some of the highest rates occur among communities in southern Arizona, where 33.5% of adults are known to have the disease.

    This exceptionally high rate is probably why the Strong Heart Family Study—the family-based cohort study whose results Kummer and colleagues report—selected communities in Arizona for the research. In all, twelve communities were involved, spread across Arizona, North and South Dakota, and Oklahoma. The study began in 2001, with baseline examinations of the 91 participating families conducted over the subsequent two years, giving a total participant pool of 1,112 men and 1,658 women. Both the baseline exams and the 2007–2009 follow-up exams included laboratory tests of fasting (before-breakfast) blood glucose concentration, a physical examination, and an in-person consultation built around a long questionnaire on participants’ dietary habits. The research team conducted further analysis of medical records and telephone interviews between 2013 and 2017.

    Kummer and colleagues wanted to know whether drinking milk and eating cheese correlated with the probability of developing diabetes, controlling for other factors that are associated with the risk. The 119-item dietary questionnaire asked how frequently participants consumed these foods during the previous year, how much they tended to consume, and whether they usually selected full-fat or low-fat milk and cheese. Perhaps surprisingly to anyone who has not followed research in this field, the full-fat options were linked to lower diabetes risk, while the low-fat options were not.

    Specifically, of the 1,623 participants who were diabetes-free at the start of the study and who remained in it throughout, 277 were diagnosed with diabetes by their final interview, on average 11 years later. Comparing the top third of dairy consumers (by amount consumed) with the bottom third yielded some striking results. For full-fat dairy, after the researchers statistically removed the effects of other known risk factors, the rate of diabetes diagnosis during the study period was 88% higher among the third who consumed the least than among the third who consumed the most. There was no significant association between low-fat dairy consumption and diabetes.
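    The logic of such a tertile comparison can be illustrated with a toy calculation. The sketch below uses invented numbers and a crude incidence proportion; the actual study used survival models adjusted for other risk factors, so this is only the shape of the analysis, not the study's method:

```python
# Hypothetical sketch of a tertile comparison. Cohort data are invented;
# the real Strong Heart analysis used adjusted survival models.
from statistics import quantiles

# (weekly full-fat dairy servings, developed_diabetes) for a toy cohort
cohort = [
    (0.5, True), (1.0, True), (1.5, False), (2.0, True), (2.5, False),
    (3.0, False), (4.0, True), (5.0, False), (6.0, False), (7.0, True),
    (8.0, False), (9.0, False),
]

# Two cut points split consumption into three groups (tertiles)
c1, c2 = quantiles([servings for servings, _ in cohort], n=3)

def incidence(group):
    """Fraction of a group diagnosed with diabetes during follow-up."""
    return sum(diagnosed for _, diagnosed in group) / len(group)

bottom = [row for row in cohort if row[0] <= c1]  # lowest consumers
top = [row for row in cohort if row[0] > c2]      # highest consumers

rate_low, rate_high = incidence(bottom), incidence(top)
print(f"bottom tertile incidence: {rate_low:.2f}")
print(f"top tertile incidence:    {rate_high:.2f}")
# The study's "88% higher" corresponds to a low-vs-high ratio of ~1.88
print(f"relative rate (low vs high): {rate_low / rate_high:.2f}")
```

    With these made-up data the lowest tertile fares worse than the highest, mirroring the direction of the study's finding; the exact ratio depends entirely on the invented numbers.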

    This is far from the first piece of research to report a link between full-fat dairy—as opposed to low-fat dairy—and diabetes prevention. For example, in 2016, a study of almost 10,000 middle-aged and elderly Brazilians found similar results for the development of metabolic syndrome, a major risk factor for Type 2 diabetes [3]. Authors Michele Drehmer of the Federal University of Rio Grande do Sul, in the southern city of Porto Alegre, and colleagues concluded that, “Dietary recommendations to avoid full-fat dairy intake are not supported by our findings.” However, other scholars have found no relationship [4], and others still have found that low-fat dairy is preferable for those concerned about their diabetes risk [5].

    Making sense of these conflicting results is tricky. In addition to adding to the weight of evidence, as this study of native communities has done, probing the biochemical mechanisms involved is a way to push forward understanding of the links between dairy and diabetes. One line of investigation concerns certain fatty acids, such as trans-palmitoleic acid, butyric acid, and phytanic acid, which are thought to lower triglyceride levels in the liver and, in doing so, improve sensitivity to insulin [1]. Elucidating exactly how these fatty acids operate, and how their absorption and metabolism may vary across populations, may eventually help to curb diabetes rates among at-risk groups, such as native communities.

    References

    1. Kummer K., Jensen P.N., Kratz M., Lemaitre R.N., Howard B.V., Cole S.A. & Fretts A.M. 2019. Full-Fat Dairy Food Intake is Associated with a Lower Risk of Incident Diabetes Among American Indians with Low Total Dairy Food Intake. J. Nutr. 0:1–7.

    2. Type 2 Diabetes in Native Communities. Fact Sheet. First Nations Development Institute. Available here: https://www.firstnations.org/wp-content/uploads/publication-attachments/5%20Fact%20Sheet%20Type%202%20Diabetes%20In%20Native%20Communities%20FNDI.pdf

    3. Drehmer M., Pereira M.A., Schmidt M.I., Alvim S., Lotufo P.A., Luft V.C. & Duncan B.B. 2016. Total and Full-fat, but Not Low-fat, Dairy Product Intakes are Inversely Associated with Metabolic Syndrome in Adults. J. Nutr. 146(1):81–9.

    4. Chen M., Sun Q., Giovannucci E., Mozaffarian D., Manson J.E., Willett W.C. & Hu F.B. 2014. Dairy Consumption and Risk of Type 2 Diabetes: 3 Cohorts of US Adults and an Updated Meta-analysis. BMC Med. 12:215.

    5. Aune D., Norat T., Romundstad P. & Vatten L.J. 2013. Dairy Products and the Risk of Type 2 Diabetes: A Systematic Review and Dose-response Meta-analysis of Cohort Studies. Am. J. Clin. Nutr. 98(4):1066–83.

    Dairy Cattle Resistant to Tuberculosis

    • Gene “editing” was used to insert an extra copy of a natural cattle gene into the genome of dairy cattle.
    • The gene “edited” cattle showed enhanced resistance to bovine tuberculosis.
    • The routine presence of the tuberculosis-resistant cattle in production herds is promising but still a long way off.

    Infectious diseases are never truly conquered, though we sometimes perceive them to be. Infectious microbial agents patiently await the right opportunity, arising at the intersection of multiple circumstances, and their unpredictability often amplifies their adverse impacts.

    Scientists who study the incidence and spread of infectious diseases in animal populations (epidemiologists) emphasize that strategies aimed at preventing the establishment of disease-causing microbes in individuals are much more effective at controlling disease in a population than interventions designed to contain the spread of a disease once it is established [1]. The latter strategy is like shutting the barn door after the horse has bolted. Other groups of scientists suggest that enhancing the biological resistance of animals to disease-causing bacteria and viruses could greatly assist in the prevention of disease outbreaks [2-4].

    The natural resistance of an animal to infectious disease is determined by advantageous natural genetic variations, previous disease history, and adequate nutrition [5]. Investigators now report a research breakthrough that demonstrates how biological resistance of cattle to bovine tuberculosis can be further enhanced by using genetic “editing” technology [3].

    It is still very early days, but the new gene “editing” approach to increasing livestock resistance to disease has enormous potential for decreasing the incidence of infectious diseases in livestock and thereby enhancing animal welfare, increasing industry productivity, and decreasing reliance on antibiotics [2-4, 6]. The routine presence of these disease-resistant animals in our food production system is a long way in the future. There remain substantial scientific and regulatory hurdles, and challenges with public acceptance [7, 8]. Still, the first glimpses of what may be achieved in the future are starting to appear [3, 4]. One early example is the production of dairy cattle that are resistant to bovine tuberculosis [3].

    Infectious Diseases Always Threaten

    History repeatedly and starkly reminds us that infectious diseases are major threats to the human food supply. This is a lesson hard won, but worryingly it fades with time when there are no visible and imminent threats. The absence of notifiable disease often signals to government agencies, the public, and commercial interests an opportunity to curtail expensive disease surveillance and preventative measures. Inevitably, this response increases the risk of infectious disease recurrence. The typical human response to the containment of food supply epidemics is a scorched-earth policy: detection, isolation, and destruction. It’s brutally pragmatic but effective. Another approach, championed by scientists, is the enhancement of disease resistance in livestock to decrease the risk of infectious disease outbreaks.

    Bovine Tuberculosis

    Epidemiologists from the USDA and the OIE (World Organisation for Animal Health) report that bovine tuberculosis in cattle herds is well controlled in most first world countries, although there are sporadic outbreaks [9, 10]. This chronic respiratory disease is caused by the bacterium Mycobacterium bovis and can eventually lead to the death of the animal [10]. There is no vaccine in use in the livestock industries [11, 12]. M. bovis spreads quickly to uninfected animals, even before the first clinical signs of disease appear, and therefore detection of the disease in an individual usually means culling the entire herd, infected and healthy alike [10]. Alternatively, expensive antibiotics can be used, but this approach is now complicated by the resistance of M. bovis to some antibiotics [13]. It is also difficult to eliminate bovine tuberculosis from cattle populations because the bacterium is present in many species of wild animals. Hence, existing and effective containment strategies rely on stringent surveillance, rapid detection, and culling of infected animals [10].

    Scientists at the Centers for Disease Control and Prevention indicate that M. bovis can also be transmitted to humans, in whom it causes chronic tuberculosis-like symptoms [13]. However, most cases of human tuberculosis are caused by the related bacterium Mycobacterium tuberculosis. Startlingly, WHO epidemiologists concluded that 23% of the world’s human population is infected with M. tuberculosis, although only a “minority” of people develop the active disease (a mere 10 million new human cases and 1.6 million deaths in 2017 [14]). They report that M. tuberculosis infection is still the biggest infectious disease challenge in the world today [14]. It is particularly prevalent in second and third world countries. By contrast, the incidence of M. bovis infection in humans is relatively low: the WHO epidemiologists estimated 142,000 cases worldwide and 12,500 deaths in 2017, with by far most occurring in Africa and Asia [14]. As a side note, the reason for pasteurization of cow’s milk, beginning close to 100 years ago, was to kill disease-causing microbes, especially M. bovis, to ensure the safety of milk for human consumption [15]. The pasteurization of milk is a major human health success story.

    Dairy Cattle Resistant to Tuberculosis

    Recently, a group of investigators based at Northwest A&F University in Shaanxi, China, used gene “editing” technology to produce cows that are more resistant to bovine tuberculosis [3]. The group was led by Dr. Yong Zhang, and the research results were published in Genome Biology, with Yuanpeng Gao and Haibo Wu as first authors. Gene “editing” technology is the ability to change an organism’s genetic material by inserting, deleting, or altering DNA at a precise, predetermined location in the genome (the genome is the organism’s full complement of DNA) [6]. The specific molecular technology the investigators used was CRISPR/Cas9n—it’s a mouthful, but essentially it is like using a pair of sharp scissors to add a very small patch to clothing, and then sew it up like new. The trick is to know what type of patch is needed, precisely where the patch goes, and how to ensure that the normal function of the clothing is unchanged. The intention is a more comfortable fit for the environment where the clothing will be used.

    The investigators used the gene “editing” technology to precisely insert an extra copy of the naturally occurring bovine gene called NRAMP1 into the genome of female fetal skin cells from a dairy cow. The additional NRAMP1 gene was obtained from a Holstein-Friesian cow. The gene codes for a protein that is associated with resistance to various microbial infections in multiple mammalian species, including M. bovis infection in cattle [16-18]. The protein is thought to be part of the first line of defense of the body against microbial invasions. The investigators’ rationale was that the additional copy of NRAMP1 would further enhance the cow’s natural resistance to bovine tuberculosis.

    Importantly, the investigators spent a lot of effort in finding the best spot in the genome for the NRAMP1 gene insertion. This effort was required to ensure that the gene insertion did not interfere with other genes near the insertion site, and it greatly decreased the chance of additional unwanted insertions of the gene elsewhere in the genome. You don’t want to put a patch over a button or randomly add multiple patches on clothes.

    The nuclei of the bovine skin cells, now carrying a second copy of NRAMP1 in their DNA, were then transferred into enucleated egg cells (ova) from a cow. After a lot of careful laboratory coaxing, cloned embryos were grown and transferred into surrogate cows for the duration of the pregnancies. A total of 11 calves were born and survived beyond three months of age. Each calf carried one extra copy of NRAMP1 at precisely the intended genomic location. There were no unplanned insertions of the gene elsewhere in the genome. Moreover, the additional NRAMP1 gene was active only in specific white blood cells, just like the existing NRAMP1 gene, and it did not interfere with the activities of other genes near the insertion site. The precision of the gene “editing” was a major feature of the research. Careful planning paid off!

    Gao and colleagues noted that the 11 calves were seemingly normal healthy animals [3]. The investigators then performed a series of tests on blood cells taken from these animals as well as untreated calves with only one copy of NRAMP1. The investigators demonstrated that the blood cells taken from the calves with the additional NRAMP1 gene were better able to suppress M. bovis infection compared with blood cells taken from calves with only one copy of NRAMP1. Moreover, when the investigators directly infected calves with M. bovis, the calves with two copies of the gene were better at resisting the infection compared with calves without the extra copy of NRAMP1. The implication was that the calves that received the extra copy of NRAMP1 didn’t get sick, and they eliminated or suppressed the M. bovis infection. An independent scientist commented that this conclusion still needed more formal clinical proof [19], while another scientist suggested that the approach was currently not sufficiently practical for use in the livestock industry [20]. It is clear, however, that the research of Gao and colleagues is a first step on a long pathway toward the production of livestock animals with enhanced resistance to disease.

    Implications

    The research of Gao and colleagues is a major scientific achievement with wide implications for improving disease control, animal welfare, and industry productivity [3]. Their research is still only a very early and tentative step toward the long-term practical goal of producing disease-resistant cattle for use in the dairy industry; there is still much science that remains to be undertaken. Slow and steady is the current course, but now there are good navigational tools to ultimately travel to the distant destination.

    “It’s always an uncertainty. We’re always at the infectious disease roulette table” – William Schaffner.

    References

    1. Heymann DL. Prevention is better than cure for emerging infectious diseases. Brit Med J. 2014;348:g1499.

    2. Tait-Burkard C, Doeschl-Wilson A, McGrew MJ, Archibald AL, Sang HM, Houston RD, et al. Livestock 2.0 – genome editing for fitter, healthier, and more productive farmed animals. Genome Biol. 2018;19(1):204-214.

    3. Gao Y, Wu H, Wang Y, Liu X, Chen L, Li Q, et al. Single Cas9 nickase induced generation of NRAMP1 knockin cattle with reduced off-target effects. Genome Biol. 2017;18(1):13-27.

    4. Burkard C, Lillico SG, Reid E, Jackson B, Mileham AJ, Ait-Ali T, et al. Precision engineering for PRRSV resistance in pigs: Macrophages from genome edited pigs lacking CD163 SRCR5 domain are fully resistant to both PRRSV genotypes while maintaining biological function. PLoS Pathog. 2017;13(2):e1006206.

    5. Bishop SC, Woolliams JA. Genomics and disease resistance studies in livestock. Livest Sci. 2014;166:190-198.

    6. Lamas-Toranzo I, Guerrero-Sánchez J, Miralles-Bover H, Alegre-Cid G, Pericuesta E, Bermejo-Álvarez P. CRISPR is knocking on barn door. Reprod Domest Anim. 2017;52 Suppl 4:39-47.

    7. Carroll D, Van Eenennaam AL, Taylor JF, Seger J, Voytas DF. Regulate genome-edited products, not genome editing itself. Nat Biotech. 2016;34:477-479.

    8. Van Eenennaam AL, Wells KD, Murray JD. Proposed U.S. regulation of gene-edited food animals is not fit for purpose. NPJ Sci Food. 2019;3(1):3-9.

    9. OIE. Bovine tuberculosis 2019 [Available from: www.oie.int/en/animal-health-in-the-world/animal-diseases/bovine-tuberculosis/].

    10. United States Department of Agriculture. National tuberculosis eradication program: United States Department of Agriculture – Health and Inspection Service; 2018 [Available from: https://www.aphis.usda.gov/aphis/ourfocus/animalhealth/animal-disease-information/cattle-disease-information/national-tuberculosis-eradication-program].

    11. Conlan AJ, Brooks-Pollock E, McKinley TJ, Mitchell AP, Jones GJ, Vordermeier M, et al. Potential benefits of cattle vaccination as a supplementary control for bovine tuberculosis. PLoS Comput Biol. 2015;11(2):e1004038.

    12. Perez-Casal J, Prysliak T, Maina T, Suleman M, Jimbo S. Status of the development of a vaccine against Mycoplasma bovis. Vaccine. 2017;35(22):2902-2907.

    13. CDC. Mycobacterium bovis (bovine tuberculosis) in humans: Centers for Disease Control and Prevention. 2011 [Available from: https://www.cdc.gov/tb/publications/factsheets/general/mbovis.htm].

    14. WHO. Global tuberculosis report Geneva: World Health Organization; 2018 [Available from: https://www.who.int/tb/publications/global_report/en/].

    15. Boor KJ, Wiedmann M, Murphy S, Alcaine S. A 100-Year Review: Microbiology and safety of milk handling. J Dairy Sci. 2017;100(12):9933-9951.

    16. Vidal S, Tremblay ML, Govoni G, Gauthier S, Sebastiani G, Malo D, et al. The Ity/Lsh/Bcg locus: natural resistance to infection with intracellular parasites is abrogated by disruption of the Nramp1 gene. J Exp Med. 1995;182(3):655-666.

    17. Liu Y, Zhao E, Zhu L, Zhang D, Wang Z. 3’UTR polymorphisms in NRAMP1 are associated with the susceptibility to pulmonary tuberculosis: A MOOSE-compliant meta-analysis. Medicine (Baltimore). 2019;98(23):e15955.

    18. Wu L, Deng H, Zheng Y, Mansjö M, Zheng X, Hu Y, et al. An association study of NRAMP1, VDR, MBL and their interaction with the susceptibility to tuberculosis in a Chinese population. Int J Infect Dis. 2015;38:129-135.

    19. Briggs H. “Tuberculosis-resistant” cattle developed in China. 2017 [Available from: https://www.bbc.com/news/science-environment-38810073].

    20. Fernandez C. World’s first tuberculosis-resistant cows are created in China using ‘cut and paste’ gene editing technique Australia2017 [Available from: https://www.dailymail.co.uk/sciencetech/article-4179452/Tuberculosis-resistant-cows-developed-time.html].

    Fossil Teeth Tell Story of Neanderthal Life and Lactation

    • As teeth grow they permanently incorporate molecules from food and liquids an individual consumes.
    • In 2013, a team of researchers demonstrated that changes in the concentration of barium in enamel corresponded with changes in breast milk intake, including the timing of weaning.
    • The barium method was applied in a more recent study to two molars from Neanderthal children dated to 250,000 years ago, and suggested an age at weaning of approximately 2.5 years, similar to that of modern humans.
    • The new study also analyzed the ratio of two different oxygen isotopes in the molar enamel, allowing for the first ever description of the seasons of birth and weaning in a human ancestor.
    • These types of dental analyses allow paleoanthropologists to recreate individual life stories for Neanderthals, and potentially other human ancestors, enriching our understanding of human evolutionary history.

    New parents use baby books to record the dates of all of their child’s firsts—when they first eat solid foods, take their first steps, cut their first tooth, and say their first words. These books tell part of the child’s life story, allowing parents to reminisce years later about when all of these exciting milestones happened in the life of their child. Researchers who study the evolutionary history of humans are similarly interested in knowing the dates of these developmental milestones in order to recreate life stories for fossil skeletons (albeit for the less sentimental reason of comparing them with living humans). Amazingly, teeth—even those from individuals that died hundreds of thousands of years ago—act much like doting parents, capturing every day of childhood as they grow.

    This Will Go Down on Your Permanent Record

    Teeth are the only part of the skeleton that interacts directly with the environment; the dentine (the tissue making up the inner core of a tooth crown) and enamel (the tissue making up the outer layer of the tooth crown) that are secreted daily during tooth growth incorporate molecules from food and water. And because dentine and enamel never remodel, these environmental interactions are permanent. (This permanence is great for paleontologists who study them thousands of years later but not so great for someone with damaged enamel.)

    In 2013, Dr. Tanya M. Smith, Professor at the Australian Research Centre for Human Evolution at Griffith University, and her colleagues demonstrated that in both living humans and rhesus macaque monkeys, the levels of barium (relative to calcium) in teeth reflected known dates of dietary changes across infancy, nearly to the day [1].

    “Barium is a trace element that passively follows calcium,” explains Smith. “It gets locked into bones and teeth, save for prenatally when it is blocked by the placenta.” As a result, the first appearance of barium in the teeth corresponds with the introduction of mother’s milk. Smith and her team found that barium levels remained elevated in teeth until the introduction of solid foods, at which point they slowly decreased, but remained moderate, as mother’s milk made up less and less of the infant diet. The signature for cessation of nursing corresponded to the point where barium levels first reached their minimum, or prenatal, level [1].
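    The reasoning just described—barium rises with nursing, falls with solid foods, and first returns to the prenatal baseline at weaning—can be sketched as a simple scan over a trace. The values, sampling interval, and baseline rule below are invented for illustration and are not the team's actual analysis pipeline:

```python
# Hypothetical sketch: locate the weaning point in a barium/calcium trace as
# the first sample after the nursing peak that returns to the prenatal
# baseline. All values are invented for illustration.

# Ba/Ca ratio sampled along tooth growth, earliest (prenatal) samples first
ba_ca = [0.1, 0.1, 0.9, 1.0, 0.95, 0.7, 0.5, 0.35, 0.2, 0.1, 0.1]
prenatal_baseline = max(ba_ca[:2])  # samples formed before birth

def weaning_index(trace, baseline):
    """Index of the first post-peak sample back at (or below) the baseline."""
    peak = trace.index(max(trace))  # height of exclusive nursing
    for i in range(peak, len(trace)):
        if trace[i] <= baseline:
            return i
    return None  # individual was still nursing when the tooth finished growing

idx = weaning_index(ba_ca, prenatal_baseline)
print(f"barium returns to prenatal baseline at sample index {idx}")
```

    Mapping that index back to an age then relies on the daily growth increments in the enamel, which is what lets the method date dietary transitions nearly to the day.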

    Dental Time Machine

    Having established the validity of this barium timeline in living individuals, the team applied the same methods to a fossil tooth from an 8-year-old Neanderthal child from Belgium who died approximately 125,000 years ago [1]. Did Neanderthals have a life history pattern like that of modern humans, weaning relatively early (sometime between 2 and 3 years of age), or were they more similar to the other living apes, with a later age of weaning (chimpanzees wean between 5 and 6 years of age and orangutans between 8 and 9 years of age) [2]? Unfortunately, this tooth was unable to answer that question. The barium levels in the Neanderthal tooth dropped abruptly, rather than gradually as would be indicative of the weaning process, when the child was approximately 14 months old. This sudden decline was interpreted as an interruption of normal nursing patterns rather than the natural age of weaning for Neanderthals [1].

    That the child survived this early nutritional interruption suggests that Neanderthals, like modern humans, could survive weaning at 14 months of age. But the differences in the barium signature between the Neanderthal tooth and those of living children known to wean more gradually strongly suggested that this tooth represented an exception among Neanderthals rather than the rule.

    Without a definitive answer, the team looked for more teeth from our evolutionary cousins to analyze. But unfortunately, this type of research is not as simple as walking into a museum, looking at a tooth, and returning it intact. “Curators are charged with protecting the fossils in their care and are typically loath to allow someone to cut a tooth in half to remove a tiny slice,” says Smith. When given access to two molars from Neanderthal children from the French site of Payre, Smith and her team made the analyses worth the damage they inflicted on the 250,000-year-old teeth [3]. In addition to looking at barium to answer questions about the duration of lactation among Neanderthals, they also took on the time-consuming task of quantifying the ratio of two different oxygen isotopes, which would tell them about paleoclimate [3].

    Alexa, What’s The Weather in The Middle Pleistocene?

    Using a tool called a sensitive high-resolution ion microprobe, or SHRIMP—aptly named considering it was housed at the Australian National University—Smith and her team measured the amount of oxygen-18 (the heavy isotope) relative to oxygen-16 (the light isotope) in each of the molars [3]. During warm periods, surface water is richer in the heavy isotope; during cool periods, the light isotope is more plentiful. As individuals drink water (or breast milk) from their environment, their growing teeth incorporate different proportions of each isotope, providing a Pleistocene weather report of remarkable resolution.
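    Isotope ratios like these are conventionally reported in delta notation, as a per-mil deviation of the sample's 18O/16O ratio from a reference standard. The arithmetic is simple; the sketch below uses the widely cited 18O/16O ratio of the VSMOW water standard, with invented sample ratios (the study's actual measurements and reference frame may differ):

```python
# Sketch of the delta-O-18 calculation used to read seasonality from enamel.
# R is the 18O/16O ratio; VSMOW is a standard reference water. The two
# sample ratios are invented for illustration.
R_VSMOW = 0.0020052  # 18O/16O of Vienna Standard Mean Ocean Water

def delta18o(r_sample):
    """delta-18O in per mil (parts per thousand) relative to VSMOW."""
    return (r_sample / R_VSMOW - 1) * 1000

# Two hypothetical enamel increments: heavier oxygen in the warm season
warm = delta18o(0.0020072)
cool = delta18o(0.0020032)
print(f"warm-season increment: {warm:+.2f} per mil")
print(f"cool-season increment: {cool:+.2f} per mil")
assert warm > cool  # higher delta-18O -> warmer-period water
```

    Reading the delta-18O values increment by increment along the tooth produces the oscillating seasonal curve from which birth and weaning seasons can be pinned down.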

    “We are sampling a few days of tooth formation every week or so,” explains Smith. “My error estimates suggest that we are able to time this accurately to within a few weeks of the actual ages in primates of known histories.” From a 250,000-year-old child’s tooth, Smith and her team are able to say what the climate was like when the child was born and when they were weaned, within a few weeks. For some perspective, oxygen isotope data from deep-sea cores capture climate patterns over thousands of years, and only allow paleoanthropologists to speak to whether a fossil specimen lived during a cool or warm period. In contrast, the oxygen isotope layers from teeth allow Smith and colleagues to pinpoint events in the individual’s lifetime to particular seasons.

    One of the children (Payre 6) was born in the spring and weaned approximately 2.5 years later in the fall. (Unfortunately, the other child’s tooth could not be analyzed for barium due to contamination from surrounding soils.) Is 2.5 years a more likely age of weaning for Neanderthal infants? It is impossible to generalize from one individual to an entire species (just imagine selecting one living human to represent the age of weaning for infants today), but the similarity to the natural age of weaning among nonindustrial modern humans is intriguing. Humans are the odd man out across the apes when it comes to age of weaning; knowing that Neanderthals match us, more so than other living apes, tells us that they might have a life history pattern more like us as well (e.g., same age at first reproduction, same interbirth interval).

    Who Tells Your Story

    Smith ended a recent write-up [4] of this research project with the subtitle “More Teeth Needed,” and that is a pretty succinct way of describing the next steps in this field of study. Hopefully, more and more collections managers will see the value in sacrificing teeth in order to reveal more life stories of human ancestors. So much of paleoanthropology is inference from a small handful of individuals (many of them incomplete skeletons) to an entire species. What a quantum leap to be able to say—with precision!—what an individual actually experienced during their lifetime, and then to aggregate those experiences to describe a population.

    References

    1. Austin C, Smith TM, Bradman A, Hinde K, Joannes-Boyau R, Bishop D, Hare DJ, Doble P, Eskenazi B, Arora M. 2013. Barium distributions in teeth reveal early-life dietary transitions in primates. Nature, 498(7453): 216.

    2. Smith TM, Austin C, Hinde K, Vogel ER, Arora M. 2017. Cyclical nursing patterns in wild orangutans. Science Advances, 3(5), p.e1601517.

    3. Smith TM, Austin C, Green DR, Joannes-Boyau R, Bailey S, Dumitriu D, Fallon S, Grün R, James HF, Moncel MH, Williams IS. 2018. Wintertime stress, nursing, and lead exposure in Neanderthal children. Science Advances, 4(10), p.eaau9483.

    4. Smith TM. 2018. What teeth can tell about the lives and environments of ancient humans and Neanderthals. The Conversation. https://theconversation.com/what-teeth-can-tell-about-the-lives-and-environments-of-ancient-humans-and-neanderthals-104923

    The Bitterness of the Maternal Diet Influences the Bitterness of Human Milk

    • The flavors infants taste early in life, such as from their mothers’ milk, are known to drive their later food preferences.
    • A new study finds that the consumption of bitter foods, such as vegetables, by mothers may influence the bitterness of their milk.
    • The study suggests that breastfeeding could help children accept bitter flavors and develop healthier food preferences later in life.

    Human milk is known to provide a variety of nutrients that aid infants’ growth and development and are beneficial to their health (1). But as children grow a little older, they often don’t meet recommended dietary guidelines, particularly when it comes to eating enough fruits and vegetables (2).

    Taste is often an important factor in determining food preferences and choices, especially in children (3,4). Studies have shown that these taste preferences can be influenced by the flavors that infants experience early in their life, such as from their mothers’ milk. The flavors in human milk can in turn be influenced by maternal diet (5).

    A new study conducted by Dimitra Mastorakou and colleagues at Danone Nutricia Research investigated the sensory properties of human milk and how they are affected by maternal diet (6). They found that the consumption of bitter foods, such as vegetables, may influence the bitterness of human milk. “The most important finding of our study was the association between the bitterness of the mother’s diet and her milk,” the researchers said in an email.

    The study suggests that breastfed infants may be exposed to bitter vegetable flavors at an early age, and may thus accept these flavors more easily, leading to healthier food choices later in life. “We hypothesized that being exposed to low levels of bitterness via breastmilk could possibly contribute to bitter taste acceptance of infants, similar to the acceptance of carrots by infants after breastfeeding mothers had been eating carrots,” the researchers said (7).

    Previous studies have shown that children who were breastfed are less picky and more willing to try new foods as they grow compared with children who were fed formula (8). In addition, the overall duration of breastfeeding has been shown to predict vegetable intake (9). “However, our study was exploratory and the association that we found, as well as the possible effect of varying bitterness levels of human milk on later vegetable acceptance, need further evaluation,” the researchers said.

    The researchers initially set out to describe the sensory properties of human milk, about which relatively little is known (10,11). In particular, they examined differences in the sensory and analytical properties of fore milk—approximately the first 30 milliliters of milk expressed—and hind milk—the remaining milk expressed. “Since we know that certain sensory characteristics of breast milk can be influenced by maternal diet, we wanted to look at the relationship between the bitterness of her diet and the bitterness of human fore and hind milk, as an initial indication of the transference of the bitter taste to breast milk,” the researchers said.

    Twenty-two lactating mothers were trained on reference samples to become familiar with the five basic tastes—sweet, salty, bitter, sour, and umami—and creaminess. The mothers then kept a 24-hour food diary to assess the bitterness of their diet, followed by a sensory self-assessment of their fore and hind milk to rate their perceived bitterness. The researchers also conducted analytical measurements of fore and hind milk to quantify fat, carbohydrate, total protein, and free amino acid content.

    The mothers described sweetness as being the main basic taste of human milk, and it didn’t differ significantly between fore and hind milk. Human milk’s sweetness was correlated with its carbohydrate content, whereas its umami taste was correlated with its glutamic acid content.

    The researchers found a significant positive correlation between the bitterness of the maternal diet consumed in the 24 hours before lactation and the perceived bitterness of fore milk, but not of hind milk. The bitterest foods in the maternal diet were fruits and vegetables, caffeinated beverages, and cheeses, with vegetables being particularly bitter.
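    For readers curious about what such an association means in practice, the sketch below computes a Pearson correlation coefficient between a diet-bitterness score and a fore-milk bitterness rating. The data are entirely hypothetical and are not taken from the study; this is only an illustration of the kind of statistic behind a reported positive correlation.

    ```python
    # Illustrative only: hypothetical bitterness scores, NOT the study's data.
    from math import sqrt

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical 0-100 bitterness scores for six mothers:
    diet_bitterness = [12, 35, 20, 50, 8, 42]       # from 24-hour food diary
    fore_milk_bitterness = [10, 30, 25, 45, 12, 38]  # sensory self-assessment

    r = pearson_r(diet_bitterness, fore_milk_bitterness)
    print(f"r = {r:.2f}")  # a value near +1 indicates a strong positive association
    ```

    A coefficient close to +1, as in this made-up example, would mean that mothers whose diets scored as more bitter also rated their fore milk as more bitter; the actual study reports the association's significance rather than a single illustrative coefficient.
    
    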

    It’s still unclear why the bitterness of the maternal diet was correlated with the bitterness of fore milk but not that of hind milk, and future studies will be needed to further explore this question. The researchers suggest that the answer may lie in the differences between fore and hind milk, with fore milk being significantly less creamy, less fatty, and more watery than hind milk. Previous studies have shown that fatty acids present in human milk can mask the bitterness of certain bitter solutions (12).

    The study suggests that consumption of bitter foods in the maternal diet may influence the bitterness of human fore milk, and could thus serve as an additional factor in children’s flavor learning. “Our study was exploratory but it has added to our hypothesis that the bitterness of human milk could potentially be an influencing factor on children’s later vegetable acceptance,” the researchers said.

    “The taste of human milk is difficult to study, and we hope that this paper will inspire additional research that will help to further develop the knowledge of the importance of the maternal diet during breastfeeding, taking into consideration the international nutritional recommendations for lactating mothers,” they said.

    References

    1. Ballard O., Morrow A.L. Human milk composition: nutrients and bioactive factors. Pediatr Clin North Am. 2013 Feb;60(1):49-74.

    2. Yngve A., Wolf A., Poortvliet E., Elmadfa I., Brug J., Ehrenblad B., Franchini B., Haraldsdóttir J., Krølner R., Maes L., Pérez-Rodrigo C., Sjostrom M., Thórsdóttir I., Klepp K.I. Fruit and vegetable intake in a sample of 11-year-old children in 9 European countries: The Pro Children Cross-sectional Survey. Ann Nutr Metab. 2005 Jul-Aug;49(4):236-45.

    3. Drewnowski A. Sensory preferences for fat and sugar in adolescence and adult life. Ann N Y Acad Sci. 1989;561:243-50.

    4. Drewnowski A. Sensory control of energy density at different life stages. Proc Nutr Soc. 2000 May;59(2):239-44.

    5. Mennella J.A. The chemical senses and the development of flavor preferences in humans. Pages 403-14 in Textbook on Human Lactation. Hale Publishing, Plano, TX; 2007.

    6. Mastorakou D., Ruark A., Weenen H., Stahl B., Stieger M. Sensory characteristics of human milk: Association between mothers’ diet and milk for bitter taste. J Dairy Sci. 2019 Feb;102(2):1116-30.

    7. Mennella J.A., Daniels L.M., Reiter A.R. Learning to like vegetables during breastfeeding: a randomized clinical trial of lactating mothers and infants. Am J Clin Nutr. 2017 Jul;106(1):67-76.

    8. Galloway A.T., Lee Y., Birch L.L. Predictors and consequences of food neophobia and pickiness in young girls. J Am Diet Assoc. 2003 Jun;103(6):692-8.

    9. de Wild V.W., Jager G., Olsen A., Costarelli V., Boer E., Zeinstra G.G. Breast-feeding duration and child eating characteristics in relation to later vegetable intake in 2-6-year-old children in ten studies throughout Europe. Public Health Nutr. 2018 Aug;21(12):2320-28.

    10. Spitzer J., Doucet S., Buettner A. The influence of storage conditions on flavour changes in human milk. Food Qual Prefer. 2010 Dec;21(8):998-1007.

    11. McDaniel M.R., Barker E., Lederer C.L. Sensory characterization of human milk. J Dairy Sci. 1989 May;72(5):1149-58.

    12. Ley J.P. Masking bitter taste by molecules. Chemosens Percept. 2008 Mar;1(1):58-77.
