

    Issue Date: December 2015

    Ancient Aurochs Genome Contains the DNA Blueprint for Modern Cattle

    • A bone found in an English cave contained DNA from an ancient wild ox known as the aurochs.
    • The DNA was sequenced to reconstruct over 85% of the aurochs genome.
    • Ninety percent of the genetic variants identified in aurochs DNA are found in modern cattle.
    • Cattle from Britain and Ireland have retained a relatively high level of aurochs DNA sequence.
    • Following the distribution of genetic variants from aurochs to modern cattle provides a trail of domestication and specialized breed trait selection.

    A preserved specimen of aurochs bone was discovered deep beneath the Derbyshire Dales in the UK in the 1990s (1). The aurochs is an ancient wild ox, first domesticated around 10,000 years ago somewhere in the region of modern-day Iran. In Europe, the last of these animals survived on a Polish royal reserve as recently as the 17th century. Park et al. (2) have now extracted enough DNA from the ancient bone specimen to sequence the aurochs genome. When they compared the aurochs sequence to the DNA of cattle breeds we know and use in domestic agriculture today, they found a surprisingly high level of shared sequence with British and Irish cattle.

    Scientists from Britain recovered the bone specimen from a cave deep beneath the Derbyshire countryside. The cave was recognized as an ancient burial ground and contained numerous preserved animal bones. The aurochs bone was dated at over 6,700 years old. This was prior to the New Stone Age in Britain and during a time when Britain was connected to the European mainland. The peoples of Britain were then hunters and gatherers and, no doubt, aurochs were prized game. The first farmers apparently migrated into Britain with their livestock. However, these incoming farmers probably represented less than 20% of the population, and they contributed to a wider movement that saw a gradual shift from managing wild quarry for hunting to herding livestock (4).

    Park et al. were able to extract enough DNA from the bone specimen to use modern sequencing methods, and they reconstructed over 86% of the aurochs genome sequence. This corresponded to approximately 90% of the reference genome derived from a Hereford cow. With this sequence in hand, they compared it to sequences from over 80 other individual cattle genomes. They identified the places where there was a difference, or variant, between the aurochs sequence and the reference genome. They could then focus on these 2.1 million differences and compare them across a wider range of sequences from modern cattle breeds.

    The goal for Park et al. was to use these variants as markers to track how aurochs DNA is distributed in the genomes of modern breeds. They first checked how many of these variants were already known to exist in cattle, and found that over 90% of them were already recorded in cattle genomic databases. This meant two things. First, it confirmed that these sequence variants would be useful for tracking the extent to which different parts of the aurochs genome were retained in different cattle breeds, and second, it meant they could infer which variants may have influenced selected traits.
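    As a rough illustration of this kind of comparison (a toy sketch rather than the authors’ actual pipeline, with made-up positions and alleles), checking how many ancient variants are already known reduces to set operations on variant records:

```python
# Toy illustration: a variant is keyed by chromosome, position, and alleles.
def variant_key(chrom, pos, ref, alt):
    return (chrom, pos, ref, alt)

# Hypothetical variants called from the ancient aurochs genome
aurochs_variants = {
    variant_key("6", 12345, "A", "G"),
    variant_key("14", 67890, "C", "T"),
    variant_key("29", 2468, "G", "A"),
}

# Hypothetical variants already recorded in a modern cattle database
known_cattle_variants = {
    variant_key("6", 12345, "A", "G"),
    variant_key("14", 67890, "C", "T"),
}

shared = aurochs_variants & known_cattle_variants
fraction_known = len(shared) / len(aurochs_variants)
print(f"{fraction_known:.0%} of aurochs variants are already in the database")
```

    In the real study, the same question was asked of roughly 2.1 million variants drawn from whole-genome comparisons rather than a handful of hand-written records.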

    They found evidence that aurochs interbred with domesticated cattle at some time in the past. Most likely, farmers looked to wild cattle to boost their stocks, but there may also have been some random matings. The scientists also found that the divergence of European breeds, relative to North African and Asian breeds, may be more recent than previously thought. This was evident from an apparent recent loss of genetic variants from the European breeds rather than an ancient split. The evidence for a significant contribution of aurochs to the modern breeds found in Scotland, Wales, England, and Ireland was particularly compelling. Thus, the interbreeding of aurochs and domestic European cattle continued long after North African and Asian breeds had already migrated.

    When examining the aurochs genome, the scientists extracted a selection of variants that had functional properties. That is, they focused on a relatively small number of variants (SNPs) identified as having a role in influencing the structure or expression of a gene. There were 166 of these, selected from across the entire genome. An analysis of these genes found that they fell into three interesting categories associated with brain function and behavior, immune response, and growth and metabolism. The scientists speculated that this reflected the selection of cattle suited to domestication. This would result in animals that behaved appropriately, or that survived infections that may have become more common as animals lived in more crowded conditions. Cattle were also probably selected for their ability to pack on more muscle to provide more meat.

    The scientists then looked at gene variants present in the aurochs genome that also occur at very high frequency in modern cattle breeds; that is, variants that were positively selected during the development of breeds. The traits these genes affect are favorable for meat production, disease resistance, or behavior in a farm system. One selected gene variant that stood out was within the DGAT1 gene, which is known to have a major impact on milk production traits (5).

    They also identified variants that occur in microRNAs. MicroRNAs are a relatively recent discovery in mammals; they are short RNA molecules produced from very small stretches of DNA. Their main role is to regulate how much protein is produced from one or more specific genes, and they can be quite influential in determining important traits, such as muscle growth (6). The variant identified in the aurochs DNA sequence lies in a microRNA referred to as miR-2893, which affects molecules involved in neurological function, fatty acid metabolism, and immune function (2).

    The study reveals a lot about aurochs and about the prehistory of modern cattle breed development. It also cements the aurochs as the origin of the genetic variation that exists in modern breeds; variation that has fueled selection over many centuries to give modern breeds the specialized roles that suit modern farm systems and food production.

    References

    1. Edwards CJ, Magee DA, Park SD, McGettigan PA, Lohan AJ, et al. (2010) A complete mitochondrial genome sequence from a mesolithic wild aurochs (Bos primigenius). PLoS One 5: e9255.
    2. Park SD, Magee DA, McGettigan PA, Teasdale MD, Edwards CJ, et al. (2015) Genome sequencing of the extinct Eurasian wild aurochs, Bos primigenius, illuminates the phylogeography and evolution of cattle. Genome Biol 16: 234.
    3. van Vuure T (2002) History, Morphology and Ecology of the Aurochs (Bos primigenius).
    4. Pryor F (2006) Farmers in prehistoric Britain: The History Press Ltd.
    5. Grisart B, Coppieters W, Farnir F, Karim L, Ford C, et al. (2002) Positional candidate cloning of a QTL in dairy cattle: identification of a missense mutation in the bovine DGAT1 gene with major effect on milk yield and composition. Genome Res 12: 222-231.
    6. Clop A, Marcq F, Takeda H, Pirottin D, Tordoir X, et al. (2006) A mutation creating a potential illegitimate microRNA target site in the myostatin gene affects muscularity in sheep. Nat Genet 38: 813-818.

    Getting the Balance Right

    • More people in the world are overweight than underweight, and the gap between the two continues to widen.
    • The protein leverage hypothesis links the need to meet a target protein intake with weight gain and obesity.
    • Experimental evidence from humans, fruit flies, mice, and several other animals supports this hypothesis, demonstrating that when the proportion of dietary energy available from protein decreases, total energy intake increases.
    • Healthy diets should contain a balance of macronutrients rather than focusing on eliminating a particular food group, like carbohydrates or saturated fats.
    • Whole food sources of protein, including dairy, can help to balance out the low protein, high carbohydrate foods that dominate our environment.

    Once described as an epidemic, obesity has now reached pandemic status with an estimated 600 million obese adults worldwide, and an additional 1.4 billion that are overweight (1). The cause of the pandemic is known—people consuming more energy (calories) than they expend—so it would seem that the solution would be to simply eat less. But a team of nutritional ecologists believes that cutting calories will not solve anything, because it ignores some basic tenets of human (and animal) biology. Using data from fruit flies, mice, birds, fish, monkeys, and humans, Raubenheimer, Simpson and their colleagues demonstrate a seemingly universal law of animal nutrition: a predominant appetite for protein (2-5). They propose that the human need to meet a fixed daily protein target leads to weight gain through the overconsumption of low protein foods that have come to dominate the Western diet. Rather than advocating for a high protein diet that eschews carbohydrates, they emphasize a balance of macronutrients for optimal health. Can dairy help strike this balance? Whole-food sources of protein that are easy to access, like dairy, can help balance out those beloved low-protein, high-carbohydrate processed foods and keep energy consumption in check.

    Human nutritional decisions: an evolutionary perspective

    There are currently more people in the world that are overweight than underweight, and the gap between the two continues to widen (1). What is it that so many populations are doing wrong when it comes to nutritional decisions? The strong connection between Western diets and obesity points the finger of blame at carbohydrates and saturated fats. Indeed, the basic premise of the trendy Paleolithic diet is that modern human metabolism is not adapted to the high carbohydrate and fat intake of the Western diet, and thus, we suffer poor health outcomes such as obesity and type 2 diabetes. Proponents of this viewpoint, most notably Colorado State University physiologist Loren Cordain, argue that because Homo sapiens made their living as hunter-gatherers for the majority of their existence (ca. 200,000 to 10,000 years ago), the optimal human diet recreates the menu of the Paleolithic (6).

    As indicated by the popularity of the Paleolithic diet, this evolutionary perspective on human nutrition has been well received by the public as a satisfactory explanation for our growing waistlines. But many anthropologists and nutritionists are not as enthusiastic. Genetic data strongly support the position that humans have adapted to many of the nutritional changes that accompanied the transition to agriculture (7). The strongest example of this is the evolution of lactose tolerance in populations that adopted dairy culture, including European dairy farmers and East African pastoralists. Additionally, there is the false belief that our hunter-gatherer ancestors subsisted solely on protein. Archaeological and fossil data indicate that our ancestors ate a much more varied diet, including carbohydrates provided by grains, than is portrayed by the Paleolithic diet (8). Indeed, modern Paleolithic-style diets approach the upper limit of human protein intake. High protein diets (>35% of energy from protein) are considered toxic and are associated with numerous poor health outcomes (2).

    David Raubenheimer, Stephen Simpson, and their colleagues take a different evolutionary approach to human nutrition. They acknowledge that modern humans are not biologically identical to our hunter-gatherer ancestors, but argue that they are still animals with specific macronutrient and micronutrient requirements (4). You can take Homo sapiens out of the Stone Age, but you can’t take away their need for the proper proportions of nutrients to facilitate growth, immune function, and reproduction. They argue that because of the critical role protein plays in important physiological and biological functions, animals, including humans, prioritize consumption of protein energy over non-protein energy (NPE), that is, fat and carbohydrates (2-4). In their model, carbohydrates and fats are not necessarily evil villains, but simply macronutrients that are over-consumed in efforts to meet the optimal protein target.

    Animals, including humans, have separate appetite systems for protein, fat, and carbohydrates. The protein leverage hypothesis (PLH) argues that the appetite for protein is dominant, and that it plays a key role in explaining the rise in overweight (BMI > 25) and obesity (BMI > 30). It predicts that, as the proportion of energy from protein decreases in available foods, animals will increase their consumption of low protein foods in order to meet their protein target. In doing so, they increase their energy intake, leading to weight gain and eventually obesity (2-4).

    Imagine an animal that requires 50 grams (g) of protein a day for optimal functioning. In an environment with ample protein, it may be able to access this protein target through the consumption of 1500 kilocalories (kcal) of energy. However, if the amount of protein available in the environment decreases, that animal will now have to consume more low-protein foods in order to meet the 50 g requirement. The intake of fat and carbohydrates will necessarily increase, as will the total daily caloric intake. Humans, they argue, are simply animals living in a low protein environment.
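    To make the arithmetic behind this example explicit, here is a minimal sketch (in Python, not taken from the cited papers) of the idealized case of complete protein leverage, assuming protein supplies roughly 4 kcal per gram and that the eater fully defends a fixed protein target:

```python
# Idealized "complete protein leverage": total energy intake is whatever it
# takes to hit a fixed protein target at a given dietary protein fraction.
PROTEIN_KCAL_PER_G = 4  # approximate energy content of protein

def energy_to_hit_protein_target(protein_target_g, protein_fraction):
    """Total energy (kcal) needed to obtain protein_target_g grams of protein
    when protein_fraction of dietary energy comes from protein."""
    protein_target_kcal = protein_target_g * PROTEIN_KCAL_PER_G
    return protein_target_kcal / protein_fraction

# A 50 g protein target corresponds to about 200 kcal of protein energy.
for fraction in (0.133, 0.10, 0.08):
    kcal = energy_to_hit_protein_target(50, fraction)
    print(f"{fraction:.1%} protein diet -> about {kcal:.0f} kcal/day")
# ~13.3% protein -> ~1500 kcal; 10% -> 2000 kcal; 8% -> 2500 kcal
```

    Under this assumption, even a modest dilution of dietary protein forces a sizeable rise in total energy intake, which is the qualitative pattern reported in the experiments discussed below.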

    Of mice and men

    If you want to understand humans’ prioritization of protein, you’ll need to go back a lot further in time than the beginning of the Paleolithic. Animals only distantly related to humans, including insects, birds, fish, and other mammals, have demonstrated nutrient-specific appetite systems that prioritize protein (2). That the same behavior has been seen in slime molds (2, 4), which are not animals but are traditionally placed in Kingdom Protista, suggests a very ancient evolutionary adaptation indeed!

    One strong comparative example comes from experiments with mice. When researchers manipulated the macronutrient content of the mice’s food pellets, the mice responded as predicted by the PLH; they prioritized their protein appetite and increased total food intake as the percent protein of the pellets decreased. Consequently, mice on lower protein diets had increased fat gain as a result of increased energy intake (reviewed in 4).

    Although human diets are more difficult to manipulate than those of captive mice, it is still possible to test the PLH in human subjects. In fact, experimental findings in humans suggest an even stronger prioritization of protein than that found in mice (2). Gosby et al. (3) reviewed 38 experimental trials that manipulated human macronutrient intake (either via a fixed macronutrient composition or by increasing the consumption of one or more macronutrients) over two or more days while permitting ad libitum (“as much as you please”) intake of energy. Taken together, the results support the prediction of the PLH: as percent protein in the diet decreased, energy intake increased.

    It is important to point out that this relationship between percent protein and energy consumption is not identical across the range of protein energy intakes, but instead depends on the target protein intake (for humans, somewhere between 15% and 20% of total energy, but see discussion below). For example, Gosby et al. found a marked increase in energy intake when protein energy was reduced from 20% to 10%. Above the target, energy intake is predicted to decrease; but across the experiments, increasing protein energy above 20% had only a small effect (3). Similar results of “asymmetry in protein leverage” were found in non-human animals, which suggests that the costs of not meeting the protein target (such as poor growth, immune function, and reproduction) must be greater than the costs of excess protein intake (3).

    Food, food everywhere, but not enough protein to eat

    It is difficult to think of humans, many of whom regularly have access to any food imaginable, as living in a low-protein environment. Unlike mice and slime molds, we have the benefit of nutritional labels that tell us the precise protein content of virtually every food we would want to consume. Why don’t we simply fill our pantries and refrigerators with more protein sources?

    As Raubenheimer et al. (4) describe it, the amount of protein in our environment has been “diluted,” a pattern that began with the adoption of agriculture thousands of years ago but has continued even over the last 40 years. Between 1971 and 1975, the average proportion of protein energy in the diet was 15.9%, and it had decreased to 15.4% by 2005 (3). Because protein targets are assumed to be identical between the two time periods, this seemingly small drop in protein energy is associated with a 247 kcal per day increase in energy intake (3).

    One major reason for this drop is the increased availability of ultra-processed foods (4). These products contain few or no whole food ingredients and are instead made from sugars, starches, hydrogenated oils and fats, and parts of animal foods. They are also high in calories. Raubenheimer et al. point out that these foods are also quite tricky because they take advantage of the neural circuitry behind the protein appetite through the addition of savory flavors that are usually associated with protein-rich foods (4). When you eat these foods, you activate the reward system that evolved to promote consumption of essential amino acids and whole protein (discussed in 9), which in turn weakens the drive to counterbalance the diet with additional protein.

    In addition to filling us up and making us think we are eating more protein than we are, these foods are also, in general, more affordable than whole food sources of protein such as meat, fish, nuts, or dairy. Reviewing a study on the macronutrient composition of foods and their prices, Raubenheimer et al. (4) report that although energy density and price were not associated, there was a strong positive association between the protein density of a food and its price. Finally, these foods are “fast,” in terms of both speed of preparation and accessibility. Convenience could be just as important as price in making food decisions.

    Finding your balance

    Economic and cultural factors linked to low-protein consumption suggest that simply avoiding low-protein foods is an untenable solution. Major nutritional overhauls of our environment are difficult, so solutions that work within the framework we have are likely to be more successful, albeit perhaps on a slower time scale. But if the PLH is correct in its predictions about the cause of obesity, reducing intake of low-protein foods and adding in whole food sources of protein may limit weight gain and perhaps even promote weight loss.

    Here’s where dairy could play an important role. A glass of milk is “fast” as well, but unlike a bag of cookies, it comes with around 8 g of protein. Consuming a glass of milk alongside your packaged cookies may limit intake of cookies, which can lower your daily energy intake. It is not about giving up particular foods or nutrients entirely, but rather about finding balance to reach the optimal macronutrient ratios.

    The optimal protein intake actually varies across humans, so it is not possible to say what the ideal human macronutrient profile would look like. It is clear that protein intakes approaching 30% of energy could have poor outcomes, as too much protein can be toxic. The compensatory mechanisms that exist to avoid protein deficits suggest a diet near the protein target may provide the most health benefits. But research shows that just what that target may be for optimal health differs across the human life course. For example, in growing children and adolescents, diets with higher protein content have positive fitness outcomes (less growth impairment, and better tissue repair and body maintenance). Fertility is also impaired on low protein diets, suggesting that reproductive-aged adults would also benefit from a higher protein intake.

    But remember the fat mice on the low protein diet from the study discussed above? They may have gained weight on their low protein, higher carbohydrate diet; however, they lived longer than the mice fed any of the other experimental diets. Similar dietary manipulations in fruit flies revealed the same pattern: diets with a low protein to carbohydrate ratio appear to prolong the lifespan.

    That there are benefits and costs to diets with both high and low protein to carbohydrate ratios suggests that there is no single optimal diet. Age is just one known factor that interacts with diet to influence fitness outcomes; sex, genetic makeup, and activity level are likely other important factors to consider (2). But with the staggering number of obese individuals in Western as well as developing countries, there is urgency in understanding how our prioritization of protein may be driving us to consume far more fat and carbohydrates than are needed.

    References

    1. World Health Organization. http://www.who.int/mediacentre/factsheets/fs311/en/
    2. Simpson SJ, Le Couteur DG, Raubenheimer D. 2015. Putting the balance back in diet. Cell 161: 18-23
    3. Gosby AK, Conigrave AD, Raubenheimer D, Simpson SJ. 2014. Protein leverage and energy intake. Obesity Reviews 15: 183-191
    4. Raubenheimer D, Machovsky-Capuska GE, Gosby AK, Simpson S. 2015. Nutritional ecology of obesity: from humans to companion animals. British Journal of Nutrition 113:S26-S39
    5. Johnson CA. Raubenheimer D, Rothman JM, Clarke D, Swedell L. 2013. 30 days in the life: daily nutrient balancing in a wild chacma baboon. PLOS One 8: e70383
    6. Cordain L. 2011. The Paleo Diet. John Wiley & Sons, New York
    7. Hawks J, Wang ET, Cochran GM, Harpending HC, Moyzis RK. 2007. Recent acceleration of human adaptive evolution. PNAS 104: 20753-20758
    8. Henry AG, Brooks AS, Piperno DR. 2011. Microfossils in calculus demonstrate consumption of plants and cooked foods in Neanderthal diets (Shanidar III, Iraq; Spy I and II, Belgium). PNAS 108: 486-491
    9. Berthoud H-R, Münzberg H, Richards BK, Morrison CD. 2012. Neural and metabolic regulation of macronutrient intake and selection. Proceedings of the Nutrition Society 71: 390-400

    Why Mothers Should Boost Their Vitamin D Intake

    • Most infants who are exclusively breastfed don’t get enough vitamin D. They therefore have a higher risk of developing rickets than formula-fed infants.
    • Although health authorities recommend that all breastfed infants receive a vitamin D supplement, compliance is very low.
    • New research suggests that a better alternative to the largely failed infant supplementation strategy is to ensure adequate vitamin D levels in the mother’s breast milk.
    • The researchers recommend that women take a minimum of 4,000 IU vitamin D supplements per day during pregnancy and 6,000 IU per day while lactating—more than 10 times the amounts currently recommended by health authorities.
    • These doses are shown to be safe and effective, and the researchers urge health authorities to update their advice accordingly.

    A mother’s milk is the finest food her baby can get, but it’s not perfect—or so it seems. It has become clear in recent years that most infants don’t get enough vitamin D from breast milk—not by a long shot. Does this mean breast milk is inherently flawed, by some quirk of nature? A new study refutes this common belief by demonstrating that breast milk can indeed provide babies with enough vitamin D if their mother cranks up her vitamin D intake by more than 10 times the currently recommended amount (1).

    Vitamin D deficiency and rickets—a childhood bone disease—are re-emerging as serious health issues for babies and children in the U.S. and many other countries, as highlighted in recent issues of SPLASH! milk science update (2, 3). The reasons for this are limited sunlight exposure and low vitamin D consumption, not only in the young ones, but also in pregnant and lactating mothers who pass this vital nutrient on to their babies via the placenta and breast milk.

    Exclusively breastfed infants have a higher risk of developing rickets than those on formula, which is fortified with vitamin D. This is especially true for African American breastfed babies because dark pigmented skin absorbs less sunlight than light skin, therefore generating lower amounts of the “sunshine vitamin.”

    Mothers more likely to take their own vitamin D than to supplement their infants

    For decades, the American Academy of Pediatrics (AAP) and the Institute of Medicine (IOM) have recommended that all breastfed infants receive vitamin D supplements, starting within a few days of birth. The problem is that the recommendation is rarely followed: compliance ranges from some 2% to 20%. This may be due to lack of access to supplements, or simply a lack of awareness (4). But ultimately, it comes down to parental reluctance to supplement a healthy, breastfeeding infant, according to Professors Bruce Hollis and Carol Wagner from the Medical University of South Carolina (MUSC) Children’s Hospital, two leading researchers on vitamin D in mothers and infants.

    “We have shown in various vitamin D studies that mothers are more likely to take their own vitamin D supplement than to give their infants a supplement. It also goes against what every breastfeeding mother knows—that breast milk is best and if she is healthy, including vitamin D replete, she does not have to supplement her breastfeeding infant,” Hollis and Wagner explain.

    They also point out that a multivitamin supplement (containing vitamin D), which may be prescribed instead of a vitamin D-only supplement, often tastes bad, and babies tend to spit it out. “This makes it difficult for a mother to know how much her infant actually got of the supplement. The best option, in our minds, is for the mother to have a healthy vitamin D status, thereby giving her milk that same status.”

    How much vitamin D supplement is enough?

    Health authorities’ current recommendation for pregnant and lactating women, as well as for infants and children, is a minimum daily intake of 400 international units (IU) vitamin D. However, Hollis and Wagner estimated this dose to be far too low, and designed an interventional study to test their hypothesis (1). The study was a randomized, double-blind, controlled trial—the gold standard for testing the clinical effects of various substances, diets, or treatments.

    The team recruited a number of lactating mothers living in South Carolina or New York, who agreed to exclusively breastfeed their one-month-old babies for the next six months. The mothers and infants were randomly assigned to receive different amounts of vitamin D or a placebo control (no vitamin D). Once a month, their vitamin D levels were analyzed from blood and urine samples.

    As in most clinical trials, not all of the initially enrolled participants completed the entire study program. Many stopped exclusively breastfeeding along the way, and some dropped out because they moved or lost interest. But enough participants remained in the study to provide the researchers with reliable data.

    Of the 95 babies who were fully breastfed throughout the study period, three quarters were deficient in vitamin D at the study start. From four months of age until study completion, though, all the babies showed marked improvement in vitamin D levels. And here is what’s interesting: the babies whose only source of vitamin D was breast milk from mothers receiving 6400 IU vitamin D per day had vitamin D levels just as good as those of babies receiving a daily supplement of 400 IU vitamin D, whose mothers also received the recommended 400 IU per day.

    In other words, the study showed that vitamin D supplementation to the mother only—at a high dose of 6400 IU per day—can ensure that adequate amounts of vitamin D are transferred from the mother’s blood to her milk, thus meeting her nursing infant’s need.

    “During lactation, a woman needs about 20% extra vitamin D intake, as about that much is excreted daily into her breast milk,” Hollis and Wagner explain. Based on their own and others’ research over the last 15 years, they recommend that women take a minimum of 4,000 IU vitamin D per day—depending on their vitamin D status—during pregnancy and 6,000 IU per day while lactating. “Our obstetrical department at MUSC is now testing all pregnant women at their first prenatal visit.”

    Is it safe? New data versus old misunderstandings

    Importantly, and consistent with safety data from earlier studies in thousands of people, their latest study showed no sign that this dosing was in any way unsafe. Moreover, the mothers’ blood levels of vitamin D following the high dose were similar to vitamin D levels that can be achieved by solar exposure alone, without dietary supplementation (1).

    A perceived safety issue with high intake of vitamin D has been around for a long time (5). Nearly a century ago, reports of vitamin D toxicity surfaced after children and adults were prescribed vitamin D at doses of hundreds of thousands IU—way higher than the physiological range. The toxic effects involved hypercalcemia, or abnormally high blood levels of calcium, the metabolism of which involves vitamin D. In the 1960s, doctors linked excess vitamin D to a type of hypercalcemia characterized by “elfin facies”—some peculiar, elf-like facial features in the children afflicted by the condition, as well as in children born with a syndrome called supravalvular aortic stenosis (SAS). As a result, vitamin D toxicity—brought about by maternal vitamin D supplementation during pregnancy—came to be viewed as the cause of the SAS syndrome. Later it turned out, however, that SAS is not caused by too much vitamin D per se, but by a genetic disorder called Williams’ syndrome. Sufferers often have abnormal vitamin D metabolism and are therefore susceptible to hypercalcemia. Unfortunately, the old and inaccurate association of the disease with vitamin D intake has had a profound effect on the practice of vitamin D supplementation in pregnant mothers, as well as in infants. A lack of understanding and fear of excess vitamin D continues today.

    This is reflected in the conservative recommendations for daily vitamin D intake (400 IU). In recent years, however, the IOM and the Endocrine Society have suggested upper intake limits of 4,000 IU and 10,000 IU per day, respectively (6, 7).

    “Change now to make a difference in these babies’ lives”

    Our current lifestyles, including sun exposure, have changed drastically since our distant past, meaning dietary supplementation of vitamin D is our only remedy, according to Hollis and Wagner—but in amounts that matter, as shown in their paper. That is, at least 10 times the currently recommended amounts.

    So, are the official recommendations likely to be updated any time soon, in keeping with the latest research? Hollis and Wagner suspect health authorities will call for more research, as usual. “But the truth is, our research is unlikely to be repeated because of cost. Our study cost was around 7 million dollars,” they say. “Our position is that we have the data; it is convincing, so implement the change now to make a difference in these babies’ lives.”

    And the difference—in terms of health risks—for babies getting their daily, deserved dose of vitamin D is crucial. It not only affects their bone and calcium metabolism, but also the integrity of their immune system, as well as various organ systems (5). Increasingly, new research shows that vitamin D deficiency is linked to several inflammatory and long-latency diseases, including multiple sclerosis, diabetes, and various cancers. Armed with this emerging knowledge, health authorities and pediatric health care providers need to put measures in place to ensure babies get enough vitamin D—for the best possible start in life.

    References

    1. Hollis BW, Wagner CL, Howard CR, Ebeling M, Shary JR, Smith PG, Taylor SN, Morella K, Lawrence RA, Hulsey TC (2015). Maternal versus infant vitamin D supplementation during lactation: a randomized controlled trial. Pediatrics 136: 625–634
    2. Sando L (2015). Breastfeeding and vitamin D deficiency. SPLASH! milk science update: April 2015 (https://milkgenomics.org/splash/breastfeeding-and-vitamin-d-deficiency/)
    3. Newmark LM (2015). Children who avoid cow’s milk may fall short of vitamin D. SPLASH! milk science update: June 2015 (https://milkgenomics.org/splash/children-who-avoid-cows-milk-may-fall-short-of-vitamin-d/)
    4. Gordon CM, Feldman HA, Sinclair L, Williams AL, Kleinman PK, Perez-Rossello J, Cox JE (2008). Prevalence of vitamin D deficiency among healthy infants and toddlers. Arch Pediatr Adolesc Med 162: 505–512
    5. Wagner CL, Taylor SN, Hollis BW (2008). Does vitamin D make the world go ‘round’? Breastfeed Med 3: 239–250
    6. Institute of Medicine (US) Committee to Review Dietary Reference Intakes for Vitamin D and Calcium; Editors: Ross AC, Taylor CL, Yaktine AL, Del Valle HB (2011). Dietary Reference Intakes for Calcium and Vitamin D. National Academies Press (US), Washington DC
    7. Holick MF, Binkley NC, Bischoff-Ferrari HA, Gordon CM, Hanley DA, Heaney RP, Murad MH, Weaver CM (2011). Evaluation, treatment, prevention of vitamin D deficiency: an Endocrine Society clinical practice guideline. J Clin Endocrinol Metab 96: 1911–1930

    Milk and Potatoes Made the World Go Round

    • Five centuries ago, a population in which lactose tolerance was common had far better growth prospects than one in which the trait was rare.
    • The introduction of the potato to different parts of Europe in the 16th century is thought to have had a similarly strong impact on population densities during the 18th and 19th centuries.
    • Both milk and potatoes offer nutritional and calorific improvements over the alternative outputs that could have been generated from the same agricultural inputs—a herd of cattle and an area of land.
    • The accessibility of these foodstuffs together offered particular advantages for population growth and urbanization.

    Population growth, urbanization and, as a consequence, economic and political development owe much to humankind’s ability to make use of two humble foodstuffs: milk and potatoes. That is the theory put forward by development economist C. Justin Cook, of the University of California, Merced.

    In his first piece of research (1), Cook demonstrated the significance of the population frequency of lactose tolerance in world history over the past half-millennium. Beginning with data on how different present-day populations perform on a hydrogen breath test (a medical evaluation of the symptoms of lactose intolerance), Cook rewound the clock of population frequencies by five centuries. He did this by adjusting the numbers according to the known population flows of different ethnic groups.

    Cook then constructed a model to statistically remove the influence of different climates and levels of technology on population growth at the time, along with other potentially important variables. According to his model, had the proportion of lactose-tolerant people in a population been 24% higher in the year 1500, the population density would have settled at a level roughly 40% higher than it otherwise would. Back then, “Any increase in income is going to be offset by population,” says Cook. “Pre-industrial economies functioned in this way.” In short, if you could feed more kids, you had more kids.

    Those numbers may sound dramatic, but in the context of world history, they are barely a ripple. Lactose tolerance rates vary wildly among present day populations. For example, 96% of Swedes are able to digest the sugar, compared with only 2.3% of Zambians. Indeed, the genetic innovation that is associated with the ability to digest lactose into adulthood in Europe is often considered one of the fastest spreading human gene variants known.

    In Cook’s view, lactose tolerance increased population density so effectively for several reasons. One reason is that milk offers a highly portable source of calories—more calories from a fixed number of cattle than raising and killing them for meat would have. The nutrients in milk would also have helped populations stay healthy with a mixture of complex fats, essential amino acids, and various vitamins and minerals—including calcium to increase bone density.

    Furthermore, the ability of infants to digest cow’s milk would have enabled mothers to wean their offspring earlier. And this would have sooner freed up women in pre-industrial economies to give birth again, resulting in a higher lifetime reproductive output. (Although Cook admits this last mechanism is more speculative than the others.)

    Aside from the latter point about reproduction, such arguments underscoring the importance of dairy in the diet must have seemed familiar to researchers who had not long before estimated the historical benefits of potatoes. When Cook’s paper was published, Nathan Nunn and Nancy Qian, of Harvard and Yale Universities, respectively, had reported (2) that potatoes account for about a quarter of the rise in population and urbanization in the ‘Old World’ during the 18th and 19th centuries.

    Potatoes, of course, originate from South America. Nunn and Qian’s argument therefore focused on the vegetable’s introduction to Europe, which came as part of the exchange of New World crops set in motion by Christopher Columbus’s voyages. Sailors and missionaries then spread potatoes all around Europe and beyond. Just as milk offers more calories and nutrients from a herd of cattle over time than does eating them, the potato outperformed the existing crops grown in Europe before its introduction. Nunn and Qian’s logic was the same as Cook’s: the more mouths that could be fed, the more babies were made.

    With such closely related findings, the obvious next step was to combine them somehow. Hence, in more recent work (3), Cook reported the results of a model of population and urbanization growth in Europe from 1700 to 1900, which uses both the potato introduction data and the adjusted lactose tolerance survey data to predict the patterns seen.

    In short, Cook found that potatoes and milk were synergistic. During this period, countries where the population was generally able to consume milk benefited more from the introduction of the potato than countries where this wasn’t the case. That makes sense because the two common shopping bag items are nutritional complements, each filling in the most obvious holes that a diet built solely around the other would leave. Specifically, populations in which everybody was lactose tolerant grew about twice as fast when the potato was introduced as populations in which nobody was.

    But not everything makes sense in the data. A clear outlier to the rest of the patterns for lactose tolerance and population growth is China. “It’s the outlier,” says Cook. “China has a huge population and very low lactose tolerance… Maybe they have other kinds of beans [for protein]? I’ve thought about it a lot. But I don’t know what’s special about China.” Nonetheless, Cook seems to be successfully unraveling the history of humanity, one foodstuff at a time.

    References

    1. Cook, C. J. The Role of Lactase Persistence in Precolonial Development. Journal of Economic Growth, 19, 369-406 (2014)

    2. Nunn, N., & Qian, N. The Potato’s Contribution to Population and Urbanization: Evidence from a Historical Experiment. Quarterly Journal of Economics, 126(2), 593-650 (2011)

    3. Cook, C. J. Potatoes, Milk, and the Old World Population Boom. Journal of Development Economics, 110, 123-138 (2014)
