Written by: Lauren Milligan Newmark, Ph.D. | Issue # 112 | 2023
- All humans are born with the ability to digest the milk sugar lactose, but only about 35% of the global population has a genetic variant that keeps production of the lactase enzyme turned on through adulthood.
- The continued production of lactase, called lactase persistence (LP), has long been linked to milk use, but a new study found no evidence to support this claim among prehistoric European dairy farmers.
- The study proposes that famine and pathogen exposure better explain the evolution of LP in Europe and could potentially explain the evolution of LP in other parts of the world as well.
Lactase persistence (LP) has been the textbook example for a genetic adaptation to the human diet for decades. But despite its renown, the evolutionary advantage of the LP phenotype—the ability of humans to digest the milk sugar lactose throughout the lifespan—is still under debate.
“LP is the Everest of natural selection over the last 10,000 years,” explains Mark Thomas, Professor of Evolutionary Genetics at University College London. “There are a lot of different theories on why natural selection was so strong on LP, and they all relate to milk use.” The most widely discussed evolutionary scenario argues that as milk use increased among prehistoric peoples, LP individuals were more likely to survive and reproduce than lactase non-persistent (LNP, or lactose intolerant) individuals because they could reap the nutritional benefits of milk and milk-derived foods without digestive issues.
Although this narrative sounds good on paper, a new study [1] from Thomas, his colleagues Richard Evershed and George Davey Smith from the University of Bristol, and a team of over 100 researchers found no supporting evidence among prehistoric Europeans. “Milk use doesn’t explain LP selection, and that’s a real shocker,” says Thomas.
Thomas and his team came to this startling conclusion by creating novel statistical models that incorporated the most up-to-date genetic and archaeological data from Europe spanning the period from 7000 BC to AD 1500 [1]. Ancient DNA (aDNA) data from more than 1,700 skeletal remains were used to estimate the frequency of the LP genetic variants (or alleles) in prehistoric European populations across time and space. If the frequency of LP alleles was increasing faster than expected by chance, the alleles were under selection. The frequency of milk use was determined from evidence for milk fat residues on potsherds (ceramic fragments). Their model included nearly 7,000 animal fat residues from over 13,000 potsherds from 554 sites across Europe. From these, they created time series showing how frequently people were using milk over the nearly 9,000 years they surveyed.
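The core inference step described above — asking whether an allele’s frequency rose faster than chance alone would allow — can be illustrated with a toy population-genetics sketch. Everything here (the starting frequency, the selection coefficient, the simple haploid recursion) is an illustrative assumption, not the study’s actual model:

```python
# Toy sketch: deterministic rise of an advantageous allele under a
# constant selection coefficient s, compared with neutrality (s = 0).
# These numbers are invented for illustration only.

def allele_trajectory(p0, s, generations):
    """Frequency over time of an allele with relative fitness advantage s.

    Uses the standard one-locus haploid recursion:
        p' = p(1 + s) / (1 + p*s)
    """
    p = p0
    trajectory = [p]
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
        trajectory.append(p)
    return trajectory

# Roughly 300 generations spans ~9,000 years at ~30 years per generation.
neutral = allele_trajectory(p0=0.01, s=0.0, generations=300)
selected = allele_trajectory(p0=0.01, s=0.03, generations=300)
```

Even a modest 3% fitness advantage carries a rare allele from 1% to near fixation over the timescale the study covers — which is why an observed rise much steeper than the neutral expectation signals selection.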
Going against the prevailing narrative, Thomas and his team found that patterns of milk usage did not explain changes in LP allele frequencies any better than did uniform selection since the start of the Neolithic (ca. 10,000 years ago) [1]. But if milk use doesn’t account for intense selection on the LP allele, what does? Famine and pathogens, says Thomas. “Healthy people that are LNP and drink milk are unlikely to die from diarrhea,” he explains. “But if you have severe malnutrition and you drink fresh milk and get diarrhea, you have a high chance of dying.” Diarrhea caused by an inability to digest lactose would also be an issue for anyone fighting an infection, as dehydration from diarrhea increases the risk of dying from many pathogens.
Thomas and his team predicted that selection on LP alleles would increase with greater pathogen exposure and would be greatest during times of subsistence instability. But without written records detailing crop failures or plagues, they needed to find proxies to test their hypothesis with the data at hand. For famine, they used archaeological data to determine changes in population size; if population size had been increasing over time and then dropped, the drop was likely the result of famine. For pathogen exposure, the researchers looked at changes in the density of settlements; as people begin living in denser settlements, their risk of exposure to pathogens increases.
Using data from over 110,000 radiocarbon dates from 27,000 European sites, their models found that pathogen exposure was 284 times more likely, and famine 689 times more likely, than constant selection to explain the rise of the LP alleles. “We absolutely could have better proxies for famine and disease,” says Thomas. “But there is no reason to think they would give numbers like these if they weren’t involved in some way.”
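Figures like “689 times more likely” are model-comparison statements: how much better one candidate explanation fits the observed allele-frequency data than another. A minimal, purely illustrative likelihood-ratio sketch (with invented frequency data and a simple Gaussian error model, not the study’s actual method) shows the logic:

```python
import math

def log_likelihood(observed, predicted, sigma=0.05):
    """Gaussian log-likelihood of observed allele frequencies
    given a model's predicted frequencies (sigma is assumed noise)."""
    return sum(
        -0.5 * ((obs - pred) / sigma) ** 2
        - math.log(sigma * math.sqrt(2 * math.pi))
        for obs, pred in zip(observed, predicted)
    )

# Hypothetical allele-frequency time series (all values invented).
observed = [0.02, 0.05, 0.15, 0.40, 0.70]
proxy_model = [0.02, 0.06, 0.16, 0.38, 0.68]    # selection tracks a proxy
constant_model = [0.02, 0.10, 0.30, 0.50, 0.65]  # constant selection

# The likelihood ratio says how many times better the proxy-driven
# model explains the data than the constant-selection model.
ratio = math.exp(
    log_likelihood(observed, proxy_model)
    - log_likelihood(observed, constant_model)
)
```

With these made-up numbers the proxy-driven model comes out hundreds of times more probable, mirroring the structure (though not the substance) of the study’s comparison.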
This study only looked at data from prehistoric Europe, but LP independently evolved in other parts of the world as well. Is it possible that similar selective pressures of famine and disease drove the evolution of LP in Africa and Asia? “Absolutely,” says Thomas. “All of these populations were exposed to pathogens and famine. What we have is a panacea explanation that works just as well in Africa and the Middle East.” At present, there is not nearly the same amount of data on milk fats from pottery or aDNA from Africa or Asia as was available from Europe to allow for a rigorous test of their hypothesis. But Thomas is hopeful that in the future they’ll have enough data to be able to address their questions among non-European populations. In the meantime, anthropology textbooks may want to revisit their discussion of LP and milk use. “These long-held ideas about milk use and LP just don’t hold up,” says Thomas.
References
- Evershed RP, Davey Smith G, Roffet-Salque M, Timpson A, Diekmann Y, Lyon MS, Cramp LJE, Casanova E, Smyth J, Whelton HL, Dunne J, et al. Dairying, diseases and the evolution of lactase persistence in Europe. Nature. 2022;608:336–45.