Ah, milk! It is widely recognized as a nutritious drink for people of all ages thanks to its excellent combination of protein, calcium, vitamin D, potassium, and a range of other useful vitamins and minerals. Some people simply can't go without their favorite 'comfort food', or without mozzarella, parmesan cheese, gelato, and all the other dairy derivatives people have skillfully made across the ages.
It all started about 9,000 years ago when the first goat, sheep, and cow farmers realized that their newly domesticated animals were good for more than just meat. There was just one problem though: all of these early dairy adopters were lactose intolerant. I guess all those tummy issues were balanced by, well, subdued hunger. People couldn’t really afford to be picky during those times.
But then something happened that has long puzzled evolutionary scientists: genes for lactose tolerance began to spread rapidly across Europe around 3,000 years ago. Within just a few thousand years, the trait became widespread in the region, a rate of genetic change that is almost unheard of among humans.
Lactase is an enzyme that breaks down lactose (a major milk carbohydrate) so the body can digest it, but it is typically produced in infancy and lost as mammals mature. There are several explanations for why lactase persistence (the enzyme remaining active into adulthood) developed so rapidly, and almost all have something to do with milk's extraordinary nutritional value. One idea is that milk was so good that people must have been guided by some invisible force to better tolerate the food. That's not how evolution works, though. It's akin to saying that if you simply train to run marathons every weekend, then your offspring have a better chance of being born with a propensity for athleticism. Again, that's not how it works.
Other, more scientifically sensible hypotheses suggest that lactase persistence arose due to selection pressure: people who could better tolerate drinking milk and eating dairy products had a survival edge over those who couldn't, so they raised more children who inherited the trait and, in time, it spread widely among European populations. But selection pressure alone, under normal conditions, is not really enough to explain the lightning-fast rate of adaptation.
A new study by researchers at the University of Bristol and University College London has a different and more plausible explanation — even if its historical circumstances aren’t the most pleasant.
Milk may give you stomach cramps, but during times of famine and disease, it could have killed you
Using evidence from over 7,000 animal fat residues found in ancient pottery at over 500 sites across Europe, as well as ancient DNA, the researchers designed a model showing that the use of milk in Europe did not actually correlate with the spread of lactase persistence, which is what you would expect if you believe the 'drink more milk, become more lactose tolerant' theory. Instead, they found that the spread of gene variants allowing people to better break down the sugars in milk aligned with periods of famine and disease. The association was extremely strong: lack of food was 689 times more likely to explain the spread of lactase persistence than selection pressure under normal conditions, while disease exposure was 289 times more likely to explain the rise of this tolerance.
Mark Thomas, an evolutionary geneticist at University College London and lead author of the new study, explained for ZME Science:
“The appearance is just a random mutation event. Its dramatic rise in frequency is the interesting thing. There are many theories for why this happened, but almost all of those theories relate to the extent of milk use. We find that the extent of milk use does not help to explain the rise in frequency of the lactase persistence gene variant. Instead, we find those past famines and/or increased pathogen exposure better explain its turbocharged rise in frequency. This is likely because, under conditions of either increased pathogen loads, or severe malnutrition, the effects of lactase non-persistent people drinking milk can turn from nothing / inconvenient/unpleasant/embarrassing ones to lethal ones.”
This conclusion is the result of years of interdisciplinary work, which, when faced with such complex questions, is the only way forward, Thomas tells me.
It all started with a friendly meeting over drinks in a pub in Bristol in 2018 between Thomas, Richard Evershed, professor of chemistry at the University of Bristol, and George Davey Smith, professor of clinical epidemiology at the University of Bristol. During this meeting of the minds, they realized they could play to each of their strengths to tackle one of the most enigmatic puzzles in evolutionary biology.
“I’d been working on the evolution of lactase persistence for many years and thought I knew it all. I was happy to learn that evening that I didn’t,” says Thomas.
Evershed and Melanie Roffet-Salque, a chemist known for her study of lipids from archaeological artifacts, compiled an enormous database of animal fats from more than 13,000 ancient shards of pottery found across Europe. George Davey Smith and colleagues probed the UK Biobank — a biomedical database and research resource containing genetic, lifestyle, and health information from half a million Brits — to see what effect lactase persistence has on milk-drinking habits and health indicators in healthy living people. And Thomas, along with colleagues Yoan Diekmann and Adrian Timpson, developed a new statistical method that tests how well time series of ecological drivers, such as famine and pathogen exposure, explain the evolution of lactase persistence.
In this wealth of data, the researchers could finally see a pattern.
Lactose intolerance is generally not the worst thing in the world. In fact, the term is somewhat of a misnomer, since virtually all humans are born able to digest lactose, as all mammals are (mama's milk, remember?). However, as we transition into adulthood, most of us grow increasingly unable to digest lactose, a type of sugar found in milk. Symptoms include the familiar bloating, stomach cramps and pains, diarrhea, and a general sensation of feeling sick. All tolerable in ancient times, when 99% of the population was living hand to mouth.
However, during periods of widespread disease and famine, diarrhea in severely malnourished people could be, and likely often was, fatal. During such times, those who carried the lactase persistence variant, which keeps the body producing the lactase enzyme that breaks lactose down into glucose and galactose, didn't just have a slight edge over everyone else: they had a huge survival advantage.
This picture perhaps better explains why over 95% of people in Denmark are lactose tolerant, whereas around 85% of people in China are lactose intolerant. To this day, around 70% of the world's population is lactose intolerant, which can be explained by the fact that each population was generally confined to a particular region for the last 10,000 years, each facing its own historical challenges.
“In a nutshell, milk use was widespread in Europe for at least 9,000 years, and healthy humans, even those who are not lactase persistent, could happily consume milk without getting ill. However, drinking milk in lactase non-persistent individuals does lead to a high concentration of lactose in the intestine, which can draw fluid into the colon, and dehydration can result when this is combined with diarrhoeal disease” said Davey Smith in a press release. “I postulated that this process could lead to high mortality when infectious disease burdens increased, as population sizes and densities increased to levels at which some infectious agents could continuously circulate within them.”
The findings appeared in the journal Nature.