As a young child I missed a question on a psychological test: “What comes in a bottle?”
The answer was supposed to be milk. I said beer.
Milk almost always came in cartons and plastic jugs, so I was right. But this isn’t about rehashing old grudges. I barely even think about it anymore! The point is that the test was a relic of a time before me, when milk did come in bottles. It arrived on doorsteps each morning, by the hand of some vanishing man. And just as such a world was alien to me as a kid, the current generation of small children might miss a similar question: “Where does milk come from?”
Many would likely answer almonds or beans or oats.
Indeed, the already booming nut-milk industry is projected to see continuous growth. Much of this is driven by beliefs about health, with ads claiming “dairy free” as a virtue that resonates for nebulous reasons—many stemming from an earlier scare over saturated fat—among consumers lactose intolerant and tolerant alike. The dairy industry is now scrambling to market milk to Millennial families, as the quintessential American-heartland beverage once thought of as necessary for all aspiring, straight-boned children has become widely seen as something to be avoided.
Should it be?
It all happened quickly. In the 1990s, during the original “Got Milk?” campaign, it was plausible to look at a magazine, see supermodels with dairy-milk mustaches, and think little of it. Now many people would cry foul. With nut milks dominating the luxury café-grocery scenes frequented by celebrities, an image like that would surely elicit cries of disingenuousness: There’s no way you actually drink cow’s milk! And if you do, it’s probably skim or 2-percent milk, which leave no such thick mustache!
Difficult as it may be for Millennials to imagine, the average American in the 1970s drank about 30 gallons of milk a year. That’s now down to 18 gallons, according to the Department of Agriculture. And just as it appears that the long arc of American beverage consumption could bend fully away from the udder, new evidence suggests that the health risks of dairy fats (which are mostly saturated) are less clear-cut than many previously believed.
A study published in The American Journal of Clinical Nutrition is relevant to an ongoing vindication process for saturated fats, which turned many people away from dairy products such as whole milk, cheese, and butter in the 1980s and ’90s. An analysis of 2,907 adults found that people with higher and lower levels of dairy fats in their blood had the same rate of death during a 22-year period.
The implication is that it didn’t matter if people drank whole or skim or 2-percent milk, ate butter versus margarine, etc. The researchers concluded that dairy-fat consumption later in life “does not significantly influence total mortality.”
“I think the big news here is that even though there is this conventional wisdom that whole-fat dairy is bad for heart disease, we didn’t find that,” says Marcia de Oliveira Otto, the lead researcher of the study and an assistant professor of epidemiology, human genetics, and environmental science at the University of Texas School of Public Health. “And it’s not only us. A number of recent studies have found the same thing.”
Her study adds to a body of prior research suggesting that limiting saturated fat is not a beneficial guideline. While much similar research has relied on self-reported data on how much people eat—a notoriously unreliable metric, especially for years-long studies—this study is noteworthy for actually measuring the dairy-fat levels in the participants’ blood.
A drawback to this method, though, is that the source of the fats is unclear, so no distinction can be made between cheese, milk, yogurt, butter, etc. The people with low levels of dairy fats in their blood weren’t necessarily dairy free, but they may have been consuming low-fat dairy. All that can be said is that there was no association between dairy fats generally and mortality.
The researchers also found that certain saturated fatty acids may have specific benefits for some people. High levels of heptadecanoic acid, for example, were associated with lower rates of strokes.
De Oliveira Otto believes that this evidence is not itself a reason to eat more or less dairy. But she said it could encourage people to give priority to whole-fat dairy products over those that may be lower in fat but higher in sugar, which may be added to make up for a lack of taste or texture. She points to the classic example of chocolate milk, the low-fat varieties of which are still given to schoolchildren under the misguided belief that they are a “health food.”
The latest federal Dietary Guidelines for Americans, which guide school lunches and other programs, still recommend “fat-free or low-fat dairy.” These guidelines are issued by the U.S. Department of Agriculture, so they have long been biased toward recommending dairy consumption in a country that is rich in dairy-production infrastructure. Veganism is not encouraged given a national interest in continuing to consume the dairy the country produces. But promoting low-fat and fat-free dairy over whole milk has no such economic defense.
The takeaway is that, from a personal-health perspective, dairy products are at best fine and reasonable things to eat, and avoiding butter and cheese is less important than once believed. While the narrative that cheese and butter are dangerous is changing, it also remains true that dairy isn’t necessary for children or adults. A diet rich in high-fiber plants has more than enough protein and micronutrients to make up for a lack of dairy—and the vitamin D that’s added to milk can just as well be added to other foods, taken as a supplement, or siphoned from the sun.
With every new study that tells us more about the complexities of human nutrition and stymies efforts to fit nutrients into simple good-bad binaries, it should become easier to direct our concerns productively. This study is another incremental addition to an ever-expanding body of knowledge, the point of which is that we should worry less about the harmful effects of single nutrients and more about the harms done by producing food. At this point, the clearest drawbacks to consuming animal products are not nutritional but environmental, with animal agriculture contributing to antibiotic resistance, deforestation, and climate change. While there is room for debate over the ideal amounts of saturated fat in human blood, the need to move toward an environmentally sustainable food system is unambiguous.