Five Reasons the Diet Soda Myth Won’t Die
Repeated studies on a health bogeyman help explain wider problems with food research.
There’s a decent chance you’ll be reading about diet soda studies until the day you die. (The odds are exceedingly good it won’t be the soda that kills you.)
The latest batch of news reports came last month, based on another study linking diet soda to an increased risk of early death.
As usual, the study (and some of the stories) lacked some important context and caused more worry than was warranted. There are specific reasons that this cycle is unlikely to end.
1. If it’s artificial, it must be bad
People suspect, and not always incorrectly, that putting things created in a lab into their bodies cannot be good. People worry about genetically modified organisms, monosodium glutamate and, yes, artificial sweeteners because they sound scary.
But everything is a chemical, including dihydrogen monoxide (that’s another way of saying water). These are just words we use to describe ingredients. Some ingredients occur naturally, and some are coaxed into existence. That doesn’t inherently make one better than another. In fact, I’ve argued that research supports consuming artificial sweeteners over added sugars. (The latest study concludes the opposite.)
2. Soda is an easy target
In a health-conscious era, soda has become almost stigmatized in some circles (and sales have fallen as a result).
It’s true that no one “needs” soda. There are a million varieties, and almost none taste like anything in nature. Some, like Dr Pepper, defy description.
But there are many things we eat and drink that we don’t “need.” We don’t need ice cream or pie, but for a lot of people, life would be less enjoyable without those things.
None of this should be taken as a license to drink cases of soda a week. A lack of evidence of danger at normal amounts doesn’t mean that consuming any one thing in huge amounts is a good idea. Moderation still matters.
3. Scientists need to publish to keep their jobs
I’m a professor on the research tenure track, and I’m here to tell you that the coin of the realm is grants and papers. You need funding to survive, and you need to publish to get funding.
As a junior faculty member, or even as a doctoral student or postdoctoral fellow, you need to publish research. Often, the easiest step is to take a large data set and publish an analysis from it showing a correlation between some factor and some outcome.
This kind of research is rampant. That’s how we hear year after year that everyone is dehydrated and we need to drink more water. It’s how we hear that coffee is affecting health in this way or that. It’s how we wind up with a lot of nutritional studies that find associations in one way or another.
As long as the culture of science demands output as the measure of success, these studies will appear. And given that the news media also needs to publish to survive — if you didn’t know, people love to read about food and health — we’ll continue to read stories about how diet soda will kill us.
4. Prestigious institutions and the press
To do the kinds of analyses described here, you need large data sets that researchers can pore over. Building the data set is the hardest part of the work.
Analyzing the numbers on hundreds of thousands of people isn’t child’s play. But gathering the data is much more expensive and time-consuming.
Because of this, a few universities produce a disproportionate amount of the research on these topics. They also tend to be the universities with the most resources and the most recognizable names. Because they’re also usually prestigious, they attract more researchers and more funding to build bigger and newer data sets.
They also get more media attention because of those researchers, that prestige and that funding. If the research is coming out of a super-respected institution, it must be important.
Lather. Rinse. Repeat.
5. We still don’t understand the limitations of observational studies
No matter how many times you stress the difference between correlation and causation, people still look at “increased risk” and conclude that the factor in question is causing the bad outcome. For studies of hundreds of thousands of people, observational research is generally the only realistic option. With very few exceptions, observational studies can tell us only whether two things are related, not whether one is to blame for the other (unlike randomized controlled trials).
With respect to diet sodas, it’s plausible that the people who tend to drink them also tend to be worried about their weight or health; it could be a recent heart attack or other health setback that is causing the consumption rather than the other way around. But you shouldn’t assume that diet sodas cause better health either; it could be that more health-conscious people avoid added sugars.
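To see how that plays out, here is a toy simulation (the numbers are invented for illustration, not taken from any study): a prior health setback makes people both more likely to drink diet soda and more likely to die, while the soda itself does nothing. A naive comparison still shows drinkers dying at a higher rate.

```python
# Illustrative toy simulation -- not real data. A prior "health setback"
# makes people both more likely to drink diet soda and more likely to die;
# the soda itself has no effect on mortality in this model.
import random

random.seed(0)
n = 200_000
deaths = {True: 0, False: 0}   # keyed by "drinks diet soda"
counts = {True: 0, False: 0}

for _ in range(n):
    setback = random.random() < 0.20                         # 20% had a recent health setback
    drinks = random.random() < (0.60 if setback else 0.30)   # setback -> more likely to drink diet soda
    dies = random.random() < (0.05 if setback else 0.01)     # setback -> higher mortality; soda irrelevant
    counts[drinks] += 1
    deaths[drinks] += dies

print(f"mortality among diet-soda drinkers: {deaths[True] / counts[True]:.2%}")
print(f"mortality among non-drinkers:       {deaths[False] / counts[False]:.2%}")
# The drinkers' rate comes out higher even though soda causes nothing here.
```

The association is real, but it is produced entirely by the confounder, which is exactly the trap described above.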
Many of these new observational studies add little to our understanding. At some point, a study with 200,000 participants isn’t “better” than one with 100,000 participants, because almost all have limitations — often the same ones — that we can’t fix.
Dr. John Ioannidis wrote in a seminal editorial: “Individuals consume thousands of chemicals in millions of possible daily combinations. For instance, there are more than 250,000 different foods and even more potentially edible items, with 300,000 edible plants alone.”
And yet, he added, “much of the literature silently assumes disease risk” is governed by the “most abundant substances; for example, carbohydrates or fats.” We don’t know what else is at play, and using observational studies, we never will.
(Observational research is still the best way to study population-wide risk factors; sophisticated techniques like regression discontinuity can even create quasi-randomized groups to try to get closer to understanding causality. Too few studies employ such techniques.)
Moreover, too many reports still focus only on the relative risk and not on the absolute risk. If a risk increases by 10 percent, for example, that sounds bad. But if the baseline risk is 0.1 percent, that 10 percent increase winds up moving the baseline to only 0.11 percent.
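For readers who want that arithmetic spelled out, here it is as a quick calculation, a sketch using the same 0.1 percent baseline and 10 percent relative increase from the paragraph above:

```python
# Relative vs. absolute risk, using the numbers from the paragraph above.
baseline_risk = 0.001        # 0.1 percent absolute risk
relative_increase = 0.10     # "risk increases by 10 percent"

new_risk = baseline_risk * (1 + relative_increase)
absolute_change = new_risk - baseline_risk

print(f"new absolute risk: {new_risk * 100:.2f}%")                          # 0.11%
print(f"absolute increase: {absolute_change * 100:.2f} percentage points")  # 0.01
```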
It would probably be a public service if we stopped repeating a lot of this research — and stopped reporting on it breathlessly. If that’s impossible, the best people can do is stop paying so much attention.
Aaron E. Carroll is a professor of pediatrics at Indiana University School of Medicine and the Regenstrief Institute who blogs on health research and policy at The Incidental Economist and makes videos at Healthcare Triage. He is the author of “The Bad Food Bible: How and Why to Eat Sinfully.” @aaronecarroll
https://www.nytimes.com/2019/10/14/upshot/diet-soda-health-myths.html