The news is everywhere in my social news feeds this morning: A popular fad diet is apparently lethal, scientific research says. Specifically, a study found that time-restricted eating, a form of intermittent fasting, was associated with a 91% higher risk of death from cardiovascular disease.

Except scientific research doesn’t say that, and not only should you not be worried about this study, you shouldn’t be wasting brain glucose thinking about it. Even repeating that 91% number, which you’ll now remember, pained me, because I don’t think this result deserves to be remembered.

The study is a type of nutritional research that is notoriously weak, and right now it’s only available as a press release. It’s not clear from the many, many news articles on the study whether reporters actually viewed the data that will be presented at an upcoming research meeting held by the American Heart Association.

So how am I, a science journalist, confidently dismissing this research? It’s based on observational research, and one lesson from more than 20 years of reporting on health and medicine is that one should be very skeptical of observational research, especially when it is about nutrition.

In this case, the researchers started with a genuinely useful research tool: the National Health and Nutrition Examination Survey (NHANES), which asks about 5,000 people a year about their eating and dietary habits. They then linked those survey responses to a separate database of deaths. Both the survey and the death records are administered by the Centers for Disease Control and Prevention.
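
To make that linkage step concrete, here is a minimal sketch of the general idea, using made-up column names and toy values rather than the actual NHANES files, which have their own layout and identifiers:

    # Toy illustration of linking survey responses to death records on a
    # shared respondent ID. All names and values here are invented; the real
    # NHANES public-use linked mortality files are structured differently.
    import pandas as pd

    survey = pd.DataFrame({
        "respondent_id": [101, 102, 103],
        "eating_window_hours": [7.5, 12.0, 14.5],  # hypothetical derived variable
    })

    deaths = pd.DataFrame({
        "respondent_id": [101, 103],
        "cardiovascular_death": [1, 0],  # 1 = death attributed to cardiovascular causes
    })

    # Join on the shared identifier; respondents with no matching death
    # record were still alive at follow-up and get NaN here.
    linked = survey.merge(deaths, on="respondent_id", how="left")
    print(linked)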

Such databases allow researchers to quickly check whether dietary choices seem to be associated with health problems. That’s great, because it can help scientists set the direction of more rigorous research that could take years. But the answers that come from these quick checks are not necessarily reliable.

Part of the problem, the easy-to-understand part, is that people answering surveys are not always entirely honest. More than that, especially with food, we often misremember what we’ve eaten and how much. For instance, we might think we followed our diet and totally forget when we slipped up.

But the bigger problem is that the people who choose to be on a diet, or those who stay on it, might be fundamentally different from those who don’t in ways that we cannot measure. Perhaps people go on time-restricted diets because they are worried about their health. Perhaps the people who stay on such diets have bodies that work differently from those of people who can’t fast that long. Perhaps, for whatever reason, the people who were on the diet differed from those who were not, simply by random chance.

Researchers try to counteract these possibilities by “controlling for” the risk factors they know about, like body weight, age, and biological sex or gender. But the problem is that researchers can only control for the factors they can identify and measure.
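
To see why that matters, here is a toy simulation, with entirely invented variables and effect sizes, in which an unmeasured trait (call it health-consciousness) drives both who adopts a diet and how healthy they stay. Controlling for a measured factor like body weight does not remove the bias, so the diet looks protective even though, in this simulation, it does nothing at all:

    # Toy simulation of unmeasured confounding; nothing here comes from the study.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 100_000

    health = rng.normal(size=n)                                # unmeasured health-consciousness
    diet = (health + rng.normal(size=n) > 1.0).astype(float)   # health-conscious people diet more
    weight = 80 - 5 * health + rng.normal(scale=10, size=n)    # a measured covariate

    # True outcome model: risk depends on health-consciousness and weight,
    # and the diet itself has zero effect.
    risk = 1.0 - 0.5 * health + 0.01 * weight + rng.normal(size=n)

    # "Control for" weight and estimate the apparent effect of the diet.
    X = sm.add_constant(np.column_stack([diet, weight]))
    print(sm.OLS(risk, X).fit().params[1])   # clearly negative, despite a true effect of zero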

Let’s look at an example where these phenomena were at play: the decades-long story of whether red wine prevents heart attacks. Researchers originally posited a “French paradox”: the idea that red wine let Parisians down croissants, foie gras, boeuf bourguignon, raclette, and moules frites without the heart attacks researchers expected, since at the time they thought any high-fat diet increased the risk of heart disease. That eventually morphed into the idea that very moderate drinking (no more than a glass of wine a day) had a beneficial effect on heart disease.

Except recently some researchers have argued that this apparent benefit isn’t there — it just looked that way because moderate drinkers were healthier than others in ways researchers had difficulty measuring.

The only way to get close to knowing this stuff for sure is to take a large group of people and randomly assign them to, say, drink a glass of red wine a day or be teetotalers. Then you know the two groups are probably the same, and if they follow your instructions you can see whether red wine makes a difference. Ideally, you would give them either fake wine (a placebo) or real wine, so that even the participants don’t know what they’re getting.
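
Here is the same kind of toy simulation, again with invented numbers, showing what random assignment buys you: the two arms end up balanced even on the trait nobody measured, so a simple comparison lands near the true effect, which in this sketch is zero:

    # Toy illustration of randomization; purely invented data, not a real trial.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    health = rng.normal(size=n)          # still unmeasured
    arm = rng.integers(0, 2, size=n)     # coin-flip assignment: 1 = diet, 0 = control

    # The outcome still depends only on health-consciousness; the diet does nothing.
    risk = 1.0 - 0.5 * health + rng.normal(size=n)

    # The unmeasured trait is balanced across arms by chance alone...
    print(health[arm == 1].mean(), health[arm == 0].mean())   # both near 0
    # ...so the naive between-arm comparison recovers the true effect: about 0.
    print(risk[arm == 1].mean() - risk[arm == 0].mean())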

That’s called a blinded randomized controlled trial, and it often makes the “just-so” stories scientists tell themselves evaporate. For instance, there was an amazing story that Inuit people didn’t get heart disease despite high-fat diets because they ate so much fish. This led to many studies, including randomized trials, that seemed to show that taking fish oil supplements would reduce heart disease. But higher-quality randomized studies didn’t show this effect, until a trial of a prescription form of highly purified fish oil did succeed. Even that study drew doubts from some researchers, because the placebo the scientists used might itself have increased the risk of heart attacks. Yes, this is confusing, and that is the point: With nutrition, we need to be really careful about all we don’t know.

Based on an abstract of the new study provided to me by the American Heart Association, which runs the meeting where the results are being presented, it appears the researchers did not ask people whether they were following time-restricted diets. Instead, they looked for people who ate only during a short window of the day, based on two separate reports to the survey of what they ate.

“While informative, this study should be considered exploratory,” said Harlan Krumholz, a leading expert in the science of improving health policy at Yale. “We are still learning about how people can optimize their diets, and this study is more of a call for more research than something that should frighten people who find restricted eating a useful strategy.”

My own takeaway is that the study does mean time-restricted eating should be studied more, but we knew that. I don’t think it tells us anything else about these diets; it just illustrates how much we don’t know about biology. Some articles posited that maybe dieting this way leads to more loss of muscle mass. Sure, maybe.

But my other concern is that studies like this, and press coverage of them, can make people more skeptical about the things that we do know in medicine. People tend to think of science as a process where scientists do studies and find out the truth. But it’s more accurate to say that each study helps to make us a little less wrong, and a little more certain about what the truth might be. We live in a vast realm of darkness in which we have found scattered gems of truth.

This was a neat finding that should tell people working in nutrition to look harder at this topic. For everyone else, it doesn’t really say anything at all.
