I write a monthly (first Sunday) Food Matters column for the San Francisco Chronicle. Today’s is about the difficulties of doing nutrition research. The Chronicle headline writer titled it, “Be skeptical of food studies.” Oops.
I’m not skeptical about Food Studies—the capitalized field of study—at all. My NYU department started undergraduate, master’s, and doctoral programs in Food Studies in 1996 and they have flourished ever since.
The column is about lowercase food studies, meaning individual research studies of nutrition and food:
Q: You were quoted saying that you didn’t believe newspaper reports linking diet sodas to an increased risk of stroke and heart disease. How do you decide whether research is good or bad?
A: This one was easy. I didn’t think it made sense. Mind you, I’m no fan of diet sodas. They violate my rule never to eat anything artificial. And I don’t like the taste of artificial sweeteners.
Whenever a study comes out claiming harm – or benefits – from eating a single food or ingredient, I get skeptical. That’s why I also questioned these recent study results: high-fructose corn syrup (HFCS) makes rats gain weight faster than sucrose (table sugar); zinc supplements prevent symptoms of the common cold; and pomegranate juice has greater health benefits than any other kind of fruit.
When I read about single-factor studies, I want to know three things: Is the result biologically plausible? Did the study control for other dietary, behavioral or lifestyle factors that could have influenced the result? And who sponsored it?
Plausibility: The diet soda study used a questionnaire to find out how often people reported drinking diet sodas. Nine years later, people who reported habitually drinking diet sodas had a 60 percent higher rate of stroke and heart attack. The rate was somewhat lower when the analysis controlled for age, sex, race, smoking, exercise, alcohol, daily calories and metabolic syndrome.
Leaving aside the unreliability of self-reported dietary intake, the study raises a more important question: Was it designed to investigate the link between diet sodas and stroke, or was this just an accidental finding? The questionnaire undoubtedly asked hundreds of questions about diet and other matters. Just by chance, some of them will yield results that look meaningful. And the increase in stroke risk seems surprisingly high for something not previously known to be a stroke risk factor.
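To see how easily chance produces results like these, consider a minimal simulation (my illustration, not the study's actual analysis): 200 questionnaire items with no real connection to the outcome, each tested at the usual p < 0.05 threshold.

```python
# A minimal sketch (illustration only, not the study's analysis) of the
# multiple-comparisons problem: questionnaire items with NO real link
# to the outcome still produce "significant" associations by chance.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
n_people, n_items = 2000, 200

# Outcome (say, stroke) and answers are generated independently,
# so every true association here is exactly zero.
outcome = rng.random(n_people) < 0.05
answers = rng.random((n_people, n_items)) < 0.5

false_positives = 0
for i in range(n_items):
    yes = answers[:, i]
    table = [[np.sum(yes & outcome), np.sum(yes & ~outcome)],
             [np.sum(~yes & outcome), np.sum(~yes & ~outcome)]]
    _, p = fisher_exact(table)   # 2x2 test of item vs. outcome
    if p < 0.05:
        false_positives += 1

# Testing 200 null items at p < 0.05 yields about 10 spurious "links."
print(f"{false_positives} of {n_items} null items look significant")
```

Roughly one item in 20 crosses the threshold even though nothing is going on, which is why a finding that was not the study's designed hypothesis deserves extra scrutiny.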
Mostly, I can’t think of a biological reason why diet sodas might lead to cardiovascular disease unless they are an indicator for some other stroke risk factor such as obesity, high blood pressure or binge drinking. It would take a study designed to test this idea specifically – and a good biological explanation – to convince me that diet sodas cause strokes.
The plausibility issue also arises in the HFCS study. Again, I’m not a fan of HFCS – we would all be healthier if we ate less sugar – but from a biochemical standpoint, HFCS and table sugar are pretty much the same. They have similar amounts of glucose and fructose, are digested just as quickly and are metabolized the same way. Even the average amounts consumed are about the same. That soda companies are replacing HFCS with sucrose is strictly about marketing, not health.
Controls: The zinc-and-colds study was a comprehensive review (a “meta-analysis”) of previous studies done since the first one in 1984. Eleven studies have shown some benefit; seven have not. All were placebo-controlled and double-blind, meaning that half the participants were given a dummy pill, and neither participants nor investigators were supposed to know who was taking what.
But in some studies, the zinc takers complained about the taste of the pills, hinting that they knew which pill they were taking.
I’ve heard this before. In the early 1970s, National Institutes of Health investigators did a study of vitamin C and the common cold. They got about 300 NIH employees to take either vitamin C or a placebo, double-blind. The tantalizing result: People taking vitamin C reported fewer colds and milder symptoms than people taking placebos.
Alas, many participants withdrew from the study before it ended. When asked why, they admitted tasting the pills. The investigators reanalyzed the results. Bingo! No matter what pill the participants actually took, those who thought they were taking vitamin C reported fewer colds and milder symptoms.
If the studies were not really blinded, the zinc results are questionable. I have no doubt that many people feel better when they take zinc supplements, but I’m not sure whether that’s because of something zinc really does or just a placebo effect.
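The vitamin C episode is easy to mimic. In the toy model below (my own invented numbers, not the NIH data), the pill does nothing at all; participants mostly guess their assignment from the taste, and those who believe they are taking vitamin C report one fewer cold.

```python
# A toy model (invented numbers, not the NIH data) of broken blinding:
# the pill has zero effect, but belief changes what people report.
import numpy as np

rng = np.random.default_rng(1)
n = 300
took_c = rng.random(n) < 0.5        # actual assignment, 50/50
guessed = rng.random(n) < 0.8       # most identify the pill by taste
believes_c = np.where(guessed, took_c, ~took_c)

# True effect is zero: everyone averages 3 colds, but those who think
# they are on vitamin C report one fewer (reporting/placebo bias).
colds = np.maximum(rng.poisson(3, n) - believes_c.astype(int), 0)

for label, group in [("actual pill", took_c), ("belief", believes_c)]:
    print(f"grouped by {label}: vitamin C {colds[group].mean():.2f} colds, "
          f"placebo {colds[~group].mean():.2f}")
```

Because belief tracks the actual pill for most participants, even the analysis by assignment shows an apparent benefit; grouping by belief shows the larger gap and reveals where the “benefit” really comes from, just as the NIH reanalysis did.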
Sponsorship: Vested interests influence the design and interpretation of studies. The best-designed studies control for factors that might influence results. Even so, their results require interpretation. Interpretation requires interpreters. Interpreters bring their own biases into the interpretation.
I mention pomegranate juice because one of its major producers sponsors studies to hype its benefits. Yes, pomegranates are delicious, but antioxidant powerhouses? So are most fruits. Pomegranate juice may have high antioxidant activity, but compared with what? Its producer does not say. Sponsored research of this kind is also about marketing, not health.
Nutrition research is hard to do, and its results are easy to over-interpret. Nutrition is a thinking person’s field, requiring careful analysis at every step. When you hear a result that sounds too good to be true, it usually is.