It’s no secret that mass-media reporting on dietary supplement studies is often negative or misleading. Unbeknownst to the media, consumers and even some retailers, there are many reasons a study’s findings can be skewed. Keeping a few key criteria in mind can help you understand what matters most.
Pay attention to the questions asked. Did researchers ask questions relevant to the product? Did they conduct the study to answer a question, or did they design the questions to fit the answer they wanted? This is also where you can look at who conducted the study and who developed the questions. According to Todd Runestad, senior supplements editor at New Hope Network and Natural Products INSIDER, studies often carry a built-in bias. For instance, a well-known 2002 study analyzed St. John’s wort for its effectiveness against major depression, yet the herb was only ever thought to help with mild or moderate depression, and herbalists rarely used it as a stand-alone treatment, instead formulating it with additional herbs. When the study showed no results for major depression, the headlines simply read that it didn’t work for depression. “They tested it against a condition it was never intended to treat,” says Runestad. “And they compared it to Zoloft and a placebo. Zoloft was not statistically significant either.” But that is not what the headlines said, so the takeaway for consumers was that St. John’s wort doesn’t work.
Compliance is key. “Compliance is a large factor, and it deals with how the study is set up and how tight the study is run,” says Chris Crawford, vice president of education at LifeSeasons. “It is the edict as to how things occur.” Yet a lack of compliance may go unreported, or be mentioned only in the fine print of the study’s limitations section. In 2007, CNN’s chief medical correspondent Dr. Sanjay Gupta reported on a large 10-year study of antioxidants’ effects on heart disease risk in women. Gupta stated that while antioxidants caused no harm, they also had no effect. Yet a closer read of the study showed that only 68 percent of the subjects were compliant, says Runestad. Among those who were compliant, there was an overall 23 percent reduction in the combination of myocardial infarction, stroke or cardiovascular death. “The study should have said, ‘antioxidants are the best thing you can take for cardiovascular protection; they will reduce your risk 22 percent to 27 percent.’ But that didn’t happen,” says Runestad. He adds that the positive results should at least have figured prominently in the study’s abstract or press release.
Identify the limitations. When reading a new study, Andrew Shao, PhD, interim senior vice president of scientific and regulatory affairs at the Council for Responsible Nutrition, suggests going straight to the discussion section to identify the limitations. “Here the authors acknowledge that the relevance of the study and the outcomes are limited due to these limitations,” says Shao. “They can’t overstretch their conclusions.”
Look closely at study size. A recent CBD-focused study conducted on mice reported that 75 percent of the mice suffered adverse effects. That sounds alarming, until you learn it amounted to four out of six mice. “What can we do with data from six mice?” asks Shao, noting that this is far too small a sample from which to extrapolate findings to the whole consumer population.
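The sample-size point can be made concrete with a quick back-of-the-envelope calculation. The sketch below is purely illustrative (not part of the study itself): it applies the Wilson score interval, a standard way to put a confidence range around an observed proportion, to the four-of-six figure, showing just how little six mice can tell us.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion
    (95% confidence by default). Returns (lower, upper) bounds."""
    denom = n + z**2
    center = (successes + z**2 / 2) / denom
    half = (z / denom) * math.sqrt(successes * (n - successes) / n + z**2 / 4)
    return center - half, center + half

# The study's own figure: 4 of 6 mice showed adverse effects.
low, high = wilson_ci(4, 6)
print(f"95% CI for the true adverse-effect rate: {low:.0%} to {high:.0%}")
# The interval runs from roughly 30% to 90% -- far too wide to say
# anything precise about how often the effect would actually occur.
```

In other words, with only six animals the "true" rate could plausibly be anywhere from under a third to nearly all subjects, which is exactly why Shao cautions against extrapolating from such a small sample.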
It’s also important to know the context. Take a recent study published in the Journal of Adolescent Health, which examined adverse event reports (AERs) recorded in the FDA’s food and dietary supplement AER database. You may have seen The New York Times headline: “Supplements for Weight Loss, Sexual Function and Muscle Building May Be Deadly.” Yet, according to Shao, the study actually revealed a very small number of AERs (1,329), considering the overall number of supplements on the market.
Pay attention to supplement dose. In 2015, an omega-3 study reported that the supplement offered no benefit for stopping cognitive decline, recalls Runestad. “Newsweek said omega-3 studies are a waste of time. But it turns out, dosage matters. They gave study participants 350 mg of DHA and 650 mg of EPA, yet the cutoff for efficacy starts at one gram.” According to Runestad, had participants taken more than one gram, they likely would have seen benefits.
Determine whether baseline nutrient levels were measured. Another common shortcoming of dietary supplement and nutrition studies is the lack of a baseline measure of subjects’ nutrient levels. “It’s a different question if you are deficient in the nutrient to start versus if you supplement,” says Shao. “The outcome will be different if you are adding more of a nutrient to a diet that is already replete versus one that is deficient.”