Why the Media Loves Dumb Health Studies

Back in 2015, one journalist fooled millions into thinking chocolate made them slimmer. Well, he fooled news outlets from Shape magazine to the Daily Star to a few Huffington Post sites. Their audiences may have been skeptical, but the study was real enough: it was just trumped up by a man hoping to prove that the health studies in the news are often sensationalized nonsense.

Pick a food and there’s a study out there telling you that it’ll boost your cancer risk. And another telling you it’ll lower it. No joke: Vox just released an article featuring an entertaining chart illustrating the point with such staples as corn, beef, milk, and eggs. How did the media get to a place where it just accepts whichever dumb health studies get churned out?

Why the Studies Are Bad

Scientists run studies constantly. The vast majority of their conclusions are boring: Either they don’t find a correlation or they find one that everyone would expect. After all, no one wants to publish yet another study confirming that cigarettes cause cancer. From the start, the market is skewed toward the attention economy.

But the studies themselves can be flawed, too: that chocolate one had far too few subjects and measured a ton of factors. The researchers only went with “chocolate makes you healthy” because, with so many outcomes measured on so little data, they were nearly guaranteed to stumble on at least one statistically significant coincidence.
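That trap is known as the multiple comparisons problem, and the arithmetic behind it is simple. Here’s a minimal sketch in Python (the 18-outcome figure is an illustrative assumption for this example, not a count taken from the study itself): the more independent outcomes you test at the usual 0.05 significance threshold, the more likely pure noise hands you at least one “finding.”

```python
# Chance of at least one false positive when testing several
# independent outcomes at significance level alpha = 0.05.
# The outcome counts below are illustrative assumptions.
alpha = 0.05

for n_outcomes in (1, 5, 18):
    p_any = 1 - (1 - alpha) ** n_outcomes
    print(f"{n_outcomes:>2} outcomes -> {p_any:.0%} chance of a spurious 'finding'")
```

Under these assumptions, a single measured outcome gives a 5% chance of a fluke result, but 18 measured outcomes push that to roughly 60%: more likely than not, something “significant” turns up by accident.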

The real issue? Outlets that run these studies without checking them with a qualified professional, or against any past research that might conflict with their cute, quick, what-do-you-know article.

The Solution: More Evidence and Fact-Checking

Outside the media, dumb health studies are a rare breed, according to that Vox article. Any other data-driven sector, from political policy to education programs, relies on stricter evidence-based guidelines:

“Over here in the media, we are still largely in the 1990s when it comes to thinking about using research evidence — despite the fact that medical journalism influences human health as much as policy or even medicine.

Instead of trying to translate what the best-available research evidence tells us about how to live, we report on the latest studies out of context, with little focus on how they were designed, whether they were unduly conflicted by study funders, and whether they agree or disagree with the rest of the research.

We often ignore systematic reviews (maybe because some of us don’t even know they exist). And we mislead the public by pretending that “the latest research” holds definitive answers, instead of acknowledging the incremental nature of science.”

Actually clearing those hurdles isn’t as easy as pointing out that they exist, of course. Vox’s solution, publishing “evergreen stories that explain the body of research,” isn’t as feasible for most cash-strapped outlets as it is for Vox. But if news outlets focus less on chasing counterintuitive headlines, they’ll be better prepared to examine the data behind the dumbest of the dumb health studies.


Written by:
Adam is a writer at Tech.co and has worked as a tech writer, blogger and copy editor for more than a decade. He was a Forbes Contributor on the publishing industry, for which he was named a Digital Book World 2018 award finalist. His work has appeared in publications including Popular Mechanics and IDG Connect, and his art history book on 1970s sci-fi, 'Worlds Beyond Time,' is out from Abrams Books in July 2023. In the meantime, he's hunting down the latest news on VPNs, POS systems, and the future of tech.