
Does Health Journalism Do More Harm Than Good?

Health News Review publisher Gary Schwitzer.

If you are one of the millions of people who have seen John Oliver's recent skewering of the way scientific and health studies are reported by the media ("New study shows drinking a glass of wine is just as good as spending an hour at the gym"), you have probably laughed yourself into an uproarious stupor by now.

If you are an actual health care journalist, however, perhaps you have watched it with somewhat less abandon; that is, peeking through the fingers of the hand covering your face.

One of the consultants on the Oliver segment was Gary Schwitzer, the publisher of Health News Review and a longtime health journalist who now teaches at the University of Minnesota's School of Public Health. For 10 years now, Health News Review has been critiquing and rating media coverage of medical studies and health issues. The site's reviewers, many of whom are medical professionals and academics, assess each news item and press release on 10 criteria, including apparent grasp of the research's validity, use of independent sources, and whether any new approach described is compared to existing alternatives.

Not that I was overly eager to make my presence known to Gary, but since "Future of You" covers some of the latest developments in medicine and health, we wanted to know just what was eating him about health reporters, anyway.

Here is the transcript of our conversation, edited for length and readability.

Jon Brooks: What are some of the worst-reported health stories you've seen?

Gary Schwitzer: There’s one they used in the John Oliver program -- a study that was reported as "smelling farts cures cancer." At the time, I did a Google search and got more than 300,000 returns on "farts cure cancer."

What is especially troublesome is that this came from the most basic kind of research study, one not ready for prime time, about some biochemical responses to hydrogen sulfide in the lab. But the news release used the word “flatulence” in the first sentence, and bingo, the media was off and running. On the web, click rate is the coin of the realm, and using "fart" means you’re going to win the race that day. And it all came from a news release about research that is so far from any human application it's ridiculous.

More seriously, I look at a lot of media stories about screening tests. We often end up only getting stories about the benefits of screening. We ought to be getting messages about the trade-offs involved, because while there is something you stand to gain in any screening test, there is also something you stand to lose. When you cast the net wide, you are going to catch what doctors often call "incidentalomas," causing further testing and treatment that is unnecessary.

If you have been indoctrinating the public to think everybody ought to be screened for everything, you are already framing the issue as, "If you find something, you better do something."

JB: What are some of the common errors that health reporters make?

GS: Single source journalism is not good journalism in any field, but in the area of health care it’s malpractice. There are conflicts of interest around every corner. If you’re not turning to independent experts, chances are you’re going to pass along a conflicted message from somebody with a vested interest. In health care news, if your mother tells you something, you'd better check it with five sources. Another problem: We treat anything published in a journal as if it's from Moses coming down the mountaintop with a set of stone tablets. Journals were never meant to be sources for the 24-hour news cycle -- they are meant to be a forum for discussion among scientists.

We are entitled to eavesdrop on those conversations, but we shouldn't do it if we don't know the limitations and caveats. What's the quality of the evidence? What do independent experts say? Maybe this was statistically significant but was it clinically significant? Did it really make a difference in somebody's life? Anybody writing about these studies on a platform that reaches people has power, and can do more harm than good.

JB: So looking over the state of things today, would you say that health journalism is causing more harm than good in terms of informing the public?

GS: Yes. We are seeing some of the best health care journalism, but far too much of the worst. There are these mountain peaks of excellence, like what ProPublica and Kaiser Health News do, and a lot of what we get on public radio. (Editor's note: He really said this, without me saying "ahem.") But the valleys between these peaks are becoming wider and deeper. I call it the daily drumbeat of dreck, and it overwhelms the occasional peaks of excellence and the good that is done.

JB: If you're a news consumer, what are some red flags you should pay attention to in assessing the validity of a health news story?

GS: I think our 10 review criteria for journalists are good for the public, too, in evaluating claims they hear in the media, or from any source, including your own doctor. And if you ever hear that something "might be a game changer" or "might be a new standard of care," I invite people to substitute "might not be," because you'd be on equally safe or shaky ground.

JB: In thinking about the "farts-cure-cancer" story, do you think media people at universities and research institutions need to be more responsible?

GS: Obviously, but let's not stop there. There are many different stages in the dissemination of health information to the public. In many cases the media people do go back to the researcher, and the researcher is enamored with how good his or her work has been made to look. Everybody wins in their eyes. More publicity means they might get more research funding, and their home institution is happy because of that publicity and funding. And all through this food chain we lose sight of the health care consumer at the end of the line, who doesn’t know about all this spin happening upstream, that much of what they’re being fed is contaminated.

The chocolate milk-and-concussion news out of the University of Maryland is a classic case. It started with a news release saying a particular brand of chocolate milk was helping kids who had concussions from football, but it mentioned no data or details. We reviewed the press release, then went to the researcher and the university asking for the data, but we got stonewalled.

Soon other news organizations picked up on the story, and the university announced an internal review. In March, it released its report, and it was a scathing self-reflection: the university admitted that the research was schlocky, that there were conflicts of interest that should have been disclosed, and that there were no clear lines of authority over who approved news releases. In this case the researcher himself had been given the final say on a release that made his own work look more sensational than it really was. That's probably the most dramatic example we have written about concerning the contamination of the food chain.


At that point, we ended the conversation, and I felt a deep desire for a transfer to the sports desk. (We don't have one.)

But there was one thing I had forgotten to ask Schwitzer during our interview: Did any news organizations actually take the bait of the chocolate milk press release?

Health News Review found two news stories based on the release, he said in a follow-up email, one of which came out in the midst of all the negative media coverage about the incident.

Schwitzer said, "It was as if that writer lived in a cave."

Heeding Schwitzer's guideline on eschewing single-source journalism, I asked our own veteran health journalist, Lisa Aliferis, the editor of KQED's State of Health blog, what she thought.

"Spot on," she said of Health News Review's 10 criteria for journalists. "While constraints on time don't allow every story to take into account all 10, it's something we should aspire to."

She said consumers of health news can get good information if they pick the right sources -- those that consistently report the nuances of any particular treatment or "breakthrough."

"If it sounds too good to be true, it is," she said.
