Every week, it seems, a new study comes out with a bold pronouncement. A cup of coffee can extend your life. Or it can hasten your death. A new treatment is trumpeted, or certain foods or lifestyles are demonized. Yet many of these reports distort the facts, exaggerate findings or simply aren’t true.
The journal Heart published an editorial last week arguing that extreme runners face a higher cardiac risk. Many subsequent reports seized on the idea that running was bad for you but failed to properly qualify the findings.
Earlier this fall, dozens of headlines touted the benefits of multivitamins after a study in the Journal of the American Medical Association found multivitamins cut cancer rates in older men. But the actual reduction in cancer was so small that one cancer expert said it didn’t make sense to recommend that people take multivitamins.
The reliability of health news is a serious matter. The inclination of medical journals and media outlets to only publish studies that have dramatic or controversial results fuels the trend toward sensationalism in health reporting.
“You can give very misleading ideas about what’s important in health and what isn’t,” said John Briffa, a British doctor who focuses on issues related to nutrition.
Changing how researchers, public-relations professionals, medical journals and health reporters do business is an enormous challenge. Like other media publications, many medical journals need readers and subscribers to stay competitive and financially viable. That means they’re always on the lookout for research that can attract eyeballs. At the same time, the public-relations departments of universities, hospitals and other research institutions are keenly aware of the importance of getting their organization’s name in the media. The publicity can help ensure an institution maintains research funding and a good reputation. And then there are journalists, who have to evaluate countless press releases and journal abstracts to decide which ones are most “newsworthy” and likely to draw readers.
“The [studies] that seem most attractive to most investors and peer reviewers and journals are the ones that find the most spectacular results,” said John Ioannidis, a professor of medicine at Stanford University.
“There’s a desire to possibly make one’s research look more meaningful than it is in reality,” said Briffa. “Evidence can be misrepresented, overhyped and be quite misleading.”
The problem is that while a medical study may be poorly designed, have a small sample size or be otherwise flawed, it can be presented in a press release as a major breakthrough or discovery.
The American Journal of Cardiology weighed in on the issue last month, urging researchers to decrease the use of overly positive or dramatic language. Too often, the editorial says, researchers use declarative statements or other strong language that simply doesn’t match the reality of the study findings. And while most medical journals require researchers to disclose funding sources, particularly those that can pose a conflict of interest, such disclosures very rarely appear in press releases.
When poor-quality study results appear in the media, they not only misinform the public but can also lead to unnecessary worry. Take, for instance, a press release on a study from the University of Western Ontario this year that said egg yolks are as bad for health as smoking. The same group published a similar paper a year earlier that equated eating egg yolks with KFC’s gluttonous Double Down sandwich. The story appeared in media outlets around the world, but faced harsh criticism from nutrition experts who pointed out that the study was flawed and the conclusions weren’t valid.
Journalists have a responsibility to look past the press release and evaluate the true merits of the study. However, Andreas Laupacis, executive director of the Li Ka Shing Knowledge Institute of St. Michael’s Hospital in Toronto, said the real issue may be with the way scientists and research organizations present their work.
“I think a lot of reporters do try to have a balanced view,” he said. “When scientists are talking to the press, they’re obviously wanting to give the public information about something they think is important … but they’re also selling the information a bit.”
As Ioannidis, who is well-known for a 2005 article he wrote titled “Why Most Published Research Findings Are False,” says, “We have to take every result with a grain of salt.”