Communicating Science and Medicine in an Era of Fake News
How critical analysis, listening and empathy can help combat the disinformation epidemic
By Diana Dopfel
Communicating scientific information is complex. Most of us aren’t scientific experts, and scientific information is constantly evolving. Our understanding of the impacts of climate change has advanced dramatically in the past four decades, even if some still question the data. After all, people once widely believed the earth was flat, and now we know definitively it’s not. (Though even on that subject, some harbor doubts.)
People can easily spend – and no doubt have spent – years of their lives researching how best to communicate complicated scientific information to uninformed, skeptical and possibly hostile audiences. (Yes, the “science of communicating science” is a thing.) Add to that a current environment in which fake news dominates and simple truths seem to be subject to interpretation, and the job becomes even more complex.
While we cannot offer definitive solutions to the challenge of getting people to better understand science, we can offer these observations about the environment in which we do our jobs as communications professionals. It takes careful work to earn anyone’s trust these days – and for good reason, given the current media landscape.
We should all be skeptical when consuming news.
The older you get, the more “scientific data” about-faces you will have experienced in your life. The margarine vs. butter debate is the first example that comes to mind for me. Others may recall the time when smoking cigarettes was marketed as good for your health. Tobacco companies included physician endorsements in their marketing materials; how could it not be true?
A more recent example was explored in a November 2018 episode of the podcast “Undiscovered,” when hosts Elah Feder and Annie Minoff discussed a study about vitamins and the resulting news coverage. The Atlantic, among others, reported that “Vitamin B6 and B12 Supplements Appear to Cause Cancer in Men.”
However, that headline is not at all what the research concluded. Unfortunately, in this case the press release from The Ohio State University Comprehensive Cancer Center about the research was itself misleading. Read the headline and the first paragraph and you will conclude that if you’re male, you have a higher risk of lung cancer if you take high doses of B6 and B12. You have to read further for the insight in the researcher’s quote:
“Our data shows that taking high doses of B6 and B12 over a very long period of time could contribute to lung cancer incidence rates in male smokers. This is certainly a concern worthy of further evaluation.”
What the researchers actually found was that men who were current smokers who took high doses of B6 or B12 had an increased risk of lung cancer, a risk higher than the risk they already had due to being smokers. Did the researchers crunch the numbers on whether the B6 or B12 increased the risk of lung cancer in non-smokers? No. Because there weren’t any numbers to crunch.
The Undiscovered hosts briefly turned the mirror on themselves, in an acknowledgment of the responsibilities (and power) journalists wield. Was this an example of poor journalism or a misunderstanding? Unfortunately, misunderstandings do happen, even when everyone involved has the best intentions. According to the researcher who was quoted in the press release:
“Ultimately what happens though, is that no matter what I say in a three-hour phone interview…it may come down to “Brasky says these things cause cancer.” And I’m like, what? No. We had a long conversation.”
Interestingly, in a 2015 Q&A the author of The Atlantic article discussed a differentiation between “healthcare reporting” and “being voices in health media.” He said, “I want to tell good stories and to be the person that shows that these things are complex.” Unfortunately, the headline didn’t give the impression that the story was complex, and we all have short attention spans these days. For the most important information, we need to read carefully and do our own due diligence.
Trust issues relating to science and medicine are not new, and they often come from a legitimate place.
Sometimes the hurdle we need to jump as communications professionals is rooted in deep-seated experiences. Case in point: Visit an HIV clinic in one location in a city and you may see patients on the younger side who are active in their healthcare and adherent to their medications. Visit the same clinic’s other location and you may see a much sicker patient population – grown men under 100 pounds who could be well controlled on medication but aren’t, and who show up only when they are too sick to turn anywhere else.
The driver of this particular health disparity is not necessarily socioeconomic; for some, it may stem from a shared history of distrust of the healthcare system. Remember the Tuskegee experiment, which was exposed by investigative journalism 40 years after it began? The research subjects were not informed of the true nature of the study, and they were not offered treatment even when effective treatment was available. It truly ruined lives. It was also not too long ago when innocent groups became scapegoats for the HIV epidemic, and some experienced terrible treatment when seeking medical care.
Venture further into history, and we are reminded of the smallpox vaccine – the first vaccine ever created – and the compulsory vaccination campaigns in this country at the turn of the 20th century. Yes, the vaccine could inoculate you, but it was no secret that it could also kill you. And yet some people were forcibly vaccinated at gunpoint.
When communicating with certain audiences, we must keep their experiences and points of view in mind. Some traumatic events that may seem like they were long ago can live long in our memories. And, extreme pressure to vaccinate still happens in some places around the world. There are cultural perspectives people may carry with them that can’t be ignored.
Given all of this, it is no wonder that there was such an outcry about the Chinese researcher He Jiankui, who claimed he altered the genomes of newborn twins – subjects who could not possibly have consented – using CRISPR technology. The scientific and bioethics communities are watching closely, and these sorts of conversations are going to continue for some time.
Lillie Tyson Head, chairwoman of Voices for Our Fathers Legacy Foundation, discusses the Tuskegee study, in which her father was enrolled:
“The fact that the study was carried out by ‘professionals that had the code of ethics and the swearing of healing and helping and aiding’ has made her think twice about trusting health care providers, she said. The study ended decades ago, but Head said ‘it still has an impact on how your feelings are and how your trust is toward (health) professionals,’ adding she sometimes feels apprehension about whether she is being told the truth.”
Source: 2017 article in USA Today
Bring empathy to the work that we do, and the language that we use.
Many experts agree – don’t call people “science deniers.” Although I am thoroughly unaligned with the viewpoints of “anti-vaxxers” and “climate change deniers,” we won’t have a productive dialogue or reach certain groups with our messages if we use unproductive language or labels. For example, some of these “anti-vaxxers” are nervous, concerned parents who want to do the best for their children. There’s a good chance they’ve already been shut down by their pediatricians: many clinicians don’t have – or won’t take – the time to discuss in depth something they believe to be as “case closed” (based on training and scientific evidence) as whether children should be vaccinated.
Of course, research shows that having a productive dialogue often isn’t possible (see the reams of studies and whole books on topics such as confirmation bias). But we owe it to people to not turn them into the “other” with our language. Being careful with our language, and empathetic in our communications, can only help, not hurt.
It’s our job to make sure that we – and our clients – do not contribute to the problem.
While we cannot necessarily force our clients, bosses and/or teams to do anything, we can and should guide them and explain why certain statements or language might lead to unintended consequences. We may not be bioethicists, but we need to play the role of communication ethicist. We need to do our part to make sure scientific information is being communicated clearly, completely and accurately. What “clearly” means should be defined in terms of your specific target audience and the language and level of detail appropriate for them. “Completely” means with context, not ignoring the larger picture if there is one. “Accurately” means factually correct, and presented in whatever way gives your target audience the best chance of understanding the information – and its big picture – as intended. Unfortunately, some common tools like press releases lack the structure to provide much context. We therefore need to take the time to provide thorough briefings to reporters and other stakeholders, to share complete data and supporting information where appropriate and to be transparent about the broader conversation on a topic, making suggestions for how they might gather multiple points of view.
If we ever think that a client is misguiding its stakeholders and/or being less than transparent, it is time to take a hard look at whether they should remain a client. When the communications team fails, there are people like journalist Adam Feuerstein writing articles like “Acadia Pharma conjures black magic to spin a failed depression drug trial” and “Beyond the headline: Mirati undersells concerning data on its lung cancer immunotherapy.” We appreciate our journalists doing this critical work. But if the communications team were doing its job well, no such articles would need to be written.
In media relations, strive for high standards on both sides.
We can’t do this alone. Even if we take all the right steps described above, we can’t write the headlines or the articles. We need some help from our journalist friends:
- Present information in context. Don’t just share what the press release reported; look for past news on the topic and compare and contrast, as appropriate.
- Talk to the investigators. If something in a research publication/abstract and/or press release doesn’t seem quite right, request clarification. Secure additional peer review/perspectives if possible.
- Translate things like p-values and sample sizes into terms your audience can understand. If you can’t explain it completely, at least note that there is more to the story.
- Explore alternative viewpoints. Especially for controversial topics, look to people who may have switched sides on a complicated issue and provide both points of view. These are people who have likely done a lot of the work understanding and weighing the opposing arguments.
- Present information in a forum appropriate for that information. Don’t spend five minutes discussing something that requires an hour. Don’t spend 500 words on something that requires 2,500.
Not to oversimplify, but the common themes in all of the above are critical analysis, listening and empathy. When misguided or flat-out untrue information is communicated as truth, it can have devastating impacts that take a very long time to undo. Now more than ever, we need to take the time to truly understand our audiences and tailor our communications to the people we are educating, informing and inspiring – to avoid miscommunication wherever possible and tell a complete story.