A PhD in Science Communication


I’ve always been fascinated by the human body and how it works. I used to stay up past my bedtime, poring over my grandparents’ medical textbooks by torchlight under the covers. In high school, I went to all the optional sexual health sessions and reported the intel back to my shyer classmates. For my undergraduate degree, I majored in human anatomy, but I also took classes in biochemistry, microbiology, epidemiology, and bioanthropology. Studying human health science made me happy, but it wasn’t long before I started becoming disillusioned with the gulf between what I was doing in the lab and the interactions I was having outside of the “ivory tower.”

I’d spend the afternoon lovingly tending to the stem cells I was using in an experiment, then spend the evening listening to some bloke at the pub telling me that actually he drives better after a few drinks. A family member approached me for advice on her baby’s health but adamantly refused to get them vaccinated. A classmate told me that they never finish a course of antibiotics, choosing instead to save a few days’ worth of capsules to self-medicate with when they get a cold. A primary school teacher proudly boasted to me that he never eats “anything with chemicals in it,” leaving me to wonder how he has survived this long without downing any dihydrogen monoxide (H2O).

Studying science communication as a PhD student

Fortunately, by the time I was ready to graduate with my BSc in Anatomy, the University of Otago had established the Centre for Science Communication, and I was accepted as a grad student. I chose to focus on misinformation about sexual and reproductive health, as the stigma around sexual and reproductive health issues creates conditions under which misinformation can flourish. I’ve never forgotten all the anxious, whispered questions from friends and classmates back in high school, and it’s oddly fitting that I’ve ended up (again) dispelling the same myths and misconceptions fifteen years later.

What does that look like? 

One of my more recent studies looked at how people perceive and evaluate peer-reviewed sources of information depending on whether those sources affirm or challenge their beliefs.

I asked participants to read short abstracts comparing the mortality rates of two controversial health issues. One group read the text from a real, peer-reviewed study. The second group read the same abstract but with the names of the two conditions reversed, meaning that the findings of the study were also reversed. The third group read a version of the abstract in which the real medical conditions had been swapped for made-up illnesses. Participants were then asked to evaluate the quality of the study each abstract represented. The point was to see whether their evaluations of the abstracts were driven by the quality of the study design or by their preconceived ideas about the controversial medical issues.

Participants gave significantly more favourable ratings to the abstracts that affirmed their personal beliefs even though all three texts were otherwise identical. Likewise, when participants read an abstract for a study with findings that challenged their beliefs, they were much more likely to rate that study as untrustworthy, inaccurate, methodologically flawed, and biased. As a result, the control abstract comparing two fictional medical issues – chronic and acute “adeolitis” – received the best ratings overall, simply because none of the participants had preconceived opinions about “adeolitis.”

The participants in my study were exhibiting a phenomenon called “motivated reasoning,” the catchall term for a range of neat little tricks our minds perform to avoid cognitive dissonance (Kunda, 1990). More specifically, they were engaging in a behaviour called the “biased evaluation of evidence,” in which people (consciously or unconsciously) pick apart sources of information they don’t like, looking for reasons to dismiss them. There’s evidence to show that people with greater levels of science literacy are especially prone to this, possibly because a greater familiarity with scientific methodology makes it easier to spot even minor weaknesses or limitations in a piece of research (Drummond & Fischhoff, 2017; Kahan et al., 2012).

Scicomm in the community


During my PhD, I’ve also found opportunities for more hands-on science communication. Between New Zealand’s multiple Covid-19 lockdowns, I held mask-making workshops in my community. I brought along printed copies of studies comparing the filtration efficacy of different materials so that we could have discussions about the pros and cons of different fabrics while we worked. We also had discussions about the difference between airborne and aerosolised droplet transmission and how different fabrics and mask designs could work to slow the spread of illness. I later helped run workshops where I showed people how they can make reusable cloth menstrual pads while we talked about the environmental impact of disposable products.

A great deal of scientific research is funded by the public and performed by researchers who received their education from public schools and universities. However, much of the information gained from that research remains inaccessible, locked behind paywalls and written in technical jargon that even scientists in neighbouring disciplines might struggle to understand. Why should we trust something that we cannot read or verify for ourselves? Further, when science is communicated in an accessible format, that communication often neglects the emotional and cultural contexts that shape what people know and believe about the world. It is vital to understand what people know – or, at the very least, believe – to be true so that we can tailor our communications to better meet the needs of the people we are talking to.


Emma Harcourt is a PhD student in the Department of Science Communication at the University of Otago. Her research focuses on misinformation about sexual and reproductive health. 

References and Resources

References

Drummond, C., & Fischhoff, B. (2017). Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Proceedings of the National Academy of Sciences of the United States of America, 114(36), 9587-9592. doi:10.1073/pnas.1704882114

Kahan, D., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735. doi:10.1038/nclimate1547

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480-498.

More resources on the Addgene blog

Summer SciComm Series: Masters of SciComm
Summer Scicomm Series: Modes of Communication
Careers in Science Communication: Science Writing
