People who trust science are more likely to be fooled into believing and spreading pseudoscience, finds a new article in the Journal of Experimental Social Psychology.
Pseudoscience is false information that invokes science in a broad sense or refers to scientific terms, research or phenomena. Over the course of four experiments, the researchers asked American adults to read news articles created for this study that focused on one of two topics: a dangerous virus created as a biological weapon or the carcinogenic effects of genetically modified organisms, or GMOs.
The experiments reveal that study participants who indicated they had higher levels of confidence in science were more likely to believe the fictitious account if it contained scientific references. These people were also more likely to agree that stories highlighting pseudoscience should be shared with others.
On the other hand, people who demonstrated a better understanding of scientific methods were less likely to believe what they read or to say it should be shared, whether or not the information was attributed to science.
The researchers note that their findings conflict with ongoing campaigns to promote confidence in science as a way to fight misinformation about the COVID-19 pandemic, mask wearing and COVID-19 vaccines. Confidence in science alone is insufficient, says the article’s lead author, Thomas C. O’Brien, a social psychologist who studies conflict resolution and trust in institutions, most recently at the University of Illinois Urbana-Champaign.
“It is important to note that the conclusion of our research is not that trust in science is risky, but rather that, applied broadly, trust in science can make people vulnerable to belief in pseudoscience,” write O’Brien and his co-authors Dolores Albarracín, a psychologist recently appointed director of the Science of Science Communication division at the Annenberg Public Policy Center at the University of Pennsylvania, and Ryan Palmer, a former graduate researcher at the University of Illinois Urbana-Champaign.
The researchers note that it is difficult for the lay public to fully understand complex topics such as the origins of a virus or how GMOs might affect public health. They suggest that a more sustainable solution to tackling disinformation is to help the public develop a type of scientific literacy known as methodological literacy. People who understand scientific methods and research designs can better assess claims about science and research, they explain.
In an email interview, Albarracín pointed out that blanket confidence in science can lead people who would otherwise dismiss conspiracy theories to believe them if they are presented alongside scientific content, such as quotes from scientists and references to academic studies.
She added that skepticism is a healthy and essential part of science.
“The solution to climate change denial, irrational fears of GMOs or reluctance to vaccinate is not to preach trust in science,” wrote Albarracín, recently appointed a Penn Integrates Knowledge University Professor.
“Trust in science has a critical role to play in increasing public support for science funding, improving science education, and separating trustworthy from untrustworthy sources,” she continued. “However, trust in science does not cure all ills and can create susceptibility to pseudoscience if trusting means not being critical.”
How they conducted the study
To test whether trust in science makes people more susceptible to pseudoscience, the researchers conducted four experiments with nearly 2,000 American adults in total. They recruited volunteers for two experiments through Amazon Mechanical Turk, an online crowdsourcing platform. Survey research firm Dynata provided samples for the other two experiments.
Several hundred people participated in each experiment. All four began and ended in 2020, lasting from two days to just over a week.
For each experiment, the researchers randomly assigned study participants to read a news article and complete an online questionnaire asking them, among other things, whether they believed the article and whether it should be shared with others.
In one of the articles, scientists from prominent universities claim that the fictitious “Valza virus” was created in a laboratory and that the US government hid its role in its creation as a biological weapon. Another story mentions a real study supporting the idea that mice develop tumors after eating GMOs, but does not note that the paper was retracted in 2013. For comparison, the researchers asked some people to read versions of the news stories that featured activists, not scientists or research, as sources of information.
To gauge participants’ level of confidence in science, the researchers asked them to indicate whether they agreed with statements such as “Scientists usually act truthfully and rarely forge results” and “The Bible provides a more solid foundation for understanding the world than science does.”
People also answered multiple choice questions designed to measure their understanding of scientific methodology.
In the final experiment, participants responded to a writing prompt intended to put them in a certain state of mind before reading the article assigned to them.
One of the writing prompts was designed to put people in a “trust in science” mindset. It asked them to give three examples of how science had saved lives or otherwise benefited mankind.
Another prompt, aimed at inducing a “critical evaluation mindset,” asked participants to give examples of people needing to “think for themselves and not blindly trust what the media or other sources tell them.”
The remaining prompt, created purely for comparison purposes, focused on unusual or interesting landscapes.
The results suggest that respondents who considered the benefits of science before reading their article were more likely to believe pseudo-scientific claims than respondents who gave examples of people who need to think for themselves.
An important limitation
O’Brien and Albarracín noted that study participants were not given the opportunity to verify the article they were reading against other sources. In real life, some participants may have tried to verify the claims by, for example, comparing their report to coverage from other media.
Albarracín wrote in her email that careful source-checkers would have discovered that the GMO study mentioned in one of the stories had been retracted by the journal that published it. According to the journal’s retraction statement, a closer examination of the study’s details revealed that firm conclusions could not be drawn because of the small sample size.
Suggestions for journalists
The study’s findings have important implications for newsrooms.
Albarracín encouraged reporters covering the research to describe their process for evaluating the quality of the research. Journalists can also explain how the design of a research study and the scientific methods used may have influenced the results, she wrote.
Doing these things can help the public learn how to assess scientific claims. Journalists “could systematically report the doubts and uncertainties as well as the strengths and weaknesses of the method, in order to model this thought process for their audience,” Albarracín wrote.
It would help, she added, for journalists to write more about the distinction between science and pseudoscience.
O’Brien urges journalists to learn the terms they frequently come across in academic literature but don’t understand. Doing so will help them better grasp the research and explain it to their audiences.
“Like, what does ‘randomization’ mean?” he asks. “What is statistical power, and what does it mean to have converging evidence? And what is peer review, and [what are] the limitations of peer review? These are certainly things that should be of interest to journalists.”