Can scientific facts change opinions on controversial topics? And what role does science communication play? Arwen Cross interviewed Joanna Huxster, who researches exactly this question.
What’s the relationship between understanding science and trusting it?
If only the public understood science better … unfortunately, even that wouldn't help. This is one finding to emerge from research on science communication. The prevailing assumption in science communication used to be that if the public were only well enough informed, it would make the right decisions. Social scientists have found, however, that opinions cannot always be shifted by scientific evidence, at least not on controversial topics such as genetically modified crops, nuclear power, or climate change. The reason is confirmation bias: we believe information that supports our pre-existing opinions. How can science communicators counter this problem? We asked two experts: Joanna Huxster of Bucknell University, USA, and Craig Cormick, president of the Australian Science Communicators.
Facts don’t change minds, but maybe scientific literacy can
Scientists and the public have very different attitudes to scientific issues, as you can see in this study from Pew that asked Americans and AAAS scientists about their views. Is the attitude gap driven by differences in knowledge? If people understood science better, would they drive their cars less to reduce their carbon footprint, eat genetically modified food and support nuclear power? Probably not. The idea that people ignore scientific findings because they don't understand them is known as the deficit model of science communication. The problem with this model is that knowledge is not the key to how we see science: our values play a big role too. A green group might say people should listen to scientists' warnings about the risks of climate change, but be adamant that scientists have underestimated the risks of genetically modified crops. Their values affect which scientific information they trust, and giving them more information probably wouldn't change their minds. In fact, Dan Kahan's work has shown precisely the opposite effect: people become even more strongly attached to their views when presented with evidence that contradicts them. Psychologists call this confirmation bias.
How can science communicators avoid confirmation bias? What strategies can we use to communicate effectively with people who hold different views about science? I asked two experts: Dr Craig Cormick (Australia) and Dr Joanna Huxster (USA). In this first article, we hear from Joanna Huxster.
Dr. Joanna Huxster is an environmental studies scholar who researches Public Understanding of Science with Prof. Matthew Slater at Bucknell University, USA. At the AAAS Meeting she presented her work looking at whether having a better understanding of scientific content and methods (i.e. stronger scientific literacy) predicts greater trust in science. In this interview she shares her findings about how understanding aspects of the scientific process affects people’s attitudes to issues like climate change. She suggests how science communicators might utilise these findings in their work.
Why are you interested in public attitudes to scientific issues?
There are several reasons for my interest in public attitudes toward science. The most important is that public attitudes affect behavior at the individual, cultural, and political level, especially in the United States. There are indications of an anti-intellectual and anti-science movement here in the U.S., and so it is critical to track public attitudes towards science. Attitudes and understanding of science influence how people behave and how they vote, which in turn affects policies on healthcare, technology, and environmental protection. I am particularly interested in public understanding of and attitudes toward climate change, as I believe this is the single most pressing issue humanity faces at this time.
Could you give us a quick overview of what’s already known about how people’s understanding of science affects their attitudes towards scientific issues?
One of the goals of our work is to see if we can predict public attitudes more accurately if we measure knowledge of scientific facts and understanding of scientific processes separately.
What is the focus of your research and what does it tell us about citizens' attitudes to science?
Our research focuses on the links between trust in science, scientific literacy, and acceptance of politically or ideologically controversial scientific consensus (e.g. that climate change is caused by humans). There has been extensive work on understanding and promoting scientific literacy, but some important aspects have not been emphasized enough. Similarly, trust in science has been measured in a variety of ways, but these measures have not covered some of what we believe to be the most important reasons to trust in the scientific enterprise.
What aspects of scientific literacy are there, and which are likely to affect people’s trust in science?
According to most experts, scientific literacy includes several components. Researchers usually measure scientific literacy by asking people about their knowledge of scientific content and their understanding of scientific methodology. At school, the scientific method is usually taught at the level of individual scientists (e.g. developing a hypothesis, carrying out an experiment, etc.). Similarly, other surveys developed over the years to measure scientific literacy focus only on how science is carried out at the level of individual scientists. But science is not conducted only at the individual level. In fact, the scientific community is paramount to the process of science and thus to its credibility. The community level of the scientific enterprise includes:
- how scientists are trained
- how scientific work is published and disseminated
- how science is funded
- competition and cooperation between scientists
- social and institutional processes like peer review that check and filter scientific knowledge
We think that understanding these social aspects of science helps people understand whether a scientific consensus exists on issues like climate change. We hypothesize that if people understand the scientific enterprise, this might increase their trust in science, and might even help counteract the influence of ideology. An individual scientist is not necessarily more trustworthy than any other individual person. So if you don't understand that the scientific enterprise is designed to counter the biases and flaws of individuals through the work of the scientific community, you might not trust science. In contrast, if you understand how the scientific community reaches a consensus on a theory like evolution, you might be more inclined to accept it. Similarly, if you understand that a large grant usually does not go directly into the pocket of a scientist, you might be less inclined to think scientists are making up climate change to get rich.
Can you give us an example of a scientific issue and explain how scientific literacy is related to how concerned people are about this issue?
We recently presented preliminary research at the AAAS Annual Meeting on a survey we are developing to measure public understanding of the social structure of science. In our pilot studies we have found that understanding the social structure of science is correlated with trust in science. In our study, people who understood the scientific enterprise better were more likely to be concerned about climate change.
In previous research, Dan Kahan and colleagues have shown that increased knowledge of science (basic scientific facts, methods, and reasoning) has a polarizing effect on concern about climate change [1]. The more information people have, the more strongly they stick to their own views. Specifically, for conservative respondents, climate change concern decreases as scientific knowledge increases.
Given these results, we also separated our responses by political ideology. The good news is that in our study, a better understanding of the social structure of science didn't decrease Republican concern about climate change. Based on this initial finding, we have launched larger studies to confirm these results. If understanding the community aspects of science can counteract the tendency for people to believe more strongly in their own views when presented with conflicting information, this could have useful implications for science communication and science education.
What advice would you give science communicators based on your research?
I think this research leads to important insights for both communicators and educators. At this point it is well known that for ideologically-controversial scientific issues, just presenting scientific evidence, numbers, and graphs doesn’t make a difference to people’s opinions. Studies have repeatedly shown that ideology overrides that information. Processes like cognitive bias and motivated reasoning work to keep ideological identity intact — meaning that people selectively trust evidence that fits their ideological views and will stick to their views even when presented with contradictory evidence.
What our research shows, however, is that helping people to understand the social structure of science may increase their trust in scientific findings and consensus. For communicators, this might mean focusing less on the exciting products of scientific research — “look, a new planet!” or “check out this weird critter!” — and more on the processes responsible for the steady accumulation of scientific knowledge (like peer-review or the community scrutiny of new research). This might help the audience understand why a publication, or more importantly, a group of publications, is trustworthy. This also might mean being careful to report on the entire body of work on a scientific subject, rather than just the latest single publication.
For science educators, this means spending significantly more time explaining how the scientific enterprise works at the community level, rather than just the scientific method as conducted by an individual scientist. The old “hypothesis, experiment, new hypothesis” dogma might need to be extended to include aspects like how science is funded, how scientists check each other’s work or how groups of scientists like the IPCC provide advice on issues like climate change.
Guest contributions do not necessarily reflect the views of our editorial team.
In brief
- In general, people who are better informed about science tend to view it more positively
- For controversial topics, confirmation bias applies: scientific facts can polarize opinions further
- Confirmation bias is weaker when the scientific community, rather than research results, becomes the topic
- We make the scientific community the topic when we communicate about scientific methods, collaboration between scientists, and the funding of science