By Aviv J. Sharon, Jim Ryder, Jonathan Osborne, Esther Laslo & Hani Swirski

Individuals need to be "science literate" to cope with many demands of everyday life, and much of their science literacy stems from what they learned at school. These claims are hardly new, either in science education or in science communication.
Still, the term "science literacy" might sound old-fashioned to some readers, and even evoke associations of a top-down, paternalistic relationship between scientists and publics. In science education and other "learning orientation" disciplines, by contrast, the term is still used to describe the set of knowledge and competencies needed to engage with science.
Perhaps it's time for some reassessment? Arguably, scholars can help diverse individuals, publics and social groups apply scientific reasoning in everyday life without being patronizing or treating people inequitably. People, too, might want to know about science and scientific concepts, especially when they evaluate putative scientific claims appearing in online and other media. These claims can address anything from routine childhood vaccines to the latest fad diet, and appear in anything from a viral e-mail to a sponsored Facebook advertisement. Unreliable science appeared in public communications of science and technology long before the internet, but the internet has made content cheap to produce and to distribute widely, and traditional gatekeepers, such as editors, are often absent from the process. This contemporary context has led to an abundance of highly accessible online content of varying quality.
What then do people need to know and be able to do to evaluate science-related information whether offline or online? We would like to suggest some rules of thumb or heuristics which could inform the education of individuals, publics and social groups, the design of educational activities and science communication projects, and support ‘scientific literacy’ in the online age.
What is the authority of the writer? Do they have expertise in the field? What is the nature of the institution that they come from?

Science educators and science communicators should encourage media consumers to ask: Is the identity of the writer clear at all? If it is, a process of judgment about the writer as a source should then start. Would you rely without hesitation on a geologist claiming to have unique insights on wisdom teeth? How about a dentist's insights into volcanology?
Research shows that, in many contexts, people are most willing to trust those they perceive to have pertinent expertise. The same applies to trust in institutions. Interestingly, it is not clear how much school science instruction improves children's pre-existing reasoning skills for making judgments about expertise. For instance, the epistemic importance of peer review within the scientific community is rarely taught in formal education. As Longino explains, scientific debates are necessary for creating reliable scientific knowledge.
With self-styled "experts" on science and health seemingly popping up everywhere, it is important to reflect on how to judge who is an expert. This is a complex issue, but we recommend this interview with sociologist of science Harry Collins in Scientific American as a gentle introduction to the nature of expertise.
What is the authority of the venue/website/institution that is the source? Does the writer have potential ulterior motives for the piece?

The same criteria that can be used to evaluate the trustworthiness of experts and institutions can also be used to evaluate online media, including blogs and Facebook pages. "Integrity" and "benevolence" are the key criteria here, and they refer to the writer's, or the source's, inclination to:
- follow professional standards;
- act with the recipient’s interests in mind, or, more generally, to act in the greater interests of society.
What evidence is presented to substantiate the argument/claims? Are there multiple pieces of evidence? Is there any evidence of peer review?

Of course, it is also important to look closely at what is being said, and at the evidence for the claims put forward, rather than only at the speaker. Fortunately, one does not need to be an expert in every scientific domain to do this. For example, Ryder suggests that it can be very helpful to understand the strengths and weaknesses of different study designs, or to be familiar with the concept of peer review. These hallmarks of scholarship are often similar across scientific disciplines. Still, when evaluating scientific evidence, we should do so with intellectual humility, recognizing our own ignorance of the wide range of scientific methods and procedures and avoiding overestimating our own competence. Science educators and science communicators should emphasize these procedural and epistemic aspects of science when communicating with audiences: how the knowledge was derived, whether it fits with other findings, and how much confidence we might place in it.
To what extent does the title match the content? Is the heading sensationalist? Do the visuals support the text?

How many times have you seen a sensational headline in the news beginning with the words "Researchers claim…" and ending with an exclamation mark, e.g. "Researchers claim: sugar is a dangerous toxin!"? Should we believe such headlines?
Headlines can be confusing and scary. In fact, Freimuth, Greenberg, DeWitt and Romano analyzed over 3000 cancer stories published in U.S. newspaper articles and found that half of the headlines were fear-arousing (e.g., "No way to avoid cancer-causing agents"; "Tap water linked to cancer"). We suspect the situation may be similar or worse in online news outlets, due to advertising-driven business models. Website editors must "chase online traffic" to make ends meet, and they rely heavily on data from web analytics in this effort. Such fine-grained data about news consumption was simply not available before the internet, and reliance on such data is likely to incentivize the use of sensationalistic – as well as "clickbait" – headlines to drive more traffic to the site.
If the headline barely touches the subject presented in the text, that should raise a red flag. An excellent educational resource for identifying sensationalism is Mind over Media by the Media Education Lab at the University of Rhode Island. Science educators and science communicators should further develop such "scientific media education" resources and integrate them into their work. We also recommend that practitioners in both professions address the active evaluation of online sources in particular, since these sources are so pervasive.
Is the work part of a program of research? How does it contribute to or build on previous lines of work?

While there is no dispute among scientists that humans are responsible for global warming, a handful of studies highlighting the role of solar activity in climate change caused a media frenzy. Similarly, a study claiming that the measles, mumps, and rubella (MMR) vaccine caused autism continues to alarm parents through social media, although it has long since been discredited. Although minority opinions are sometimes correct, these cases show that we should be careful not to cast needless doubt on well-established bodies of existing knowledge. The truth is that most one-off studies turn out to be false. Science advances by making mistakes, but it can take time to recognize a mistake as such. For some discussion of one-off studies and the unreliable media coverage they often receive, see John Oliver's segment on health risks from Last Week Tonight and Christie Aschwanden's fascinating article about nutritional science on FiveThirtyEight.
In sum, science communicators and science educators should encourage non-scientists to approach putative scientific claims skeptically, to crosscheck information and base conclusions on a large number of studies, or to trust qualified experts. Crosschecking should attend to both the quantity and the quality of the evidence. This is no easy task even for an expert, so non-experts should take extra care when making such judgments on their own.
Are comments permitted? Are counterarguments considered or allowed?

Reader comments are one cue that can help in evaluating online scientific information. They can reveal some of the meaning readers ascribe to science issues, and can expand readers' interpretation of online information. Most comments are authentic, and voice an opinion, elaborate, or correct a perceived error. Readers may want to convince others and influence public opinion, and therefore a variety of ideas are expressed in reader comments, many of them expanding on the original content of the article.
On the one hand, journalists value reader comments and even use them as a source for follow-up stories. On the other hand, some criticize their low standards of expression. On many sites, reader comments tend to be signed with a pseudonym and, if not filtered by the editorial staff, may contain low-quality content such as defamation, incitement, abuse, and even racism and hate speech. Uncivil reader comments contribute to the polarization of perceptions about an issue and can change readers' interpretation of a news story. Consequently, PopularScience.com, for example, decided to shut down its comments section entirely. More recent evidence suggests that if site owners moderate posts and provide notices such as "this comment has been removed by site moderators," this could alleviate the effects of uncivil comments on perceptions of news bias.
However, readers should be aware of the comments' filtering process. Are comments moderated merely based on civility? Might the filtering process be biased?
What is my stance towards what is written? Am I likely to be biased, one way or the other?

Making informed decisions or forming an opinion on science-related issues can be hard. It requires scientific skills, such as the ability to weigh claims and evidence using one's understanding of science and how it is developed. However, individuals' beliefs or stances regarding science-related issues can be influenced by personal values, cultural identity, worldviews, risk perceptions, and political outlook. Kahan maintains that the question "whose side are you on?" has a much greater impact in this context than "what do you know?". Forming an informed stance, then, requires awareness not only of the scientific evidence, but also of the personal, cultural, and social influences that shape one's interpretation of scientific information. It also requires a willingness to listen to opposing views and to evaluate why you think they might be wrong. Doing so depends on the epistemic virtue of open-mindedness, one of the goals that science education and science communication should promote.
To conclude, we have offered seven rules of thumb for the evaluation of science-related information online, and based them on some ideas and literature from science education and science communication. We hope they prove useful for practitioners and scholars in both fields. We also hope they provoke further discussion in the comments section of this blog post and elsewhere – both online and offline.
This blog post is a product of PESO 2017 – Public Engagement with Science Online, an interdisciplinary, international research workshop of the Israel Science Foundation, exploring interactions between sciences, publics and social media. It took place at the Technion – Israel Institute of Technology from June 25th to June 28th, 2017.
Aviv J. Sharon is a Ph.D. student at the Faculty of Education in Technology and Science at the Technion – Israel Institute of Technology. His research interests lie in the interface between science education and science communication. More specifically, his work examines enactments of scientific literacy in authentic online environments, especially in the context of personal health. His work has appeared in Public Understanding of Science and PLOS One. He has also taught biology and biotechnology at a public high school in Haifa, Israel.
Jim Ryder is the Editor of the international research review journal Studies in Science Education, and Secretary of the European Science Education Research Association (ESERA). He is also a Fellow of the Institute of Physics. At the University of Leeds he is the Director of the Centre for Studies in Science and Mathematics Education (CSSME), and Leader of the Teaching and Learning Academic Group. His research examines attempts to develop a school science education that supports people’s engagement with science outside of formal schooling. This includes a focus on teaching and learning about the ‘nature of science’.
Jonathan Osborne is Kamalachari Professor of Science Education at Stanford University and the Chair of the OECD PISA Science Expert Group. His research interests include Classroom Dynamics, Curriculum and Instruction, Science Education, and Women in Science.
Esti Laslo is a faculty member in the Department of Medical Laboratory Sciences at the Zefat Academic College. Her Ph.D. (Science Education, Technion) concerned bioethics in the media and examined the scientific and ethical tools used by the public when addressing realistic controversial issues at the intersection between science and ethics. She has taught biology and trained biology teachers for many years and is currently developing a high school microbiology textbook.
Hani Swirski is a Ph.D. student at the Faculty of Education in Technology and Science at the Technion – Israel Institute of Technology. Her research deals with students' interest in science. Using questions that students bring up in formal environments (e.g., science classrooms) and informal environments (e.g., exhibitions and ask-an-expert websites), she identified common interests in science across different groups of learners and studied their stability over time. In addition, she examined which sources of questions may be useful for teachers and decision makers seeking to integrate the student voice into the science curriculum. Hani has taught science at an elementary school in northern Israel and trained teachers to integrate ICT tools in pedagogy.