Scientific literacy

Authors

T. Klebel (Know-Center)
E. Kormann (Graz University of Technology)

Version | Revision date | Revision | Author
1.0 | 2024-12-06 | General revisions | Thomas Klebel
0.3 | 2024-12-03 | Extending sections on measurement | Eva Kormann
0.2 | 2024-02-27 | Review comments | Nicki Lisa Cole, Vincent Traag, Ioanna Grypari, Tommaso Venturi
0.1 | 2024-02-02 | First draft | Thomas Klebel

Description

Scientific literacy is a concept aimed at measuring an individual’s ability to engage with and understand scientific concepts and discussions. Despite the extensive literature on the subject, there is no common definition (DeBoer 2000; Laugksch 2000; Norris, Phillips, and Burns 2014; Roberts 2007). Scientific literacy is generally seen as a desirable goal, but conceptions diverge on which of its elements are desirable (DeBoer 2000; Laugksch 2000), and for which reasons (Norris, Phillips, and Burns 2014). An emergent topic in the literature is the extent to which science education (and thus scientific literacy) currently provides a sufficient response to the “age of misinformation” (Feinstein and Waddington 2020; Osborne and Pimentel 2023). Three perspectives on the concept of scientific literacy are instructive.

First, in a view proposed by Laugksch (2000) and supported by Roberts (2007), there are at least three stakeholder groups engaged with scientific literacy: (1) sociologists, (2) public opinion researchers, and (3) science educators, including those involved in science communication. All three stakeholder groups approach the topic with their own goals and justifications for why scientific literacy matters. Consequently, they also employ different methods to measure the concept, ranging from in-depth interviews, to surveys of representative population samples, to assessments of students’ competencies.

Second, Laugksch (2000) and Norris, Phillips, and Burns (2014) conceptualise scientific literacy as comprising three different interpretations of what it means to be ‘literate’. The first refers to what one has learned – the specific knowledge gained. The second refers to being competent, that is, having a certain capacity to engage with scientific content. The third refers to how scientific literacy might enable one “to function minimally in society” (Laugksch 2000, 82), that is, being able to live up to the expectations inherent to a certain role in society, such as being a competent citizen or consumer.

Third, Roberts (2007) provides a useful distinction between two “visions” of scientific literacy. Here the term “vision” is broader than a mere definition and represents an ideal type in the Weberian sense—an intentionally accentuated construct designed to highlight its core features while serving as a conceptual tool for analysis. Vision I in Roberts’ terms is concerned with literacy or knowledgeability within science, that is, it is targeted at an understanding of scientific products (publications, datasets, claims) and processes (Roberts 2007, 730). In contrast, vision II is targeted at situations where scientific knowledge can aid citizens in their daily lives: “At the extreme, this vision can be called literacy (again, read thorough knowledgeability) about science-related situations in which considerations other than science have an important place at the table” (Roberts 2007, 730). As an example, this vision is reflected in how the OECD defines scientific literacy in PISA (Roberts 2007, 766):

PISA defines scientific literacy as the ability to engage with science-related issues, and with the ideas of science, as a reflective citizen. PISA’s definition includes being able to explain phenomena scientifically, evaluate and design scientific enquiry, and interpret data and evidence scientifically. It emphasises the importance of being able to apply scientific knowledge in the context of real-life situations. (OECD 2017b, own emphasis)

A more recent summary of the existing conceptualisations of scientific literacy, including the discussion of a third “vision”, can be found in Valladares (2021).

Given the diverse perspectives and the lack of consensus on a definition, metrics to study scientific literacy should be chosen and evaluated against the specific goal of a particular study (Laugksch 2000, 88). Coppi, Fialho, and Cid (2023) provide an overview of existing measurement tools and their applications. Below, we highlight key examples of how scientific literacy is commonly assessed.

The Programme for International Student Assessment (PISA) Assessment and Analytical Framework defines scientific literacy as “the ability to engage with science-related issues, and with the ideas of science, as a reflective citizen” (OECD 2017a, 15), and includes a description of how it is measured. However, the specific questions and results from the survey on scientific literacy are not publicly available.

The three competencies that PISA includes in scientific literacy are (1) explaining phenomena scientifically, (2) evaluating and designing scientific inquiry, and (3) interpreting data and evidence scientifically, all of which require knowledge (content, procedural, and epistemic). Competencies and knowledge, together with contexts and attitudes, make up the four interrelated aspects of the PISA 2015 scientific literacy assessment framework.

While the PISA instrument itself is not publicly available, the framework includes detailed descriptions of the proportions at which the different competencies, types of knowledge, depths of knowledge, and contexts should be covered by the assessment items. Exemplary items present scenarios that require students, for example, to assess presented evidence, understand data, and apply content knowledge, while responding to simple or complex multiple-choice questions or constructed-response items. For the final scoring, students are placed on a seven-level proficiency scale based on their responses to the individual items (OECD 2017a).
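
As an illustration of this final scoring step, the minimal sketch below (in Python) places a continuous science scale score onto a seven-level, PISA-style proficiency band. Because the official instrument and cut scores are not reproduced here, the thresholds, level labels, and function name are illustrative assumptions, not the actual PISA scoring procedure.

# Minimal sketch: placing a continuous scale score on a seven-level,
# PISA-style proficiency scale. The boundaries below are illustrative
# placeholders, NOT the official PISA 2015 cut scores.
from bisect import bisect_right

LEVEL_BOUNDS = [260, 335, 410, 484, 559, 633, 708]   # hypothetical lower bounds
LEVEL_NAMES = ["below 1b", "1b", "1a", "2", "3", "4", "5", "6"]

def proficiency_level(scale_score: float) -> str:
    """Return the proficiency band a given scale score falls into."""
    return LEVEL_NAMES[bisect_right(LEVEL_BOUNDS, scale_score)]

# Example: a score of 560 would fall into (hypothetical) level 4.
print(proficiency_level(560))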

Besides the PISA instrument, a great variety of other instruments is used to assess scientific literacy, commonly aligned with vision II and the PISA conceptualisation. In general, the focus seems to have shifted from viewing scientific literacy as a single ability towards defining it as a multidimensional construct. Most instruments focus on a specific domain or context (e.g., biology) and employ various formats to test skills. The majority of instruments have been developed for use with (secondary school) students and have mainly been employed in that context (Coppi, Fialho, and Cid 2023; Opitz, Heene, and Fischer 2017).

A relatively recent and broadly used instrument is the Test of Scientific Literacy Skills (TOSLS), developed to be a comprehensive and psychometrically sound measure that can be employed on a large scale. The TOSLS includes 28 multiple-choice questions presented through realistic scenarios and requiring respondents to use skills that relate to understanding scientific methods and interpreting data (Gormally, Brickman, and Lutz 2012). The instrument is available as the supplement to Gormally, Brickman, and Lutz (2012).
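
Since the TOSLS is a fixed set of 28 multiple-choice items, a simple aggregate score can be computed as the number or proportion of correct responses. The sketch below assumes a purely hypothetical answer key (the real key accompanies the supplement to Gormally, Brickman, and Lutz 2012) and is meant only to show the aggregation logic, not the published scoring guidance.

# Minimal sketch: scoring a TOSLS-style test of 28 multiple-choice items.
# The answer key is a hypothetical placeholder, not the actual TOSLS key.
ANSWER_KEY = {i: "a" for i in range(1, 29)}  # item number -> correct option

def score_responses(responses: dict[int, str]) -> dict[str, float]:
    """Return the number and proportion of correct answers for one respondent."""
    correct = sum(1 for item, key in ANSWER_KEY.items() if responses.get(item) == key)
    return {"n_correct": correct, "proportion_correct": correct / len(ANSWER_KEY)}

# Example: a respondent who answered "a" on the first 20 items only.
print(score_responses({i: "a" for i in range(1, 21)}))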

References

Coppi, Marcelo, Isabel Fialho, and Marília Cid. 2023. “Scientific Literacy Assessment Instruments: A Systematic Literature Review.” Educação Em Revista 39: e37523. https://doi.org/10.1590/0102-4698237523-t.
DeBoer, George E. 2000. “Scientific Literacy: Another Look at Its Historical and Contemporary Meanings and Its Relationship to Science Education Reform.” Journal of Research in Science Teaching 37 (6): 582–601. https://doi.org/10.1002/1098-2736(200008)37:6<582::AID-TEA5>3.0.CO;2-L.
Feinstein, Noah Weeth, and David Isaac Waddington. 2020. “Individual Truth Judgments or Purposeful, Collective Sensemaking? Rethinking Science Education’s Response to the Post-Truth Era.” Educational Psychologist 55 (3): 155–66. https://doi.org/10.1080/00461520.2020.1780130.
Gormally, Cara, Peggy Brickman, and Mary Lutz. 2012. “Developing a Test of Scientific Literacy Skills (TOSLS): Measuring Undergraduates’ Evaluation of Scientific Information and Arguments.” CBE-Life Sciences Education 11 (4): 364–77. https://doi.org/10.1187/cbe.12-03-0026.
Laugksch, Ruediger C. 2000. “Scientific Literacy: A Conceptual Overview.” Science Education 84 (1): 71–94. https://doi.org/10.1002/(SICI)1098-237X(200001)84:1<71::AID-SCE6>3.0.CO;2-C.
Norris, Stephen P., Linda M. Phillips, and David P. Burns. 2014. “Conceptions of Scientific Literacy: Identifying and Evaluating Their Programmatic Elements.” In International Handbook of Research in History, Philosophy and Science Teaching, edited by Michael R. Matthews, 1317–44. Dordrecht: Springer Netherlands. https://link.springer.com/10.1007/978-94-007-7654-8_40.
OECD. 2017a. PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic and Financial Literacy and Collaborative Problem Solving, Revised Edition. PISA. OECD. https://doi.org/10.1787/9789264281820-en.
———. 2017b. “PISA for Development Brief 10 - How Does PISA for Development Measure Scientific Literacy?” https://www.oecd.org/pisa/pisa-for-development/10%20-%20How%20PISA-D%20measures%20science%20literacy_rev.pdf.
Opitz, Ansgar, Moritz Heene, and Frank Fischer. 2017. “Measuring Scientific Reasoning – a Review of Test Instruments.” Educational Research and Evaluation 23 (3-4): 78–101. https://doi.org/10.1080/13803611.2017.1338586.
Osborne, Jonathan, and Daniel Pimentel. 2023. “Science Education in an Age of Misinformation.” Science Education 107 (3): 553–71. https://doi.org/10.1002/sce.21790.
Roberts, Douglas A. 2007. “Scientific Literacy/Science Literacy.” In Handbook of Research on Science Education, 729–80. Routledge. https://api.taylorfrancis.com/content/chapters/edit/download?identifierName=doi&identifierValue=10.4324/9780203824696-29&type=chapterpdf.
Valladares, Liliana. 2021. “Scientific Literacy and Social Transformation.” Science & Education 30 (3): 557–87. https://doi.org/10.1007/s11191-021-00205-2.

Reuse

Open Science Impact Indicator Handbook © 2024 by PathOS is licensed under CC BY 4.0

Citation

BibTeX citation:
@online{apartis2024,
  author = {Apartis, S. and Catalano, G. and Consiglio, G. and Costas,
    R. and Delugas, E. and Dulong de Rosnay, M. and Grypari, I. and
    Karasz, I. and Klebel, Thomas and Kormann, E. and Manola, N. and
    Papageorgiou, H. and Seminaroti, E. and Stavropoulos, P. and Stoy,
    L. and Traag, V.A. and van Leeuwen, T. and Venturini, T. and
    Vignetti, S. and Waltman, L. and Willemse, T.},
  title = {Open {Science} {Impact} {Indicator} {Handbook}},
  date = {2024},
  url = {https://handbook.pathos-project.eu/sections/3_societal_impact/scientific_literacy.html},
  doi = {10.5281/zenodo.14538442},
  langid = {en}
}
For attribution, please cite this work as:
Apartis, S., G. Catalano, G. Consiglio, R. Costas, E. Delugas, M. Dulong de Rosnay, I. Grypari, et al. 2024. “Open Science Impact Indicator Handbook.” Zenodo. https://doi.org/10.5281/zenodo.14538442.