Scientific literacy
Description
Scientific literacy is a concept that captures an individual’s ability to engage with and understand scientific concepts and discussions. Despite the extensive literature on the subject, there is no common definition (DeBoer 2000; Laugksch 2000; Norris, Phillips, and Burns 2014; Roberts 2007). Scientific literacy is generally seen as a desirable goal, with diverging conceptions of which elements of it are desirable (DeBoer 2000; Laugksch 2000), and for which reasons (Norris, Phillips, and Burns 2014). An emerging topic in the literature is to what extent science education (and thus, scientific literacy) currently provides a sufficient response to the “age of misinformation” (Feinstein and Waddington 2020; Osborne and Pimentel 2023). Three perspectives on the concept of scientific literacy are instructive.
First, in a view proposed by Laugksch (2000) and supported by Roberts (2007), at least three stakeholder groups engage with scientific literacy: (1) sociologists, (2) public opinion researchers, and (3) science educators, including those involved in science communication. All three stakeholder groups approach the topic with their own goals and justifications for why scientific literacy matters. Consequently, they also employ different methods to measure the concept, ranging from in-depth interviews and surveys of representative population samples to assessments of students’ competencies.
Second, Laugksch (2000) and Norris, Phillips, and Burns (2014) conceptualise scientific literacy as comprising three different interpretations of what it means to be ‘literate’. The first refers to what one has learned: the specific knowledge gained. The second refers to being competent, that is, having a certain capacity to engage with scientific contents. The third refers to how scientific literacy might enable one “to function minimally in society” (Laugksch 2000, 82), that is, being able to live up to the expectations inherent to a certain role in society, such as being a competent citizen or consumer.
Third, Roberts (2007) provides a useful distinction between two “visions” of scientific literacy. Here the term “vision” is broader than a mere definition and represents an ideal type in the Weberian sense: an intentionally accentuated construct designed to highlight its core features while serving as a conceptual tool for analysis. Vision I in Roberts’ terms is concerned with literacy or knowledgeability within science, that is, it is targeted at an understanding of scientific products (publications, datasets, claims) and processes (Roberts 2007, 730). In contrast, vision II is targeted at situations where scientific knowledge can aid citizens in their daily lives: “At the extreme, this vision can be called literacy (again, read thorough knowledgeability) about science-related situations in which considerations other than science have an important place at the table” (Roberts 2007, 730). As an example, this vision is reflected in how the OECD defines scientific literacy in PISA (Roberts 2007, 766):
PISA defines scientific literacy as the ability to engage with science-related issues, and with the ideas of science, as a reflective citizen. PISA’s definition includes being able to explain phenomena scientifically, evaluate and design scientific enquiry, and interpret data and evidence scientifically. It emphasises the importance of being able to apply scientific knowledge in the context of real-life situations. (OECD 2017b, own emphasis)
A more recent summary of the existing conceptualisations of scientific literacy, including the discussion of a third “vision”, can be found in Valladares (2021).
Given the diverse perspectives and the lack of consensus on a definition, metrics to study scientific literacy should be chosen and evaluated against the specific goal of a particular study (Laugksch 2000, 88). Coppi, Fialho, and Cid (2023) provide an overview of existing measurement tools and their applications. Below, we highlight key examples of how scientific literacy is commonly assessed.
The Programme for International Student Assessment (PISA) Assessment and Analytical Framework defines scientific literacy as “the ability to engage with science-related issues, and with the ideas of science, as a reflective citizen” (OECD 2017a, 15), and includes a description of how it is measured. However, the specific questions of the scientific literacy assessment are not publicly available.
The three competencies that PISA includes in scientific literacy are (1) explaining phenomena scientifically, (2) evaluating and designing scientific enquiry, and (3) interpreting data and evidence scientifically, all of which require knowledge (content, procedural, and epistemic). Competencies and knowledge, together with contexts and attitudes, make up the four interrelated aspects of the PISA 2015 scientific literacy assessment framework.
While the PISA instrument is not publicly available, the framework includes detailed descriptions of the proportions in which different competencies, types of knowledge, depths of knowledge, and contexts should be covered by the assessment items. Example items contain scenarios that require students, for instance, to assess presented evidence, interpret data, and apply content knowledge, while responding to simple or complex multiple-choice questions or constructed-response items. For the final scoring, students are placed on a seven-level proficiency scale based on their responses to the individual items (OECD 2017a).
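To illustrate the final scoring step, the sketch below maps an already estimated PISA-style scale score onto the seven proficiency levels. This is a minimal illustration, not official PISA tooling: the actual scale scores are derived via item response theory from the item responses, the cut scores used here are the approximate (rounded) lower level boundaries for science in PISA 2015 (see OECD 2017a for the exact values), and the function is our own construction.

# Minimal sketch: placing a PISA-style science scale score on the
# seven-level proficiency scale. Cut scores are approximate (rounded)
# lower level boundaries for PISA 2015 science; see OECD (2017a) for
# the exact values. Not part of any official PISA software.

PISA_SCIENCE_LEVELS = [
    ("1b", 261),
    ("1a", 335),
    ("2", 410),
    ("3", 484),
    ("4", 559),
    ("5", 633),
    ("6", 708),
]

def proficiency_level(scale_score: float) -> str:
    """Return the highest level whose lower boundary the score reaches."""
    level = "below 1b"
    for name, lower_bound in PISA_SCIENCE_LEVELS:
        if scale_score >= lower_bound:
            level = name
    return level

print(proficiency_level(505))  # -> "3"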
Besides the PISA instrument, there is a great variety of other instruments used to assess scientific literacy, commonly aligned with vision II and the PISA conceptualisation. In general, the focus seems to have shifted from viewing scientific literacy as a single ability to defining it as a multidimensional construct. Most of the instruments focus on a specific domain or context (e.g., biology) and employ various formats to test skills. The majority of instruments have been developed for use with (secondary school) students and have mainly been employed in that context (Coppi, Fialho, and Cid 2023; Opitz, Heene, and Fischer 2017).
A relatively recent and broadly used instrument is the Test of Scientific Literacy Skills (TOSLS), developed as a comprehensive and psychometrically sound measure that can be employed on a large scale. The TOSLS includes 28 multiple-choice questions presented through realistic scenarios, requiring respondents to use skills related to understanding scientific methods and interpreting data (Gormally, Brickman, and Lutz 2012). The instrument is available as a supplement to Gormally, Brickman, and Lutz (2012).
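Because each TOSLS item is multiple choice with a single correct answer, the raw score is simply the number of correct responses out of 28. The sketch below, in the same spirit as the one above, shows such a scoring routine; the answer key and responses here are hypothetical placeholders, since the actual items and key are only available in the supplement to Gormally, Brickman, and Lutz (2012).

# Minimal sketch of scoring a TOSLS-style instrument: 28 multiple-choice
# items, one point per correct answer. ANSWER_KEY is a hypothetical
# placeholder; the real key accompanies Gormally, Brickman, and Lutz (2012).

ANSWER_KEY = {item: "A" for item in range(1, 29)}  # placeholder key

def tosls_score(responses: dict[int, str]) -> int:
    """Number of correct answers out of the 28 items."""
    return sum(
        1 for item, key in ANSWER_KEY.items()
        if responses.get(item) == key
    )

# Example: a respondent who answers "A" on the first 20 items only.
responses = {item: "A" for item in range(1, 21)}
print(f"{tosls_score(responses)}/28")  # -> 20/28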
References
Citation
@online{apartis2024,
author = {Apartis, S. and Catalano, G. and Consiglio, G. and Costas,
R. and Delugas, E. and Dulong de Rosnay, M. and Grypari, I. and
Karasz, I. and Klebel, Thomas and Kormann, E. and Manola, N. and
Papageorgiou, H. and Seminaroti, E. and Stavropoulos, P. and Stoy,
L. and Traag, V.A. and van Leeuwen, T. and Venturini, T. and
Vignetti, S. and Waltman, L. and Willemse, T.},
title = {Open {Science} {Impact} {Indicator} {Handbook}},
date = {2024},
url = {https://handbook.pathos-project.eu/sections/3_societal_impact/scientific_literacy.html},
doi = {10.5281/zenodo.14538442},
langid = {en}
}