Authors

P. Stavropoulos (Athena Research Center)

I. Grypari (Athena Research Center)

N. Manola (Athena Research Center)

H. Papageorgiou (Athena Research Center)

Prevalence of replication studies

History

Version  Revision date  Revision     Author
1.1      2023-05-11     Revisions    P. Stavropoulos
1.0      2023-02-12     First draft  I. Grypari, N. Manola, H. Papageorgiou, P. Stavropoulos

Description

Reproducibility and replicability are closely related concepts in scientific research. Reproducibility refers to the ability of independent researchers to obtain consistent results when using the same data and methods: existing data are re-analyzed in a comparable manner to validate the original findings. Replicability, on the other hand, refers to the ability to repeat an entire experiment or study by collecting new data while applying methods similar to those of the original study, thereby testing the accuracy and robustness of the findings.

Reproducibility is a necessary but not sufficient condition for replicability. While reproducibility emphasizes obtaining consistent results from the same data and methods, replicability extends this by aiming to achieve the same results from newly collected data. In essence, replicability confirms the generalizability and applicability of the original research findings.

Replication studies play a crucial role in addressing the concepts of reproducibility and replicability. They are research studies that attempt to validate the findings of a prior piece of research by repeating the study using similar methods and circumstances. They can be exact or conceptual and aim to confirm the accuracy and broad applicability of the original research. Replication studies are important because they increase confidence in the findings of the original research and provide opportunities to test existing theories, hypotheses, or models.

The prevalence of replication studies is an indicator of the extent to which such studies are conducted in a particular field or scientific community. It aims to capture the adoption of Open Science practices related to reproducibility and transparency. Replication studies are important for validating and building upon existing research findings, and promoting Open Science practices can increase the reliability and credibility of scientific research.

This indicator can be used to assess the level of adoption of Open Science practices related to replication in a particular field, and to identify potential barriers or incentives for the adoption of such practices. Additionally, the prevalence of replication studies can be used to assess the impact of Open Science policies and initiatives aimed at promoting reproducibility and transparency in research.

Metrics

Number of replication studies

The number of replication studies is a metric that counts the studies that replicate previous research findings.

Limitations of this metric include the potential for biased or incomplete reporting of replication studies, as well as the possibility that some replication studies may not be identified or counted due to differences in methodology or interpretation.

Measurement

To measure the number of replication studies, besides manual screening, a suitable automatic approach is to use text mining and machine learning techniques to find candidate studies that declare themselves as replications or provide evidence of replication. Another approach is to search for a specific tag or field in relevant databases that indicates replication studies, such as the Open Science Framework’s Registered Reports database (https://osf.io/registries/discover?provider=OSF) or the Replication Wiki (http://replication.uni-goettingen.de/wiki/index.php/Main_Page). An alternative method is citation analysis: citations to the original publication are collected together with the context in which they appear, and an automated tool assesses whether the citing studies can be classified as replication studies.
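As a minimal sketch of the text-mining approach, the following Python snippet flags publications whose title or abstract contains a replication cue phrase. The cue pattern and the example records are illustrative assumptions; in practice, such a keyword screen would only produce first-pass candidates for a trained classifier or for manual validation.

```python
import re

# Cue pattern that often signals a self-declared replication study.
# The pattern is an illustrative assumption, not a validated lexicon.
REPLICATION_CUES = re.compile(r"\breplicat(?:e|es|ed|ing|ion|ions)\b", re.IGNORECASE)

def is_candidate_replication(title: str, abstract: str) -> bool:
    """Flag a publication as a candidate replication study if its
    title or abstract mentions a replication cue phrase."""
    return bool(REPLICATION_CUES.search(f"{title} {abstract}"))

# Hypothetical records for demonstration.
papers = [
    {"title": "A direct replication of a classic priming study", "abstract": "..."},
    {"title": "Novel results on X", "abstract": "We present newly collected data."},
]
candidates = [p for p in papers if is_candidate_replication(p["title"], p["abstract"])]
print(len(candidates))  # -> 1 candidate replication study
```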

Potential measurement problems and limitations include the possibility of incomplete or inaccurate reporting of replication studies, as well as differences in definitions and methodologies for identifying replication studies. Additionally, not all replication studies may be published in a way that makes them easily identifiable, which could lead to undercounting. There may also be variations in the level of adoption of Open Science practices related to replication across different fields or scientific communities, which could make it challenging to compare the prevalence of replication studies across different domains.

Datasources
Open Science Framework (OSF) - Registered Reports

The Open Science Framework (OSF) provides a platform for researchers to pre-register their studies, including replication studies, as Registered Reports. These reports undergo peer review prior to data collection, ensuring transparency and credibility. The OSF Registered Reports database can be accessed at https://osf.io/registries/discover?provider=OSF.

For the calculation of the metric, the database can be queried using specific search terms to find the relevant replication studies.
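As an illustration, the sketch below queries the OSF v2 API (https://api.osf.io/v2/) for registrations whose title mentions replication. The title filter is an assumption made for illustration: it performs a simple substring match and will both over- and under-count actual replication studies.

```python
import requests

API = "https://api.osf.io/v2/registrations/"
params = {"filter[title]": "replication", "page[size]": 100}

count, url = 0, API
while url:
    # The first request carries the filter; "next" links already embed it.
    resp = requests.get(url, params=params if url == API else None, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    count += len(payload["data"])
    url = payload["links"].get("next")  # follow JSON:API pagination

print(f"Registrations with 'replication' in the title: {count}")
```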

Limitations of this datasource include the reliance on researchers voluntarily registering their studies as Registered Reports, which may introduce selection bias. Additionally, not all replication studies may be registered or reported in this database, limiting its coverage.

Replication Wiki

The Replication Wiki (http://replication.uni-goettingen.de/wiki/index.php/Main_Page) is an online resource that maintains a curated catalog of replication studies.

However, it is important to note that the Replication Wiki may not cover all replication studies, and its coverage may vary across different fields or scientific domains.
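If the Replication Wiki runs a standard MediaWiki installation, its catalog could in principle be enumerated programmatically. The sketch below assumes the default api.php endpoint is exposed at the path shown; both the endpoint and the assumption that every page corresponds to a catalogued study should be verified before use.

```python
import requests

# Assumed standard MediaWiki API endpoint; verify before relying on it.
API = "http://replication.uni-goettingen.de/wiki/api.php"
params = {"action": "query", "list": "allpages", "aplimit": "500", "format": "json"}

titles, cont = [], {}
while True:
    resp = requests.get(API, params={**params, **cont}, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    titles += [p["title"] for p in data["query"]["allpages"]]
    if "continue" not in data:  # no more pages to fetch
        break
    cont = data["continue"]

print(f"Pages catalogued: {len(titles)}")
```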

Existing methodology

To our knowledge, there is no established methodology for automatically identifying replication studies. Some suggested methods are the following:

  1. Apply text mining and machine learning techniques to find candidate studies that declare themselves as replications or provide evidence of replication.
  2. Search relevant fields or tags in specific databases to find publications that are replication studies.
  3. Perform citation analysis: collect the citations to the original publication together with the context in which they appear, and use an automated tool to assess whether the citing studies can be classified as replication studies (a sketch follows this list).
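The citation-analysis route (method 3 above) could be prototyped with open scholarly APIs. The sketch below collects works citing an original study via the OpenCitations COCI endpoint and screens their titles through the Crossref REST API for a replication cue; the DOI is a placeholder and the keyword screen stands in for a proper classifier.

```python
import requests

ORIGINAL_DOI = "10.1234/example.doi"  # placeholder; substitute a real DOI

# Works citing the original study, via OpenCitations COCI.
coci = requests.get(
    f"https://opencitations.net/index/coci/api/v1/citations/{ORIGINAL_DOI}",
    timeout=30,
)
coci.raise_for_status()
citing_dois = [row["citing"] for row in coci.json()]

# Screen citing titles for a replication cue, via Crossref.
replications = []
for doi in citing_dois:
    work = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    if work.status_code != 200:
        continue
    title = " ".join(work.json()["message"].get("title", []))
    if "replicat" in title.lower():  # crude cue; a classifier would go here
        replications.append((doi, title))

print(f"{len(replications)} of {len(citing_dois)} citing works look like replications")
```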

Number (%) of publications that are replication studies

This metric measures the percentage of publications within a specific field or scientific community that are replication studies. It provides a more nuanced perspective on the prevalence of replication studies than a raw count, as it accounts for the total number of publications within the field.
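Computationally the metric is a simple share. A minimal sketch, assuming the two counts have already been obtained with one of the measurement approaches above:

```python
def replication_share(n_replications: int, n_publications: int) -> float:
    """Percentage of publications in a field that are replication studies."""
    return 100.0 * n_replications / n_publications

# e.g., 50 replication studies among 10,000 publications
print(replication_share(50, 10_000))  # -> 0.5 (percent)
```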

Limitations of this metric include the potential for biased or incomplete reporting of replication studies, as well as the possibility that some replication studies may be misclassified as non-replication studies due to differences in methodology or interpretation. Additionally, this metric may be influenced by factors such as the availability of funding for replication studies and the incentives for researchers to conduct replication studies.

Reuse

Open Science Indicator Handbook © 2024 by PathOS is licensed under CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/)

Citation

BibTeX citation:
@online{apartis2023,
  author = {Apartis, S. and Catalano, G. and Consiglio, G. and Costas,
    R. and Delugas, E. and Dulong de Rosnay, M. and Grypari, I. and
    Karasz, I. and Klebel, Thomas and Kormann, E. and Manola, N. and
    Papageorgiou, H. and Seminaroti, E. and Stavropoulos, P. and Stoy,
    L. and Traag, V.A. and van Leeuwen, T. and Venturini, T. and
    Vignetti, S. and Waltman, L. and Willemse, T.},
  title = {PathOS - {D2.1} - {D2.2} - {Open} {Science} {Indicator}
    {Handbook}},
  date = {2023},
  url = {https://handbook.pathos-project.eu/indicator_templates/quarto/1_open_science/prevalence_replication_studies.html},
  doi = {10.5281/zenodo.8305626},
  langid = {en}
}
For attribution, please cite this work as:
Apartis, S., G. Catalano, G. Consiglio, R. Costas, E. Delugas, M. Dulong de Rosnay, I. Grypari, et al. 2023. “PathOS - D2.1 - D2.2 - Open Science Indicator Handbook.” Zenodo. 2023. https://doi.org/10.5281/zenodo.8305626.