Prevalence of Open Science support and training

Authors
Affiliation

T. van Leeuwen

Leiden University

V.A. Traag

Leiden University

Version | Revision date | Revision | Author
1.1 | 2024-12-12 | Integrated support and training | V.A. Traag
1.0 | 2023-07-12 | Initial draft | T. van Leeuwen

Description

Open Science support and training is mostly organized at the institutional, or even faculty, level. For scholarly publishing, that is, Open Access publishing, this is often organized by university or faculty libraries. In addition, there may be national Open Science initiatives that provide Open Science training facilities; these might involve national research centres or institutions, but could also be separately organised initiatives aimed specifically at Open Science. The support mostly covers tasks such as suggesting suitable journals and helping with licensing details. For research data, support is nowadays organized in many institutions by data stewards, who take care of data management in general and consider issues related to, for example, the FAIR principles.

The creation of metrics should take into consideration which elements of Open Science practices are selected for support and training. Given this potential diversity, and the absence of generic or systematic data sources for this aspect of Open Science, the data need to be collected by qualitative means, that is, by going through the websites of institutions and/or faculties to identify Open Science support and training facilities.

Metrics

Number/% of institutions that offer OS support and training

Open Science support and training might often be provided at the university level, or within the university at the faculty or even departmental level. This metric provides an institutional view of the extent to which Open Science support is offered.

Measurement

This metric can be constructed by searching through university and faculty level websites to identify the presence of Open Science support and training facilities and, when present, identifying what kind of Open Science activity is being supported.

Potential problems and limitations are:

  • This is a time-consuming effort, as one has to go through many websites, in particular in a large country with many institutions. These are not necessarily limited to universities; in Germany, for example, there are many publicly funded research institutes under the flag of the Max Planck, Leibniz, and Helmholtz societies.
  • Assessing this indicator also requires mastering the language of the country under investigation, as the general principles of Open Access can be expressed by different notions in different linguistic contexts.
  • This kind of information might not (yet) be present on the website, which would create an underestimation of the actual situation.
  • Comparing the breadth of Open Science support and training is difficult: this metric only captures its presence, but the range of Open Science practices being targeted should also be considered.

There are presently no generic and systematic data sources that contain such information.

Automation of this metric is currently not possible.
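While the data collection itself requires manual website review, tallying the collected results is straightforward. The following sketch assumes a hand-collected record of which institutions offer support; the institution names and values are hypothetical examples, not real data.

```python
# Sketch: compute number/% of institutions offering OS support
# from manually collected data (all entries are hypothetical).
institutions = {
    "University A": True,   # OS support/training found on website
    "University B": False,  # no OS support found
    "University C": True,
}

n_offering = sum(institutions.values())
share = n_offering / len(institutions) * 100
print(f"{n_offering} of {len(institutions)} institutions "
      f"({share:.0f}%) offer OS support and training")
```

The same tally applies at the national level by replacing institutions with national initiatives.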

Number/% of national OS support and training initiatives

In some cases, there might be national Open Science initiatives or other national institutions that provide Open Science support or training. For instance, this could cover information around Open Access, or support for opening up data. This metric describes the extent to which there are national initiatives for Open Science support and training.

Measurement

This metric can be constructed by searching through policy initiatives that support Open Science practices on the national level.

Potential problems and limitations are:

  • This is a time-consuming effort, as one has to go through many websites, in particular with respect to the definition of ‘national’. In some countries, governance of science is not solely a national matter; under a federal government, one may also have to consider federal-state initiatives regarding the support of Open Science practices.
  • A similar issue concerns research funders and their support of Open Science practices: national research funding agencies are also national bodies, and it is unclear how they relate to the above-mentioned issue of the ‘national’ level.
  • Comparing the breadth of Open Science support is difficult: this metric only captures its presence, and the range of facilities supporting Open Science is not yet included.
  • Language can be a problem here as well.

There are presently no generic and systematic data sources that contain such information.

Automation of this metric is currently not possible.

Breadth of Open Science support and training

Open Science support and training can be broader or narrower. This metric provides an overview of the breadth of Open Science support and training, that is, to which extent it covers the diverse types of Open Science support and training, such as publishing, research data, Open Code, peer review, pre-registration and registered reports.

Measurement

Based on the data collected for the first two metrics, a third metric might be constructed. That is, when collecting information about training facilities, additional information could be gathered. This metric could be constructed by searching through any relevant training facilities, either at the institutional or national level, as explained above.

Potential problems and limitations are:

  • This is a time-consuming effort, as one has to go through many websites, on both overall institution level, as well as on faculty level.
  • One has to decide upon a common denominator: the support for Open Science practices that are frequent across the system. But how does one deal with discipline-specific support facilities that are not common?
  • How does one include the various stages of development regarding Open Science across faculties in this metric? Some scholarly disciplines have a different perspective on Open Science practices, e.g., regarding Open Access publishing of books in the humanities, or regarding qualitative research data in some parts of the social sciences (e.g., anthropology).

There are presently no generic and systematic data sources that contain such information.

Automation is currently not possible.
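Once the practice types supported by each institution have been collected by hand, breadth can be expressed as the share of practice types covered. This is a minimal sketch; the practice categories and institutional data below are hypothetical assumptions for illustration.

```python
# Sketch: breadth of OS support per institution, based on manually
# collected data. Categories and institutions are hypothetical.
practices = ["open access", "research data", "open code",
             "open peer review", "preregistration"]

coverage = {
    "University A": {"open access", "research data"},
    "University B": {"open access", "research data", "open code",
                     "preregistration"},
}

# Fraction of practice types covered by each institution.
breadth = {
    inst: len(offered & set(practices)) / len(practices)
    for inst, offered in coverage.items()
}
for inst, b in breadth.items():
    print(f"{inst}: {b:.0%} of practice types covered")
```

The chosen list of practice types acts as the common denominator discussed above; discipline-specific facilities outside this list would not be counted.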

Duration of Open Science training (hour/day/week)

By providing an overview of the duration of Open Science training, we may capture its intensity. A few hours of training, versus more extensive training, of course makes a difference with regard to the amount of knowledge that can be transferred.

Measurement

This metric can be constructed in a similar manner to the previous metric, namely by looking at websites for what is offered in terms of Open Science training facilities and collecting the duration of such training.

Potential problems and limitations are:

  • This is a time-consuming effort, as one has to go through many websites, on both overall institution level, as well as on faculty level.
  • Here one has to work with what is offered as Open Science training; one does not know how much of the offered training is actually ‘consumed’.
  • As noted previously, one has to be aware that perhaps not all Open Science support facilities are (yet) visible on websites.

There are presently no generic and systematic data sources that contain such information.

Automation is currently not possible.
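Since durations are reported in mixed units (hours, days, weeks), aggregating them requires normalising to a single unit. The conversion factors below are assumptions (one training day = 8 hours, one week = 5 days), as are the example durations.

```python
# Sketch: normalise training durations reported in mixed units to
# hours. Conversion factors and durations are assumed examples.
HOURS_PER_UNIT = {"hour": 1, "day": 8, "week": 40}

# Hypothetical durations as collected from websites: (amount, unit).
trainings = [(2, "hour"), (1, "day"), (0.5, "week")]

total_hours = sum(n * HOURS_PER_UNIT[unit] for n, unit in trainings)
print(f"Total training offered: {total_hours} hours")
```

Any such conversion should be reported alongside the metric, since institutions may define a ‘training day’ differently.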

Reuse

Open Science Impact Indicator Handbook © 2024 by PathOS is licensed under CC BY 4.0 (View License)

Citation

BibTeX citation:
@online{apartis2024,
  author = {Apartis, S. and Catalano, G. and Consiglio, G. and Costas,
    R. and Delugas, E. and Dulong de Rosnay, M. and Grypari, I. and
    Karasz, I. and Klebel, Thomas and Kormann, E. and Manola, N. and
    Papageorgiou, H. and Seminaroti, E. and Stavropoulos, P. and Stoy,
    L. and Traag, V.A. and van Leeuwen, T. and Venturini, T. and
    Vignetti, S. and Waltman, L. and Willemse, T.},
  title = {Open {Science} {Impact} {Indicator} {Handbook}},
  date = {2024},
  url = {https://handbook.pathos-project.eu/sections/1_open_science/prevalence_OS_training_support.html},
  doi = {10.5281/zenodo.14538442},
  langid = {en}
}
For attribution, please cite this work as:
Apartis, S., G. Catalano, G. Consiglio, R. Costas, E. Delugas, M. Dulong de Rosnay, I. Grypari, et al. 2024. “Open Science Impact Indicator Handbook.” Zenodo. 2024. https://doi.org/10.5281/zenodo.14538442.