By Athanasia Salamoura and Giannis Tsakonas (more about the contributors at the bottom of the page)

What is it about?

Monitoring systems are essential for tracking progress in Open Access (OA), and particularly the goal of transitioning from paywalled to Open Access publications in many European countries. In a paper appearing in UKSG Insights, we discuss the challenges that monitoring dashboards face in providing a complete view of OA status, ensuring accuracy in measuring OA production, and achieving efficiency in the entire process. We analyzed the characteristics of various monitoring systems from European countries, including their data sources, formats, visualization methods, update frequencies, granularity, and the types of access they record.

This is a companion site to the article “On the challenges of open access monitoring”, which appears in UKSG Insights, https://doi.org/10.1629/uksg.641. We intend to keep this record and comparison of OA monitoring initiatives up to date.

Tables

We collected data on monitoring systems from ten European countries, as well as five cross-national initiatives, to understand the scope of these efforts and what accounts for their key differences. For each system, we identified the following characteristics:

  1. Source: the origin of the data that the system analyzes and presents.
  2. Format: the nature of the systems, which can be either dynamic or static.
  3. Graphs & Visuals: the visualization characteristics of these dashboards, which often take the form of tabular data and/or graphic representations, such as bar and line charts, pie and doughnut charts, etc.
  4. Update frequency: the time interval between the updates of information.
  5. Granularity: the level at which they report, such as a nationwide, regional, and/or institutional view of this information.
  6. Types of access: the types of open access that they record and report on.

Table 1 compares the national initiatives and provides insights into how various countries actively monitor and promote OA within their academic and research communities. These systems are operated by institutions and agencies of varied scope and responsibilities, including library consortia that want to monitor the progress of their transformative agreements, research funding organizations that wish to check their institutions’ compliance with funding policies, and so on.

National Dashboards


Table 2 offers a similar comparison of the cross-national initiatives.