A few thoughts on OA Monitoring and CRISs (I)

Thu, 29/03/2018 - 14:08 -- Pablo de Castro
In the run-up to the AT2OA workshop on Open Access monitoring, to be held imminently in Vienna, this post looks into recent attempts to coordinate the various national-level initiatives taking place in the area and suggests some prerequisites for this international endeavour to succeed. It also argues that successful OA monitoring in the pioneering countries should pave the way for others to eventually follow for their own progress-assessment needs.
A European Council statement was issued in May 2016 aiming to achieve full Open Access to research outputs by 2020. This was hailed at the time as a major step forward in the push to widen access to the results of publicly-funded research. Nearly two years later there's a generalised awareness of how difficult it will be to reach this political goal across the EU by the proposed deadline. This should not, however, stop the efforts to achieve further progress and to improve the way Open Access is being implemented – the 100% Open Access objective is clearly achievable in specific countries, which will then to some extent provide a best-practice model for others to follow.

One of the areas where more work needs to be done is the actual monitoring of progress in Open Access implementation. This has been on the cards for some time now, since national roadmaps with specific milestones and deadlines for reaching 100% Open Access started to be produced quite a long time before the European Council meeting itself was held. These national-level discussions have resulted in a number of initiatives to monitor Open Access that are being implemented in different countries. The Knowledge Exchange, which brings together stakeholders such as Jisc in the UK, the DFG in Germany, SURF in the Netherlands, DEFF in Denmark and CSC in Finland, has played a particularly relevant role in the past couple of years in ensuring that the teams behind the various national-level approaches to Open Access monitoring have had the opportunity to discuss their progress with each other at a number of workshops.

The forthcoming AT2OA workshop on Open Access monitoring (April 9th in Vienna) provides an opportunity to catch up with recent developments in the field. Discussions in Vienna will build upon the April 2017 report “Knowledge Exchange consensus on monitoring Open Access publications and cost data”, in which the Knowledge Exchange summarised the findings of the last workshop held on the issue in Copenhagen in November 2016. A year and a half after that event, and almost a year since the release of the report, there should be progress to discuss both in the methodologies being used for the purpose and in the actual results for specific countries. On top of this, it's always useful to take another look at the common challenges that the different initiatives are trying to tackle on their own.

An interesting pre-print has recently been released providing an update on where we stand on the path towards full Open Access. Rather revealingly, this work looks into the figures for publications from 2014. This raises one of the key issues that the challenging task of monitoring OA poses: embargo periods and how to account for embargoed Open Access. Looking at the degree of Open Access availability with some hindsight allows these embargo periods to (mostly) become irrelevant. Embargoes do, however, raise evident issues when we try to monitor the present open availability of research outputs, and how to deal with this is still under discussion.
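To make the point concrete, here is a minimal sketch of how the measurement date changes whether an embargoed deposit counts as open. The function, the field names and the twelve-month embargo in the example are illustrative assumptions, not the rules of any actual monitoring initiative.

```python
from datetime import date
from dateutil.relativedelta import relativedelta

def counts_as_open(publication_date: date, embargo_months: int,
                   measurement_date: date) -> bool:
    """True if a deposited copy is out of embargo on the measurement date.

    Illustrative sketch only: a real monitoring exercise would also have to
    handle missing embargo metadata, publisher-specific policies, etc.
    """
    embargo_end = publication_date + relativedelta(months=embargo_months)
    return measurement_date >= embargo_end

# Measured with hindsight, a 2014 paper with a 12-month embargo counts as open;
# measured close to the publication date, an otherwise identical 2017 paper does not.
print(counts_as_open(date(2014, 6, 1), 12, date(2018, 4, 1)))  # True
print(counts_as_open(date(2017, 6, 1), 12, date(2018, 4, 1)))  # False
```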
The pre-print also breaks down the openly available works into three main categories: Gold Open Access, Green Open Access and Freely Available (FA) content. This latter category, also commonly known as 'Black Open Access', is a fairly slippery one, since it takes into account access via platforms that require prior registration – which contradicts the basic definition of Open Access. While including this FA category in the study clearly serves the purpose of highlighting the very significant progress towards full open availability of research outputs, it also raises methodological questions about which platforms should or should not be categorised as FA.
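As an illustration of what such a categorisation involves in practice, the sketch below assigns a publication record to one of these buckets. The field names and the decision order are assumptions made for the example; they are not the pre-print's actual methodology.

```python
def classify_availability(record: dict) -> str:
    """Assign a publication record to an OA category (illustrative only)."""
    if record.get("oa_in_publisher_venue"):
        return "gold"              # made open by the journal/publisher itself
    if record.get("copy_in_repository"):
        return "green"             # author-deposited copy in a repository
    if record.get("free_copy_elsewhere"):
        return "freely_available"  # readable elsewhere, possibly behind registration
    return "closed"

print(classify_availability({"copy_in_repository": True}))  # 'green'
```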
Finally, the pre-print uses Google Scholar as a source for determining Open Access availability. This is again a very pragmatic choice, one that may actually produce far more accurate results than the (more frequent) measurements based on a combination of publicly-funded platforms (repositories) and publisher outlets. National-level initiatives for OA monitoring must, however, take the canonical approach of using the available infrastructure (funded as it is by the same stakeholders who are funding the monitoring initiatives) as the basis for their analysis. This means that their results, although formally more reliable, are bound to deliver a significant underestimation of the amount of content that is in fact openly available. Arguably, this is also the case for the results offered by the pre-print, since platforms like Sci-Hub were understandably left out of the list of 'Black OA' sources.

The main point to make in the discussions on the different approaches followed by the various national-level OA monitoring initiatives may well be that this is clearly a supranational effort and that common measuring standards are needed to approach this very difficult task in a harmonised way. Just as Jisc produced its ‘Principles for Offset Agreements’, which served to coordinate – albeit on a different thread – the international response to a swiftly evolving Open Access landscape, some level of international consensus also needs to be reached on the de facto mechanisms and assumptions for OA measurement that will be applied across countries. ‘Standard’ may be too strong a term to apply to this area, but it would be pointless to have different national-level initiatives trying to measure the same phenomenon using different procedures and ‘initial conditions’ – to use the physics term.

In this regard it's fair to see the current efforts as the infancy of the OA measurement discipline: an ongoing discussion that should eventually crystallise into the definition of widely agreed OA measurement principles to be applied across borders so that the results are comparable. This is, however, an interesting demand in a field that has traditionally been extremely bottom-up in its implementation. Such a definition of OA measurement principles is conceptually very top-down, even if it arises from comprehensive discussions within the OA community, and agreement on its application is bound to be difficult.

All this said, there are clearly technical challenges and tasks required for progress on OA monitoring that research information management systems at institutions – and especially national- or regional-level CRIS systems – may well be able to effectively tackle or support. Looking into this was the actual aim of this post, but given that the introductory reflections have taken up so much space, it's probably better to leave it for an independent second post specifically focused on technical considerations.