31.5.08

[rae, metrics & open source]

Maintaining an academic career means paying close attention to your publishing record and its effect on the RAE. I'm not up on the metrics and the specific weighting of different kinds of publications, or how that might differ across disciplines, but I've just come across this interesting paper: "Open Access Scientometrics and the UK Research Assessment Exercise" by Stevan Harnad. In it, Harnad gives us an idea of how metrics and open access might work as an alternative to the usual "academic bean-counting of publications for performance evaluation and funding."


"Open Access. Until now, the reference metadata and cited references of the top 25% of the c. 24,000 peer-reviewed journals published worldwide, across disciplines and languages, have been systematically fed (by the journal publishers) to the Institute for Scientific Information (ISI), to be extracted and stored. But soon this will change. It has been discovered (belatedly) that the Web makes it possible to make the full-text (not just the reference metadata and cited references) of every single one of the 2.5 million articles published annually in those 24,000 journals (not just the top 25%) freely accessible online to all users (not just those that can afford paid access to the journals and the ISI database).

[...]

Lawrence (one of the co-inventors of Citeseer) published a study in Nature in 2001, showing that articles that were made freely available on the Web were cited more than twice as much as those that were not; yet most researchers still did not rush to self-archive. The finding of an OA citation impact advantage was soon extended beyond computer science, first to physics (Harnad & Brody 2004), and then also to all 10 of the biological, social science, and humanities disciplines so far tested (Hajjem et al 2005); yet the worldwide spontaneous self-archiving rate continued to hover around 15%.

If researchers themselves were not very heedful of the benefits of OA, however, their institutions and research funders – co-beneficiaries of their research impact – were: To my knowledge, the department of Electronics and Computer Science (ECS) at University of Southampton was the first to mandate self-archiving for all departmental research articles published: These had to be deposited in the department's own Institutional Repository (IR) (upgraded using the first free, open source software for creating OA IRs, likewise created at Southampton and now widely used worldwide)."

Interesting...

As Harnad says, the RAE is "a very cumbersome, time-consuming and expensive undertaking, for the researchers as well as the assessors" so we should really be looking into other possibilities.

"The data-mining potential of an OA corpus is enormous, not just for research evaluation by performance assessors, but for search and navigation by researcher-users, students, and even the general public."


I wonder how this kind of open access metrics might fit in with the new RAE:

  • 2008 will mark the final appearance of traditional peer review systems for the UK research assessment exercise (RAE)
  • The UK government has announced plans to use a metrics system to assess research quality and guide funding
  • A metrics system could suit chemistry well, but some worry that an element of peer review will need to be retained for areas such as theoretical chemistry



