Saturday 20 December 2008

More on OA, and the RAE results

This post is an extended non-response to Tim's comments on my previous post (December 2nd), which you may wish to read before going any further. It focuses primarily on a single benefit of OA - its impact on citations. The matter of financial gains and losses to institutions is beyond my capacity to address in great detail, but I will touch on it in my closing remarks.



First of all, a little background on how the quality of research is measured in UK universities. On Thursday (December 18th) the results of the Research Assessment Exercise (RAE) were announced (you can read plenty about it in the latest edition of the Times Higher Education). As a sidenote - the English department at Exeter, my university, is now established pretty firmly as one of the best, if not the best, English departments in the country for research.

You may know a little about the RAE but I'll just summarise it anyway. It's a peer-review system whose primary purpose is "to produce quality profiles for each submission of research activity made by institutions" (RAE homepage). These profiles, as the THE notes, "show the percentage of research activity in each department judged to fall within each of four quality grades", assessed "in terms of originality, significance and rigour":

- 4* world-leading

- 3* internationally excellent ... "but which nonetheless falls short of the highest standards of excellence"

- 2* recognised internationally

- 1* recognised nationally

The English department at Exeter has 45% of its researchers working at the 4* level, the highest percentage in any such department in the country (excluding those who submitted only their best researchers to the exercise; Exeter as a whole submitted 95% of its academic staff); 90% of its research staff are working at an international level of excellence.

The RAE was last conducted in 2001, but back then the Higher Education Statistics Agency (Hesa) included "research intensity" as a metric in the results (the proportion of eligible researchers submitted by each institution, as opposed to simply the volume). Not so this time, meaning that the so-called HE research "league tables" (here's the Excel spreadsheet showing THE's league tables) favour those institutions which submitted only the researchers they believed would fall exclusively into the 3* and 4* categories, or, of course, institutions specialising in a key area of research (which explains why the Institute of Cancer Research has been graded above Cambridge and Oxford). Ian Postlethwaite, the pro-vice-chancellor for research at the University of Leicester, has written more on this in The Guardian recently. Exeter comes 28th or so in most league tables based on this year's data, but had research intensity been included as a factor (as it was in every other RAE), we would have ranked around 13th.
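To see why leaving out research intensity changes the rankings, here's a toy sketch. The departments, profiles, grade weights and intensity figures are all invented for illustration - this is not the actual RAE or league-table methodology, just the general shape of the argument:

```python
# Illustrative sketch: how a ranking can change once research intensity
# (the share of eligible staff submitted) is factored in.
# All figures and weights below are hypothetical, not real RAE data.

WEIGHTS = {4: 4, 3: 3, 2: 2, 1: 1}  # invented grade weights

def gpa(profile):
    """Grade-point average of a quality profile (percentages per star grade)."""
    return sum(WEIGHTS[star] * pct for star, pct in profile.items()) / 100

departments = {
    "Dept A": {"profile": {4: 45, 3: 45, 2: 10, 1: 0}, "intensity": 0.95},
    "Dept B": {"profile": {4: 50, 3: 40, 2: 10, 1: 0}, "intensity": 0.60},
}

# Ranking on raw profile quality favours Dept B, which submitted
# only its best researchers...
raw = sorted(departments, key=lambda d: gpa(departments[d]["profile"]),
             reverse=True)

# ...but weighting by intensity rewards Dept A for submitting nearly everyone.
adjusted = sorted(departments,
                  key=lambda d: gpa(departments[d]["profile"])
                              * departments[d]["intensity"],
                  reverse=True)

print(raw)       # ['Dept B', 'Dept A']
print(adjusted)  # ['Dept A', 'Dept B']
```

The point is simply that a table built on quality profiles alone says nothing about how selectively a department entered the exercise.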

I suppose that was a lot of unnecessary information but it's an interesting state of affairs nonetheless, and something I was keen to comment on. The point is that this time-consuming, cost-ineffective panel review system is being replaced from next year by what is known as the REF (Research Excellence Framework). According to Hefce's news release...

The REF will consist of a single unified framework for the funding and assessment of research across all subjects. It will make greater use of quantitative indicators in the assessment of research quality than the present system, while taking account of key differences between the different disciplines. Assessment will combine quantitative indicators - including bibliometric indicators wherever these are appropriate - and light-touch expert review. Which of these elements are employed, and the balance between them, will vary as appropriate to each subject.

Bibliometrics is, among other things, the measurement of citations - how often is your work being cited in other works? With the increased visibility offered to research papers through Open Access mandates, the benefits to bibliometric data cannot be overstressed. It is true that the effects of OA on citations have been questioned, but when one takes into account the role Institutional Repositories play in tracking "hits" to research material deposited in IRs such as the Exeter Research and Institutional Content (that's right, ERIC), it speaks volumes about the quality of data one can produce. Bibliometric indicators become entirely appropriate when they can be tracked and when IRs across institutions and, indeed, disciplines can be quantified and compared - an effort made possible only where these systems are in place and a green OA mandate (note: PDF file) has been introduced.
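For readers unfamiliar with what a "bibliometric indicator" actually computes, here is a minimal sketch of two common ones, run over invented citation counts of the kind a repository might aggregate (the figures are made up; the REF has not specified which indicators it will use):

```python
# Two simple bibliometric indicators over a list of per-paper citation counts.

def mean_citations(citations):
    """Average citations per paper."""
    return sum(citations) / len(citations)

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

paper_citations = [12, 7, 5, 3, 1, 0]  # hypothetical figures

print(round(mean_citations(paper_citations), 2))  # 4.67
print(h_index(paper_citations))                   # 3
```

Whatever indicators the REF settles on, they all depend on reliable underlying counts - which is exactly where OA visibility and IR tracking come in.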

This is just an initial discussion of the impact OA could have in increasing the value of the REF, an ongoing discussion still very much in its early stages. It also illustrates in a neat circular way why all this research should be freely available: funding councils dish out the cash for research conducted across the country (and elsewhere); they in turn receive their money from the government, which gets it from the taxpayer. This money is distributed based on the quality of research conducted at each institution, which will henceforth be measured through the REF. The success of a key feature of that system - bibliometrics - relies, in my view, heavily upon the increased visibility of research papers and the ability to track how often they are viewed. Not only does OA offer significant advantages to research visibility (and this applies to established scholars and PhD students alike, whose dissertations can now be viewed online by anyone in the world rather than being lost in a dark corner of the university library), but it makes the results of taxpayer-funded research available to those who, however indirectly, paid for that research to be conducted in the first place.

Returning to Tim's primary argument, the financial impact of Open Access is a tricky one. Stevan Harnad, a prominent OA supporter based at the University of Southampton, states that:

The Green option allows the number of OA articles (not journals) to grow anarchically, article by article, rather than systematically, journal by journal. This allows TA journals to adjust gradually to any changes that might arise as the number of self-archived OA articles grows.

He goes on to argue that Green OA "allows both journals and institutions the time to prepare for a possible eventual transition (though not a necessary or certain one, as TA and OA might go on peacefully co-existing indefinitely) to Gold". In short, even if authors were to deposit their works to make them freely available, this doesn't necessarily constitute the end of academic subscription-based publishing. And, Harnad notes, even if it does:

...then TA journals can gradually adapt to it, first by cutting costs (by cutting out the features that are no longer essential) and then, perhaps (if it should ever become necessary) by converting to the OA journal cost-recovery model (with the institutions' annual windfall institutional TA subscription savings now available to cover their annual OA publication costs).

The availability of online material for newspapers does not signal the death of the newspaper industry, and I believe the same argument can be applied to academic publishing. Where money can be made it will be, but financial gain should not negatively impact upon scholarship.




As a footnote, I'd like to wish everyone a merry Christmas and a happy new year! I have been developing some exciting ideas for my PhD thesis lately, and I'm hoping to share them in future posts - as much as I love talking about them, I don't want this blog to be dominated by OA and research software. See you in 2009!
