Peer Review Blog

Sunday, January 15, 2006

Journal peer review for blog posts

Tony Hirst of the OUseful Info blog has a great idea for academic journals - rather than standing aloof from the world of blogging, accept that many blog posts do have value and identify the better ones by means of peer review.

Tony's scheme would work as follows: people submit links to blog posts (or to other stories) to the journal's site. These then get reviewed by the journal's reviewers, who make very simple yes-or-no judgements, and depending on the number of votes a link receives it floats to the top or sinks into obscurity.

The idea is basically like having lots of mini-Diggs for academia, with the difference that not just anybody would be able to vote for or against a link, only people designated as peer reviewers. As Tony points out, for this to work a journal would need to have many reviewers who are also active web users. What's nice about it, though, is that reviewers wouldn't be expected to spend a lot of time writing reviewers' reports and all that stuff - just a quick read and a simple decision - so it should be much easier to find cooperative reviewers than for conventional academic review.
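For the programmers among us, here is a rough sketch (in Python) of how such a reviewer-only voting scheme might look. All the names and details below are my own illustration, not anything Tony has actually specified - the essentials are simply that anyone can submit a link, that only designated reviewers can vote, and that links are ranked by their net score:

    # Sketch of a reviewer-only voting scheme (my own invention, not Tony's
    # actual proposal): anyone may submit a link, only designated peer
    # reviewers may vote, and links are ranked by their net score.
    from collections import defaultdict

    class ReviewQueue:
        def __init__(self, reviewers):
            self.reviewers = set(reviewers)   # people allowed to vote
            self.votes = defaultdict(dict)    # link -> {reviewer: +1 or -1}

        def submit(self, link):
            # anyone may submit a link; it starts with no votes
            self.votes.setdefault(link, {})

        def vote(self, reviewer, link, approve):
            if reviewer not in self.reviewers:
                raise PermissionError("only designated peer reviewers may vote")
            self.votes[link][reviewer] = 1 if approve else -1

        def ranking(self):
            # links float to the top or sink according to their net score
            return sorted(self.votes,
                          key=lambda link: sum(self.votes[link].values()),
                          reverse=True)

    queue = ReviewQueue(reviewers=["alice", "bob", "carol"])
    queue.submit("http://example.org/some-blog-post")
    queue.vote("alice", "http://example.org/some-blog-post", approve=True)
    queue.vote("bob", "http://example.org/some-blog-post", approve=True)
    print(queue.ranking())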

I sometimes wonder if this model (quick review by many peers) might not also be useful for conventional academic publication, which currently relies too heavily on slow review by a few peers. There is also a third form of review already in use, of course - slow review by many peers, which is essentially what happens when academic publications get cited by others in the years or decades following publication. This last form of review is in the long run perhaps the most useful of all, but it is unfortunately still very much hampered by commercial indexing and abstracting services that are more interested in making money from citation data than in facilitating the tracking of academic conversations over time.

I look forward to the day when all academic content is freely available online and when a publication's list of sources cited is routinely also accompanied by a "this article has been cited by" list.
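To make concrete what such a "cited by" list involves, here is a toy example in Python (the paper names and citation data are entirely made up) showing that once citation data is openly available, the reverse list falls out of the ordinary lists of sources cited simply by inverting them:

    # Toy illustration (my own made-up data): invert each paper's list of
    # sources cited to get, for every paper, the later papers that cite it.
    from collections import defaultdict

    cites = {
        "paperA2001": ["paperX1998", "paperY1995"],
        "paperB2003": ["paperA2001", "paperX1998"],
        "paperC2005": ["paperA2001"],
    }

    cited_by = defaultdict(list)
    for paper, sources in cites.items():
        for source in sources:
            cited_by[source].append(paper)

    print(cited_by["paperA2001"])   # -> ['paperB2003', 'paperC2005']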

Wikipedia for medicine

Dean Giustini considers the possibility that a Wikipedia for medicine would be better than current forms of peer review.

Saturday, January 07, 2006

Transforming the mechanisms and the purposes of peer review

The system of peer review as currently practiced is premised on a publishing economy of scarcity - only a certain number of books and journals can be economically sustained, and peer review is therefore necessary as one mechanism to ensure that only the best work is published. With the advent of electronic publishing, says Kathleen Fitzpatrick, there is "a vast transformation in both the mechanisms and the purposes of peer-review" -
"What if peer-review took place not prior to publication but on texts that have already been made public? What if that peer-review happened not anonymously, in back-channel communications with individuals other than a text’s author, but in the open, in direct communication between reader and author? Technologies ranging from commenting to, as John Holbo suggested in a recent post on The Valve, a more elaborated P2P system, could be made to serve many of the purposes that current peer-review systems serve (most importantly for institutional purposes, the separating of wheat and chaff), but would shift the process of peer-review from one that determines whether a manuscript should be published to one that determines how it should be received."
Kathleen makes several other interesting points in her post. Also have a look at the comments.


Tuesday, January 03, 2006

Publishers reject Booker prize winners

Not quite peer review, but an example of ineffectual reviewing nevertheless: it seems the UK Sunday Times sent manuscripts of the opening chapters of V.S. Naipaul’s In a Free State and a novel by Stanley Middleton to 20 publishers and agents. Both authors are Booker prize winners, and Naipaul has also won the Nobel prize for literature. None of the publishers or agents recognized the books, and of the 21 replies received, all but one were rejections. This sort of thing is good fun and demonstrates the subjectivity of review, but it also highlights another important issue. To quote the Sunday Times article by Jonathan Calvert and Will Iredale:

"Many of the agencies find it hard to cope with the volume of submissions. One said last week that she receives up to 50 manuscripts a day, but takes on a maximum of only six new writers a year."

This is the real sticking point for most forms of expert review - too few experts and too much material to be reviewed.