After 12 months at wordpress.com, this blog has been integrated into my main site. Now, it lives at http://www.kai-arzheimer.com/blog/
With about 100 new respondents, yet another brilliant week for the Political Science Peer-Review Survey draws to a close. While the snowball is still rolling, and while we cannot know for certain because the survey is, after all, anonymous, we might soon reach a point of saturation: I have received a number of very friendly replies from people who tell me that they have already heard about the survey once (or twice) from someone else. The Netherlands in particular seem to be a hotspot of peer-review-survey-related activity. You could guess as much from the distribution of our respondents. While the US dominates the field (as it should), Switzerland and the Netherlands come in at an amazing 5th and 6th, accurately reflecting the standing of these countries as Social Science strongholds.
On Monday, the Political Science Peer-Review Survey had 506 respondents. Between Tuesday and Friday, we sent out 1,100 new invitations. Five days and many contacts with helpful colleagues later, the number stands at 626. Feel free to join them.
On Monday, we started a new initiative to boost response to the Political Science Peer-Review Survey. Thanks to some very industrious research students, we were able to identify about 21,000 individual authors who published in Social Science Citation Index-covered Political Science journals between 2000 and 2008. For about 8,000 of these, the SSCI lists an email address (that’s the EM field in the SSCI records), so we started contacting them and asking them to participate in the survey. Obviously, some addresses are no longer valid because people have moved on to different places or have left academia altogether. Nonetheless, I was slightly surprised by the rather poor quality of the address data supplied by Thomson. In some cases, letters were missing, whereas in other cases similar-looking letters (e.g. ‘v’ and ‘y’) had been confused. This looks like the work of either a weak OCR routine or an underpaid, non-native data typist. Overall, we have contacted 962 people so far. About 200 of our messages have bounced, and we have 61 new responses to the survey (assuming that without the mailout, no one would have responded during these four days), which brings us to a new total of 238 responses.
December 18 was the day (or rather the night, as results were communicated at midnight) for UK academics: after years of preparation and second-guessing and months of waiting, the results of the 6th Research Assessment Exercise (RAE) were published. Every five years or so, the UK higher education funding councils examine the research output of the various “units of assessment” (i.e. departments) and publish a league table that is crucial for the allocation of “quality-weighted research funding” (i.e. money) as well as for the reputation of a place. At the moment, the system is chiefly based on an evaluation of up to four publications per active researcher, which has led to the creation of a transfer market for scientists that resembles professional football.
In every RAE since 1986, my institution has earned top grades. This time around, the marks are a bit more disaggregated, i.e. each unit receives a profile of the percentage of its work rated 4*, 3*, and so on. But no matter which way you count and weight the results, we end up in first place (tied with Sheffield but clearly ahead of Oxford and the LSE). Obviously, we are freaking happy.
While we are in the mood for surveying the peer-review process in political science, here is a quick link to the Political Science Journal Monitor. The site itself is a Blogspot blog converted into a makeshift forum, and activity is low. Nonetheless, this is an interesting and potentially relevant resource for many of us.