I was celebrating the “RAE”:http://submissions.rae.ac.uk/results/qualityProfile.aspx?id=60&type=uoa results last night with a couple of Belhavens, so I didn’t immediately write up a story about them. And this morning I see that in the comments thread over on “Brian Leiter’s blog”:http://leiterreports.typepad.com/blog/2008/12/2008-research-a.html#comments there are a lot of different proposals for how to summarise those results. Here are some of the more natural proposals.
First, we could just rank the departments by the percentage of research activity that is category 4, i.e. world-leading. The results are then as below. (I’m just listing top 10s, including ties, and ‘rounding up’ ties, so tied departments all take the best available rank.)
|1.|University College London|
|2.|University of St Andrews|
|3.|King’s College London|
|3.|University of Oxford|
|3.|Cambridge HPS|
|3.|University of Sheffield|
|3.|LSE|
|8.|University of Bristol|
|8.|University of Reading|
|10.|Cambridge Philosophy|
|10.|University of Stirling|
|10.|University of Nottingham|
|10.|Birkbeck College|
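For concreteness, here’s a minimal sketch in Python of how a table like this could be generated from the published quality profiles. The numbers in @profiles@ are illustrative placeholders, not the actual RAE figures, and the variable names are mine.

bc. # Quality profiles: for each department, the percentage of research
# activity judged to fall in each category (4 = world-leading down to
# 1 = recognised nationally). Placeholder numbers, not the real figures.
profiles = {
    "University College London": {4: 45, 3: 30, 2: 20, 1: 5},
    "University of St Andrews": {4: 40, 3: 40, 2: 15, 1: 5},
    "University of Oxford": {4: 35, 3: 35, 2: 25, 1: 5},
}
# Rank by percentage of category 4 work (ignoring tie handling for brevity).
by_top = sorted(profiles, key=lambda d: profiles[d][4], reverse=True)
for rank, dept in enumerate(by_top, start=1):
    print(rank, dept, profiles[dept][4])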
Or we could look at the departments with the most research activity that’s either in category 4 (world-leading) or category 3 (internationally excellent). Since I regard myself as more of an internationally excellent person than a world-leading person, I like this kind of measure.
|1.|University College London|
|1.|University of St Andrews|
|1.|King’s College London|
|1.|University of Reading|
|1.|University of Essex|
|6.|University of Sheffield|
|6.|University of Stirling|
|8.|University of Oxford|
|8.|Cambridge HPS|
|8.|LSE|
|8.|University of Bristol|
|8.|Cambridge Philosophy|
|8.|University of Leeds|
|8.|University of Edinburgh|
|8.|Middlesex University|
|8.|Queen’s University Belfast|
But this doesn’t discriminate between the 4s and the 3s, and it doesn’t include the value that 2s (recognised internationally) or 1s (recognised nationally) add. So some people, e.g. “The Guardian”:http://www.guardian.co.uk/education/table/2008/dec/18/rae-2008-philosophy, have summarised the result as a GPA. That’s the sum, as n ranges from 1 to 4, of n times the percentage of work that’s category n. By that measure we get,
|1.|University College London|
|1.|University of St Andrews|
|3.|King’s College London|
|3.|University of Reading|
|3.|University of Sheffield|
|6.|University of Stirling|
|6.|University of Oxford|
|6.|Cambridge HPS|
|6.|LSE|
|10.|University of Essex|
|10.|University of Bristol|
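In code, the GPA is a one-line weighted sum over each profile. A sketch, reusing the placeholder @profiles@ from the earlier snippet; dividing by 100 puts the score on a familiar 0-to-4 scale.

bc. # GPA: the sum, over n = 1..4, of n times the share of work in category n.
def gpa(profile):
    return sum(n * profile[n] for n in range(1, 5)) / 100
for dept in sorted(profiles, key=lambda d: gpa(profiles[d]), reverse=True):
    print(dept, round(gpa(profiles[dept]), 2))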
But that seems to me to undersell the difference between 4s and 3s, which is more important for many purposes than the difference between 2s and 3s, or 1s and 2s, and so on. If you want a ‘top-weighted’ GPA, the thing to do is, for each department, to work out the sum, as n ranges from 1 to 4, of 2^n^ times the percentage of research activity in category n. (The weights have to grow faster than linearly; weights merely proportional to n would just double every GPA and leave the ranking unchanged.) If you do that, the rankings are:
|1.|University College London|
|2.|University of St Andrews|
|3.|King’s College London|
|4.|University of Sheffield|
|5.|University of Reading|
|6.|University of Oxford|
|6.|Cambridge HPS|
|6.|LSE|
|9.|University of Bristol|
|10.|University of Stirling|
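The only change for the top-weighted version is the weight: 2^n^ rather than n, so a 4 counts sixteen times as much as a 1 rather than four times as much. A sketch, with the same caveats as above:

bc. # Top-weighted GPA: weight category n by 2**n instead of n.
def top_weighted_gpa(profile):
    return sum(2 ** n * profile[n] for n in range(1, 5)) / 100
for dept in sorted(profiles, key=lambda d: top_weighted_gpa(profiles[d]), reverse=True):
    print(dept, round(top_weighted_gpa(profiles[dept]), 2))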
But many people have been complaining that these averaging methods discount the merits of large, deep departments. (They also make it possible to ‘game’ the exercise by not submitting the work of all faculty members, but I don’t see much evidence of that happening in the report. If anyone does see evidence of it, I’d be interested to know.) So a lot of people have been advocating that we multiply the numbers generated above by the number of faculty who had work submitted. For instance, this is the ranking by number of faculty submitted times percentage of research activity in category 4.
|1.|University of Oxford|
|2.|Cambridge HPS|
|3.|University of St Andrews|
|4.|King’s College London|
|5.|University College London|
|6.|University of Sheffield|
|7.|University of Leeds|
|8.|University of Bristol|
|9.|Cambridge Philosophy|
|10.|LSE|
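Mechanically, these size-weighted tables just multiply any of the per-department scores above by the submitted headcount. A sketch, with invented @staff@ numbers standing in for the real submission counts:

bc. # Submitted-faculty headcounts: placeholders, not the real figures.
staff = {
    "University College London": 20,
    "University of St Andrews": 25,
    "University of Oxford": 45,
}
# Size-weighted score: headcount times a per-capita score from above,
# here the percentage of category 4 work.
def size_weighted(dept, score_fn):
    return staff[dept] * score_fn(profiles[dept])
for dept in sorted(staff, key=lambda d: size_weighted(d, lambda p: p[4]), reverse=True):
    print(dept, size_weighted(dept, lambda p: p[4]))

Swapping in @lambda p: p[3] + p[4]@, @gpa@, or @top_weighted_gpa@ as the score function gives the other size-weighted tables.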
And this is the ranking by number of faculty submitted times percentage of research activity in either category 3 or 4. This is where we start to see Leeds (a prominent employer of St Andrews and Rutgers grads, I hasten to add) doing well.
|1.|University of Oxford|
|2.|Cambridge HPS|
|3.|University of Leeds|
|4.|King’s College London|
|5.|University of St Andrews|
|6.|University of Sheffield|
|7.|Cambridge Philosophy|
|8.|University College London|
|9.|University of Warwick|
|10.|University of Edinburgh|
Here is the table for GPA times faculty submitted. Again, Leeds does well. (And note that Cambridge HPS, though undoubtedly a great department, is in no small part a great history department. It’s not quite an apples-to-apples comparison with other philosophy programs.)
|1.|University of Oxford|
|2.|Cambridge HPS|
|3.|University of Leeds|
|4.|King’s College London|
|5.|University of St Andrews|
|6.|University of Warwick|
|7.|University of Sheffield|
|8.|Cambridge Philosophy|
|9.|University College London|
|10.|University of Edinburgh|
Finally, here is faculty submitted times the ‘top-weighted’ GPA I described above. It actually doesn’t change a great deal from the plain GPA version.
|1.|University of Oxford|
|2.|Cambridge HPS|
|3.|University of Leeds|
|4.|King’s College London|
|5.|University of St Andrews|
|6.|University of Sheffield|
|7.|University College London|
|8.|Cambridge Philosophy|
|9.|University of Warwick|
|10.|University of Bristol|
There are, then, a lot of different ways to measure how well departments did in the RAE. It would be nice to have a meta-score, some way of tabulating these differing metrics. The simplest thing to do, and I think the most widely employed approach in cases like this, is something like a “Borda Count”:http://en.wikipedia.org/wiki/Borda_Count. We’ll give departments 1 point for each first-place finish, 2 points for each second place, and so on, so lower totals are better.
Because it was easier to do things this way in Excel, I’ve ‘rounded down’ ties. So if 3 departments finish tied for 3rd, they’ll each get 3 points, rather than the 4 they should probably get. But I don’t think this changes things substantially.
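Here’s a sketch of that tabulation, with the ‘rounded down’ tie convention built in: tied departments share the best open rank. (A department missing from one of the tables would need a default rank; I’m ignoring that wrinkle, and the function names are mine.)

bc. # Borda count: a department's score is the sum of its ranks across all
# the metrics above (1 point for 1st, 2 for 2nd, ...); lowest total wins.
def competition_ranks(scores):
    # scores maps department -> metric value, higher being better.
    # Ties are 'rounded down': equal values share the best open rank.
    ordered = sorted(scores.values(), reverse=True)
    return {dept: 1 + ordered.index(val) for dept, val in scores.items()}
def borda(metric_tables):
    totals = {}
    for scores in metric_tables:
        for dept, rank in competition_ranks(scores).items():
            totals[dept] = totals.get(dept, 0) + rank
    return sorted(totals.items(), key=lambda pair: pair[1])
# Tiny demo: two metrics, with A and B tied on the first.
print(borda([{"A": 3.0, "B": 3.0, "C": 2.5}, {"A": 2.0, "B": 3.0, "C": 2.5}]))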
Compiling the results in this way produces what I like to call The One True RAE Ranking of Philosophy Programs. It is,
|University of St Andrews|24 points|
|King’s College London|26 points|
|University of Oxford|27 points|
|Cambridge HPS|31 points|
|University College London|33 points|
|University of Sheffield|41 points|
|University of Leeds|67 points|
|Cambridge Philosophy|73 points|
|University of Bristol|76 points|
|LSE|77 points|
|University of Reading|77 points|
As feels intuitively correct when eyeballing the numbers, St Andrews comes out on top.
The result is fairly resilient across changes in methodology. That is, if you sum the results using some incorrect method, rather than generating The One True RAE Ranking of Philosophy Programs, you’ll still often get St Andrews on top.
In particular, if you just look at research activity in category 4, do a ranking of departments by percentage in category 4, then do a ranking of departments by faculty size times percentage in category 4, then do a Borda count to combine the two rankings, you get UCL first, St Andrews second. If you do the same thing with categories 3 and 4, you get King’s first, St Andrews second. If you do the same thing with GPA, you get St Andrews first. And if you do the same thing with what I was calling ‘top-weighted’ GPA, you get St Andrews tied for first with King’s and Oxford. So I think the methodology is reasonably sound.
Obviously over the days, weeks, months and years ahead, you’ll read many people trying to claim that their department really comes out best from the RAE. But if you’ve read this far, you’ll know better. The One True RAE Ranking of Philosophy Programs has St Andrews first.
All of this leads to a natural question. What’s the right beer to celebrate Rutgers’ success with?