
Swamped by rankings
Lately, chance has confronted me with what I thought was a topic I had left behind: university rankings. Having written my PhD thesis on university rankings, you can imagine how fed up I ended up with the whole business. Still, the fascination they provoke and the stimulating discussions they lead to (usually criticizing their use) pull me back in from time to time. While I thought these relapses were anecdotal, funnily enough they have all concentrated in the last 15 days. First came the announcement of the release of the 2019 edition of the Leiden Ranking, which this year includes gender and Open Access indicators; I was involved in the development of the latter. Then, just a couple of days ago, our paper on Mining university rankings was finally accepted for publication in Research Evaluation, and we uploaded an OA version of the manuscript.
Anecdotes aside, I have been reflecting on how university rankings have evolved since 2014, when I stopped following them closely. What we show in the paper is not surprising, although the evidence for it was missing. Namely, that university rankings, no matter the formula they use or the weights they give to the variables considered, end up measuring the number of publications and, partly, normalized citation impact. In fact, by simply sorting universities by their H-index, you get a very close approximation of what any university ranking looks like. This strong reliance on publication and citation indicators has attracted strong criticism, which I will not go into now, but which can be easily summarized by stating that universities do many things other than producing publications. Collini's book What are universities for?, a reflection on universities' mission, gives a good account of the impossibility of both measuring institutions' performance with a single indicator and then comparing institutions with each other. Given the strength of such positions, by the end of my PhD I saw this research topic as exhausted, with not much left to report.
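To make the H-index exercise concrete: an institution's H-index is the largest h such that it has h publications each cited at least h times, so sorting by it collapses output volume and citation impact into a single figure. Here is a minimal Python sketch of that sorting exercise, using made-up citation counts rather than any real bibliometric data:

```python
def h_index(citations):
    """Largest h such that there are h publications with at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts per publication, one list per university.
universities = {
    "Univ A": [50, 40, 12, 9, 3, 1],   # h = 4
    "Univ B": [30, 25, 20, 18, 5],     # h = 5
    "Univ C": [8, 7, 2],               # h = 2
}

# Sorting by H-index yields a league table much like any published ranking.
ranking = sorted(universities, key=lambda u: h_index(universities[u]), reverse=True)
print(ranking)  # ['Univ B', 'Univ A', 'Univ C']
```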
grounding politics in statistics is elitist, undemocratic and oblivious to people’s emotional investments in their community and nation
William Davies, How statistics lost their power – and why we should fear what comes next
But rankings have been able to respond to such criticisms, perhaps not satisfactorily, but respond nonetheless. Their latest twist has been the creation of new league tables targeting specific goals. In that sense, the inclusion of these two new indicators in the Leiden Ranking is evidence that a more informative approach, one which highlights different aspects of universities' performance, can lead to more thoughtful insights and stimulate discussion among university leaders. The THE Rankings now also include league tables based on the SDGs. And although the methodology behind them raises important concerns, these moves towards a diversity of views and measurements might lead to a more responsible use of metrics (sciento- or otherwise), and shift the debate away from whether numbers should be used to how numbers can be used.