Luke Chafer discusses what goes into determining university rankings, highlighting the lack of consideration for student experience and wellbeing.
In the past year Glasgow University has failed sexual assault survivors, been accused of negligence, and been served with a damning report on racism. Yet the publication of this year's university rankings last month saw an impressive showing across the board. Internationally, the University was ranked 73rd by Quacquarelli Symonds (QS) and 86th in the Times Higher Education (THE) world rankings, as well as being named THE's Scottish university of the year. So, should we really pay any attention to what isn't much more than a show of obsequiousness within academia?
Like many other naive 17-year-olds, I based my UCAS choices on these league tables without a true understanding of what they meant. International rankings, which take into consideration only around 2,000 of the 18,000 institutions worldwide, are research-oriented. A 2016 study by the Higher Education Policy Institute found that 85% of the weighting in the QS and THE world rankings is based on research-related measures, though the two use different barometers to formulate their rankings.
One measure that props up the QS rankings is a survey of perceived reputation sent to over 1,000 academics – that's like asking your pals how good your patter is. THE, by contrast, leans heavily on citations, which gives its rankings a heavy European skew. What neither of them does is take any account of student experience. If rankings weren't primarily a marketing tool, it would certainly make more sense to brand these tables "world research" rankings, but that doesn't quite have the same ring to it.
On the other hand, national rankings attempt to measure student experience. The Guardian's league table is the only one that doesn't use a single research output measure, instead focusing on nine factors ranging from student satisfaction and teaching quality to expenditure per student. This year Glasgow ranked 11th in its table, despite the bad press and overall discontent on campus. Similarly, The Sunday Times table looks at a host of factors, from entry requirements to teaching excellence and research output – perhaps the most rounded of the methodologies.
Despite their more rounded approach, national rankings aren't without their issues. One factor they use is the proportion of good honours degrees awarded, but without an external regulator there has been a substantial rise in these grades over the past 10 years, making it difficult to tell how well students are actually performing at honours level. The danger with national rankings is that they follow a set of predetermined factors, meaning university management focuses on ticking those boxes rather than on what's best for the student body. Additionally, each year without fail, students at various universities boycott the National Student Survey, affecting the student satisfaction element of these tables – and Oxford and Cambridge rarely bother taking part.
Whilst rankings have their merit, they don't truly reflect the experience of students on campus, nor do they really try to. Although it's great to see our university climb again in this year's tables, it would be advisable to look beyond what is, in essence, a marketing tool in the world of commercialised higher education.