The Universities

Why High University Rankings Might be Bad News

The recent and widely reported success of Australian universities in the Times Higher Education rankings follows similar success in other international rankings of universities, such as QS and Shanghai Jiao Tong. We also have our own Excellence in Research for Australia (ERA) exercise, conducted by the Australian Research Council (ARC), where the proportion of Australian university research rated at or above world standard has been rising. Australia’s Good Universities Guide, which is more focused on teaching and employment, paints a mostly rosy picture of the performance of our universities. There is not yet a ranking of university management—perhaps a gap which a sharp rankings entrepreneur has already spotted and will soon exploit.

No doubt deputy vice-chancellors in charge of research across Australia are swaggering around their offices thinking about whether to spend their performance bonus on a new Porsche or a BMW, enjoying the adulation from their large teams, and wondering how best to describe their wonderful success in their next application to move up the well-greased pole of university administration. Important matters indeed, but Australian taxpayers, academics and students might also be wondering what this recent Australian rankings success means.

There is a growing literature on how rankings culture shapes universities for good or ill (for instance Ellen Hazelkorn’s book Rankings and the Reshaping of Higher Education and the report of the UK team led by James Wilsdon), but I would like to focus here on what the recent Australian ranking successes might indicate about the health of Australian research.

One possibility is that Australian DVCs Research and their teams have just got better at describing their institution’s research in terms currently favoured by the rankers, rather than there being any actual improvement in the quality of research conducted in Australian universities. There has certainly been a vast increase in the resources devoted to documenting and spinning the research conducted in our universities. In the lead-up to Australian ERA submission deadlines, one can observe corridors full of well-paid staff, some of whom have been poached from other universities, working day and night to package the information about publications and grants they have extracted from their academics into a winning submission to the ARC. Some of the international rankings organisations require elaborate submissions, and a similar resource-intensive flurry of activity can be observed before these deadlines. Much more than filling out the form is going on here, but it is hard to tell how much of the recent rankings success owes to better spin rather than better actual research.

Even if we discount all this activity in university research offices and accept there has been an underlying improvement in the quality and quantity of Australian research, there are still questions to be asked. We need to understand some of the routes to research ranking success. Here are a few that raise questions about the connection between rankings success, the quality and quantity of research, and value for Australia.

Research specialisation and poaching. Quite a few DVCs Research in our universities have realised that hiring highly productive and highly cited researchers generates a lot of rankings bang for the discretionary university dollar. So they have sought out such researchers in Australia and overseas and offered them good deals. A problem for many DVCs, though, is that faculties and departments control hiring in universities, and deans and heads of departments often have broader concerns than rankings. So the cunning DVC Research has to find a way around these broader concerns, such as making money available to faculties for particular hires. Another strategy is to set up and fund a series of research centres which report to the DVC Research rather than to faculty deans. This of course goes with taking research funding from faculties, hence the tension that often exists between DVCs Research and faculty deans. In some universities we have seen teaching and administrative loads rise dramatically and funding dry up so much in faculties that research becomes almost impossible for the core academics of the university, in contrast to the growing number of well-resourced research-only academics.

Some particularly cunning DVCs Research have worked out that you don’t even need to hire these highly productive and highly cited research stars. You just have to affiliate them with the university. Rules for affiliations to count vary between ranking organisations, but for ERA it is generally necessary to have a 40 per cent appointment. The rules are a bit rubbery, because the research star does not have to spend this 40 per cent meaningfully engaged with the university, may continue to hold an appointment elsewhere, and there is room for negotiation over the salary they are paid. There are also easily exploitable loopholes under which a researcher’s publications count simply if the university is credited in the by-line of the publication.

In an extreme case an overseas star could be just emailing their CV and webpage photo to the DVC Research and adding a few words to their publication by-lines in return for the money, perhaps with a yearly Australian holiday including a brief change out of beach gear for a seminar at the research centre to which the star is nominally attached. The star may never meet any of the university’s core academics in their field, let alone get anywhere near students. Australian universities have also poached entire research centres in their quest for rankings success, with the centre carrying on much as before with no meaningful contact with the staff or students of the university with which it is now affiliated.

Prioritising pure over applied research. In many fields pure research has much higher status and tends to be more highly cited than applied research. This may not reflect its higher value or greater difficulty, but rather the fact that pure research is upstream of applied research. In other words, pure research is an input into and cited by applied research, not the reverse. So DVCs Research driven by rankings success will use all their financial and other influence to push their academics towards more cited pure research. Much applied research that has a large potential real-world impact doesn’t get done in a rankings-driven environment.

Prioritising scientific research over humanities research. Some fields will always generate more publications than others. This could be because they are journal-based fields like science and medicine rather than book-based fields like the humanities. Humanities scholarship also tends to be more time-intensive, and single authorship is the norm in the humanities rather than the multiple authorship common in the sciences.

The sciences and medicine require costly equipment and so generate more competitive research grants and philanthropic income than the humanities. Since publication volumes and research income are important in most of the rankings metrics, DVCs Research driven by rankings success will starve their humanities scholars and invest in the sciences and medicine. And so we have the absurd spectacle of humanities researchers thinking of more expensive ways of doing research so as to compete with the sciences in research income metrics.

Researching overseas topics. Australia is a small country compared to the US, China, UK and the European Union. This means research on Australian topics is relevant to fewer people, and for this reason less cited irrespective of its quality. Most of the major journals where rankings-driven DVCs push their academics to publish are located overseas, and not terribly interested in Australian issues.

So the pressure will be to work on issues that are relevant to the US, China, UK and the European Union. Of course, many issues of importance to other countries are also important for Australia, but we have often seen Australian topics dropped for topics of interest purely to overseas countries. In my area of economics, researchers work more and more with US data and focus on purely US policy questions in search of publications in top US journals. This pressure filters through to hiring practices, because someone with US background and networks is going to have better chances in the US journals than someone with similar Australian experience. There are Australian economics departments where you would struggle to find anyone with expertise or interest in Australian economic institutions or policy.

Putting all this together, a rise in Australian rankings could represent Australian taxpayers’ money being spent augmenting the salary of a US research star to work on topics of purely US interest, adding an Australian university affiliation to their publications with no meaningful contact with core staff or students of Australian universities. Or it could represent changing the university name on a sign outside a research institute located overseas (and perhaps putting a fluffy kangaroo in the window) to reflect the bill for the research now being paid by the Australian taxpayer rather than another country.

Australian taxpayers, academics and students need to look a bit harder before joining the celebration of Australian universities’ rankings success.

Paul Oslington is Professor of Economics and Theology at Alphacrucis College, the national college of the Australian Pentecostal Movement, which is on a path to university accreditation. Until Covid intervened he was 2020 Visiting Fellow at the Center of Theological Inquiry in Princeton, working on a monograph commissioned by Harvard University Press on the history of economic thinking in the Christian tradition.
