Large decorative monkey puppet, clambering over a high rise building.
Photo by fito from Freerange Stock

I’ve spent all my adult life in some version of the higher education and research sector, first as a student and then throughout my career. I guess that probably makes me a prime candidate for being labelled, by the increasingly strident critics of higher education and public services, as a member of the lanyard classes.

My background, though, is perhaps a little different (if by no means exceptional) from what that (pejorative) label suggests.

fortune

First in my family to go to university, having been born in a northern industrial town and lived there all my life up to the point I left for university.

I was fortunate to have grown up in this particular town.  While huge numbers of such places were being decimated by industrial decline, my town had the good fortune to build the Royal Navy’s nuclear submarines, so it avoided the worst ravages of deindustrialisation in the 1970s and 1980s.

Then, of course, the Berlin Wall fell, and the fabled Peace Dividend washed away the business model that had sustained the UK defence industry, including the economy and society of my home town. Fortune had shifted.

The consequence was wave after wave of redundancies from the start of the 1990s.  The vision of a rosy, non-defence-related future for those who used to work in the defence industries turned out to be as naïve and unrealistic as the sceptics suggested (and yes, those sceptics included me).

I never experienced this directly as I never worked in ‘the yard’, but many family members, friends and others I knew did.  Round after round of job losses that seemed never-ending. Hugely consequential for those who lost their jobs; but also for those still employed, who were left torn between relief at still being in work and the knowledge that the game of ‘employment Russian roulette’ would soon start again, and that for most their luck would only hold for so long.

the state we’re in

The parallels with UK higher education in the 2020s are so obvious as to not need making, but I’m going to labour them anyway.

Many universities have already seen repeated cycles of job losses as they seek to tackle the crumbling financial model for UK higher education, and some are still in the throes of rounds of severance and redundancy.

However, even for those universities in a lull between rounds of job cuts it’s very difficult to see a future without further reductions in staffing.  Not only do the real-terms decline in the home undergraduate fee, the decline (for many universities) in international student fee income and the failure to fully fund research continue to bite; the current government is also repeatedly taking decisions that exacerbate rather than ameliorate the financial position of the sector. And we know that the imminent (still!) white paper on education and skills will have little positive impact on this.

And of course the job losses in the sector can’t happen without consequences.  Most obviously for the staff who have lost their jobs.  Clearly for morale among those who are left.  But also, inevitably, for one of the old faithfuls of higher education analysis, the Staff Student Ratio (SSR), which has recently been rising inexorably.

This was highlighted in a recent piece on WonkHE.  It has also manifested in the reports of those universities that shed posts over the last year, only to enter 2025-26 needing to employ or re-employ staff in order to be able to deliver their programmes.  Essentially we’re seeing what fills the pregnant pause that Shitij Kapur left in his Triangle of Sadness pamphlet, after highlighting the lower SSRs in the UK compared to many other national higher education systems.

why does it matter?

The logic for caring about SSRs is nicely captured in the definition used by one of the international league tables:

The Faculty-Student Ratio indicator is a measure of the number of academic staff that an institution has to teach its students.  The more academic staff resources are made available to students, for teaching, supervision, curriculum development, and pastoral support, the better the learning experience should be.

There has, though, always been significant criticism of the value of SSR as a proxy for educational quality and the academic experience of students.

the data

One area of criticism has been the longstanding, detailed and technical debate about how to define SSRs (strands of this can be followed in this post, this piece and this article, again on WonkHE).  And linked to this, we have recently seen HESA make changes to the underlying data, which this year led The Times and Sunday Times league table to exclude SSR from its calculations on the basis that HESA was no longer publishing ‘official’ SSRs.

That, of course, concerns what is done with the data once universities have submitted it to HESA, but there’s always been another problem with SSRs: how that data is compiled prior to submission.

I’m not for a moment suggesting universities knowingly submit incorrect data.  However, HESA submissions necessarily involve judgment calls by their compilers within institutions, and in some places at some times there is a little (and sometimes more than a little) monkey business played with the staffing aspect of the HESA return to help improve SSRs.  As I say, I’m sure the letter of the rules isn’t broken.  The spirit and intent of the rules, well …

So two significant caveats about SSRs.  But wait, there’s more.

the facts on the ground

The logic quoted above for paying attention to SSRs is impeccable.  However, the relationship between SSR and the experience of students is much less clear than this suggests.  Because whatever the SSR is on paper, just as important (if not more so) are the choices that schools/departments/subjects make about how they deploy those staff – the balance of modes of delivery (large-group/small-group teaching), the size of actual classes and so on.

These decisions are critical to the educational experience of students. The reasons such decisions are taken aren’t normally down to monkey business on the part of an institution (though where a university is seeking to free up more staff time to undertake research, it can come close to that).  More usually it’s just down to the pragmatics of programme delivery.  But either way, it does affect how much SSRs can actually tell us about the educational experience of students.

brave new world

And I suspect that we’re about to see a new critique of the value of the SSR, stemming from the rise of AI.  While I’m an AI sceptic, there is no denying the significant impact it is having on higher education, or that this will only increase.

As universities more consciously and purposefully seek to leverage AI, it won’t only be in relation to how they run themselves.  The impact on our educational provision itself will be significant.  And as this happens, how long will it be before we start hearing the argument, from universities forced by financial reality to reduce staffing levels, that to talk about SSRs is outmoded in the age of AI?  There has been a ‘paradigm shift’. That focusing on SSRs in this new world is like talking about the Monks to Books Produced Ratio (MBR) as a valid measure of the publishing industry’s activity following the invention of the printing press (monk-ey business perhaps?).

There’s something to such a narrative.  But there’s also an element of intentional framing about it, one that risks seeking to distort, to monkey around with, our perceptions of the importance of SSR as an indicator.

we create the world we live in

Is there a danger that the apparent financial inevitability of rising SSRs, the longstanding and well-known data issues and naïve (whether in good or bad faith) technological determinism and utopianism combine to make SSR as valuable in understanding higher education as the MBR was to an analysis of the sixteenth century publishing industry?

Hopefully not.

After all, even though it doesn’t really have the data to be able to do this in a meaningful or timely way at the moment, OfS’s current consultation on the future of TEF and quality assessment does include SSR in its proposed tool for the ongoing monitoring of risks to quality.  So there is, at least for now, a recognition that, however flawed we know it to be, SSR is a useful and valuable proxy (to be used alongside other relevant qualitative and quantitative data) for key aspects of educational experience.

But we also know the upward path UK higher education is on with its SSR, and that this will continue for some time yet.  This will almost inevitably lead to pressure from some quarters to reconsider how we think about, and how we value, SSR as an indicator of the health of the higher education experience.  Some of this will be valid, but as a sector we must not allow it to go too far.

Education is fundamentally a social, relational process. The number, as well as the quality, of educators relative to the number of students will always be of real importance.

