Kaputnik is a conjunction of the word ‘kaput’ (meaning broken, damaged, destroyed) and ‘Sputnik’ (Russian robotic spacecraft).  Meaning is obvious: something which hasn’t come up to expectations.

Urban Dictionary

I must admit my heart sank a little back in August when I read John Blake’s speech to the national SU membership conference on OfS’s work to develop and improve its approach to student interest.

Not because of the intent of the speech; this is an area of OfS’s work where improvement is clearly needed. Not because of the content of Blake’s speech; I think the insight/input/information framing of student engagement is potentially really useful.

No, the bit that made my heart sink came towards the end, when Blake said:

As part of our focus on information, we will review the use students make of the Discover Uni website and the National Student Survey, to ensure we have the right tools to put relevant info in front of students.

It wasn’t, and still isn’t, clear what this means in terms of NSS.  Hopefully it’s nothing more than a look at how NSS data is disseminated and presented to students.  Hopefully it’s not another revisiting of the Survey itself based on a view that NSS isn’t the right tool.  I do fear it might be the latter, as it often feels that for some parts of the sector NSS is deemed to be kaputnik.

wrong from the start

Since its inception in 2005 there have always been those opposed to the NSS on principle; that group has probably increased with the growth of the literature questioning the value of student evaluations of teaching as a whole.  There were also some who didn’t question the value of student evaluation of teaching, but who were savage about the form NSS took.

That (plus a certain amount of self-interest) may explain why, when TEF was being developed back in 2016, we saw some universities (and indeed Parliament) express concerns about including NSS data in TEF’s core metrics – despite it being the only core metric that gave any element of student voice.

Despite such opposition and concerns, NSS became a fixture in the sector. And of course a survey tool first run in 2005 was bound to reach the point where it would need review and revision.

remake/remodel

The review of the NSS undertaken by HEFCE between 2014 and 2016, which resulted in important and measured changes to NSS from 2017, was both well-intentioned and helpful.  Albeit imperfectly, it shifted the Survey closer towards an emphasis on student engagement – not least with its addition of new sections on learning opportunities, learning community and student voice.

The review that led to further changes to the NSS in 2023 certainly wasn’t well-intentioned.

Initiated by the Department for Education in autumn 2020, it called for a ‘radical, root and branch review of the NSS’.  Essentially this was a direct attack on NSS, seemingly informed by little more than the exchange of ill-informed prejudices and conjectures in the darker corners of then in-vogue think tanks and some senior common rooms, with the aim of torpedoing the Survey.

This was implicit at the time.  The then Special Adviser to the Secretary of State for Education has recently been explicit that this was indeed an attempt at ‘abolishing the National Student Survey’.

holding the line

This was one of the rare occasions in recent years where OfS and (enough of) the sector were united in pushing back against an attempt to politicise higher education regulation, so the NSS survived.

But the cost of pushing back was a series of changes.  While some of these were positive, many made the NSS worse – e.g. removing the question on community; adding a blatantly politicised question on freedom of speech; including a question on mental wellbeing services which was well wide of the mark.

(One of the changes that caused most pushback from the sector – removing the overall satisfaction question for English HEPs – was something I personally couldn’t get excited about.  I didn’t believe in one-word Ofsted judgments for schools; I don’t believe in one-word overall TEF judgments; and I’m not going to support one overall NSS question claiming to summarise something as complex as a student’s experience in higher education.)

where now?

As I say, I hope I’m over-interpreting John Blake’s comment and we’re not on the cusp of another review of NSS.  My fear that we are is partly because the last time OfS did this I think it made NSS worse.  It’s also because I worry that it would provide another opportunity for longstanding opponents of NSS to launch another attack.

Perhaps I shouldn’t have that fear, but abolishing NSS still has currency in some higher education circles.  For example, when this year’s results were released that day’s 8AM Playbook email from Research Professional was dedicated to a lengthy attack on the Survey, asking ‘is it time to knock it on the head? What good is it doing?’ and dismissing it as ‘part of 20 years of bureaucracy’.  In this view NSS is, essentially, kaputnik.

accept the upside; manage the downside

I couldn’t disagree more.  While NSS has many faults, it still gives important insights into the views of our students on their educational experience.  Our commitment to working in partnership with our students, as members of the same academic community, means that we must reflect on and engage with these results.

Yes, NSS is partial and limited.  And of course there are traps in interpreting NSS data: failing to take account of statistical uncertainty around results; reading importance into statistically insignificant differences or changes in scores; kidding ourselves that there are quick fixes, when raising NSS scores is typically the work of multiple years.

But these issues can all be managed: by combining longitudinal analysis, comparator analysis and performance relative to benchmark; by setting NSS results alongside a wide range of other evidence (qualitative as well as quantitative); and, most importantly, by filtering all of this evidence through the expertise of our academic communities – both staff and students.  As we (continue to) do these things, we’ll benefit from the insights NSS can give us.
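To make the statistical uncertainty point concrete, here is a minimal sketch – using hypothetical cohort numbers rather than real NSS data, and the Wilson score interval as one standard way of expressing uncertainty around a proportion – of how a confidence interval can show that an apparently worrying year-on-year fall in a question’s positivity rate is well within the noise:

```python
import math

def wilson_interval(positive: int, total: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a proportion of positive responses (95% by default)."""
    p = positive / total
    denom = 1 + z ** 2 / total
    centre = (p + z ** 2 / (2 * total)) / denom
    half_width = (z / denom) * math.sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2))
    return centre - half_width, centre + half_width

# Hypothetical course-level results: 82/100 positive one year, 78/100 the next
for year, positive, total in [(2023, 82, 100), (2024, 78, 100)]:
    low, high = wilson_interval(positive, total)
    print(f"{year}: {positive / total:.0%} positive, 95% CI {low:.0%} to {high:.0%}")
```

At this cohort size the two intervals (roughly 73–88% and 69–85%) overlap substantially, so a four-point ‘fall’ is not, on its own, evidence of any real change.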

Rather than further attack and/or review, what NSS needs is a period of stability in which it is accepted for what it is – limited, but useful.  Allow the sector to continue to work with it as it is rather than revising it again (one of the other downsides of the 2023 changes was the break in the longitudinal data, which was often the most powerful tool for NSS analysis).

And if OfS does need to look at NSS again, limit that to issues of presentation and dissemination.  Oh, and finally get around to addressing the glaring lack of a taught postgraduate equivalent.
