What’s really wrong with the NSS?

The Higher Education Policy Institute has kindly published an article I wrote on the interim plans for reform of the National Student Survey.

The proposed changes are contained in the OfS’s Phase 1 Report of its NSS Review, which was sparked by a somewhat untoward statement by the DfE last year that the NSS was responsible for “dumbing down standards”. No evidence for this claim was offered, and it was exactly the opposite of… well, everything that they and predecessor governments had ever previously said about the NSS’s role in enhancing the quality of higher education.

Indeed, the credibility afforded to the NSS previously meant that it was a key metric used in the TEF (the Teaching Excellence Framework, as it then was, now called ‘the Teaching Excellence and Student Outcomes Framework’). Its weighting as part of TEF was downgraded, however, when student opposition to the exercise led to widespread boycotting of the survey.

The main reason the government has gone sour on the NSS, though, seems to me to be that it doesn’t endorse their political narrative about higher education – or, even if it does, the signal is too noisy, too nuanced. For example, the NSS doesn’t say that the only good education is one that results in a job. It doesn’t say that our universities are all world-beating while at the same time managing also to say that they’re full of woke academics and snowflake students. And it fails woefully to confirm that traditional redbrick and Russell Group unis are better than jumped-up polys.

Indeed, the university with the strongest record of performance in the NSS since its inception is – wait for it – the Open University. What should we make of that? There are multiple explanations for its NSS success, not least the fact that the survey is taken as students approach graduation and, for OU students, that’s likely to have been a long, hard slog of many years, involving considerable commitment and sacrifice. Anyone who wasn’t going to give a good report will probably have fallen by the wayside by that point, or at the very least will be convincing themselves that it was all worth it after all. Another explanation is that the OU does an amazing job for its students, far exceeding their expectations and therefore yielding high satisfaction.

What it doesn’t tell us is anything absolute. No wonder the government has lost interest in the NSS – it doesn’t tell them anything clear or politically helpful, and even what it does tell them is not what they wanted to hear.

By chance, the DfE does happen to be right that the NSS needs reforming. It’s just that it’s not for the reasons they imagine. As my HEPI article explains, the problems lie in (i) imagining that the NSS can ever be about informing prospective students helpfully, (ii) the snapshot, data-dip nature of a survey, and (iii) the over-emphasis on satisfaction as a measure of quality when it is in fact a function of expectation compared with delivery.

The reform needed is to shift to a longitudinal national survey of student engagement that tracks shifting patterns throughout a student’s time at university. Engagement has been shown to be an indicative precursor of positive learning outcomes. If you can show that a student has been effectively engaged throughout their studies, you’ve got a good indicator of effective education.

Satisfaction measures are poor proxies that will never tell you much and will always be too easily gamed or misinterpreted. They do not, however, dumb down anything that wasn’t dumb already. 
