after the event

Image by pch.vector on Freepik: person and robot with computers sitting together in a workplace.

It’s very tempting to look at the current storms in UK higher education and focus on their specificities and peculiarities.  A few moments’ pause, though, brings to mind that, as always, significant similarities sit alongside these differences.

The challenges we face as a sector, such as increasing demands on our resources at a time of reducing real income, are common in many parts of the UK’s public sphere.  And a recent Harvard Business Review article shows that this isn’t just happening to the UK and its public services (in which I’d include higher education), claiming that ‘the pendulum of business is swinging back to cost consciousness and efficiency’ before going on to offer advice on how to achieve this including:

No company can afford to overlook the transformative power of conventional and generative AI. Organisations typically see AI as a growth driver but tend to overlook its tremendous potential in cost management and operational efficiency.

This brings out an interesting perspective for UK higher education. 

a starting point

There has been an explosion of blogs, articles and the like on the application of AI to what we do in our universities, largely relating to core academic activity.  The initial interest and concern about the implications of AI for academic integrity have been joined by a focus on implications for curricula, skills development, and many other areas of universities’ academic activities.

As well as looking at these in their own right, we are starting to see work on the implications of AI for academic staff – most notably the recent article by Watermeyer, Phipps, Lanclos and Knight on Generative AI and the Automating of Academia.  This was based on a survey covering both academic and professional services staff, and we’re promised future publications based on the responses from the latter.

That’s very welcome, as to date I’ve not seen much published on the implications of AI for professional services in universities.

It feels like we’re at an inflexion point for professional services in higher education.  Clearly AI will transform the way we run ourselves as universities, particularly in respect of professional services.  And this raises questions we need to think about before we leap into this transition, rather than after the event.

economic benefit

Of course the common expectation is that AI will lead to greater efficiencies, as routine work is automated.

However, the impact of new technology can be unpredictable.  As Tim Harford has highlighted, expectations in the 1980s that the spreadsheet would lead to decimation of the accountancy profession proved wide of the mark: ‘there are more accountants than ever; they are merely outsourcing the arithmetic to the machine’.

Harford suggests that this was because spreadsheets were:

a technology that automated the more tedious tasks in accountancy, burnishing jobs that were already well-paid and interesting.  It may be that generative AI does something similar on a grander scale, letting the humans deal with the big creative questions while the machine handles the nagging details.

I know that to date many in the sector have suggested something like this will happen as AI is applied to how universities run themselves: resource will be freed up to allow the creation of higher quality, more value-adding roles.

But will it, in the context of the huge financial pressures so many universities are now under?  Or will universities be forced, or choose, to make these efficiencies without redirecting the freed-up resource to other activities and roles?  If it’s the latter, then in addition to the impacts on individuals and on organisational practice and culture, what might be the effect on local labour markets? In many of these, professional services roles at the local university are a significant element (particularly among higher-quality jobs), so what might the rise of AI mean for the economic benefit to local communities of having a university in their midst?

values-driven

Universities would like to see themselves as values-driven organisations.  And rightly so, given their status as charities and the resulting obligation to deliver public benefit. This raises important issues as we increase the use of AI in our universities.

Lawrie Phipps, for instance, has highlighted the need to address the ethical challenges arising from the use of AI in our education and research, and the opposition that calls to do this can encounter.

Ethical issues are just as relevant in any increase in the use of AI to support how we run ourselves.  For example, the working conditions of the data workers critical to LLMs, to which Helen Beetham has drawn attention, are something that should factor into the approaches that universities take, given their commitments to social justice.  And with our institutional targets on Net Zero, the environmental impact of AI is another issue we need to consider and address in deciding where and how we employ AI to run our universities.

And these are just two of many potential ethical issues that need to be considered.

people-focused (1)

We also need to think through the data sharing and security implications of the ways that we use AI to run our professional services, and act on the conclusions we reach.

A recent study has shown how large language models (LLMs) have been used by care providers to create care plans for their patients, in effect feeding the LLM confidential private data.  As well as generating the care plans requested, the LLM may also use the data it is given as training data – and potentially reveal this confidential information to others outside the care provider.

In a phrase I’ve seen a fair amount recently, giving data to an LLM is like emailing it to the world.

Universities hold and process significant amounts of personal data.  Many of the ways in which we will use LLMs to run ourselves will not involve personal data, but some may. Where they do, we need to pay attention to the data management issues. Yes, to ensure that we’re compliant with regulation.  But much more importantly, to meet the aim of that regulation: to ensure we do not cause harm by sharing personal data with those who should not have it.
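By way of illustration, here is a minimal (and deliberately naive) sketch of the kind of safeguard this implies: stripping obvious personal data from a prompt before it leaves the institution for an external LLM. The patterns and the student ID format are hypothetical placeholders, not a real institution’s scheme; any genuine deployment would rest on proper PII-detection tooling and a data protection impact assessment, not a handful of regular expressions.

```python
import re

# Illustrative patterns only: hypothetical formats, not a real
# institution's identifiers, and far from exhaustive.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"\b0\d{3}[ -]?\d{3}[ -]?\d{4}\b"),
    "STUDENT_ID": re.compile(r"\b[A-Z]{2}\d{6}\b"),  # hypothetical ID format
}

def redact(text: str) -> str:
    """Replace likely personal data with labelled placeholders
    before the text is sent to an external LLM."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Draft a letter to jane.doe@example.ac.uk about student AB123456."
print(redact(prompt))
# The email address and student ID are replaced with placeholders.
```

The point is not the code itself but where it sits: the redaction happens on our side of the boundary, before anything reaches a third-party service whose data retention we do not control.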

We need to think this through. 

(And in the meantime keep our fingers crossed that initiatives in LLM use that have taken place ‘below the radar’ don’t mean that this horse has, in some cases, already bolted).

people-focused (2)

Another aspect that we need to think about carefully is the impact of AI on the development of professional knowledge and expertise in our professional services.

In many instances knowledge is developed and internalised, and understanding achieved, through practice – by someone typically starting with more routine activities, before progressing to less routine ones. And often it is the deeper knowledge and understanding that is gained through this practice that allows us either to identify when what appears routine actually isn’t; or to respond effectively when things go wrong.

There is significant potential for AI to reduce more routine, labour intensive work.  We do, though, need to think carefully about how ‘routine work’ is defined so that we don’t inadvertently remove the opportunities for people in professional services to develop through practice the types of knowledge, competence and understanding that will allow them rather than the AI to have agency and control of processes and the implementation of policy.

intentional and considered

None of which is to suggest that incorporating AI into how we work in professional services is something we can or should seek to avoid.  It is necessary, essential and inevitable.

But we need to do this in a thoughtful and considered way that is true to our values and aims, ensuring that we avoid the potential pitfalls.  It’s critical that we think and work through these issues now, and not leave it until after the event – when choices (and potentially mistakes) have been made that perhaps can’t be undone.
