How should universities respond to robot writing?

“At one end of the spectrum are ‘the accommodators’ who see the inevitable rise of AI and conclude that fighting it is pointless. But this is a false dichotomy”

The arrival of automated essay-writing software has sent shockwaves through the global higher education sector. Academics and administrators are urgently debating how to respond to a technology that could make cheating a run-of-the-mill, free, and potentially acceptable behaviour for millions of university students.

Just last year, Australia’s higher education regulator, TEQSA, was busy blocking access to scores of essay mills – websites that offer to write essays for students, usually for a few hundred dollars and with turnaround times of 24 hours to two weeks. That response now feels like it came from a bygone era in the face of ChatGPT, the game-changing new AI tool that can respond to nearly any prompt by spitting out original text right before one’s eyes.

At a gathering of education leaders in Sydney last month, the tension in the room was driven entirely by AI. Across academia, from North America to Europe to Oceania, responses tend to fall into two camps. At one end of the spectrum is a group best described as “the enforcers”. This group sees some form of punishment as the only logical response to any breach of academic integrity rules. Students who break the rules must face consequences, and, if needed, universities should revert to “unhackable” assessments: face-to-face exams with pen and paper.

At the other end of the spectrum are “the accommodators”, who see the inevitable rise of AI and conclude that fighting it is pointless. Better to accept the arrival of our computer overlords and think about ways to collaborate with them for educationally constructive purposes. This, however, is a false dichotomy, and it obscures both how we should think about ChatGPT and how we should respond to its arrival.

The question we should be asking ourselves is: do we want to offload the intellectual burden of writing an essay, or even just a first draft? Writing is how we discover what we think about whatever topic we have been studying. There is nothing more fundamental to learning – and no skill more important to most knowledge-economy careers – than producing a coherent, well-argued, grammatically correct piece of writing.

Writing is also one of the hardest skills to learn, which is why watching ChatGPT produce text in real time is so mesmerising and, for those of us who struggle to get words onto the page, jealousy-inducing. It is also deeply troubling: the makers of ChatGPT acknowledge that the tool has no understanding of truth and is unreliable, giving different answers to the same question. In short, there is no “intelligence” in ChatGPT. There is only imitation.

So how should we respond as educators? Some universities are already going down the “enforcer” path: blocking access to ChatGPT, defining the use of any computer-generated content as a breach of academic integrity rules, and signalling that students who use it may be severely punished. While this is an understandable response, it deals only with the consequences of academic dishonesty, not its causes.

Perhaps a tool will be developed that can reliably identify AI-generated content. But a new bot will surely come along to defeat it, and the spy-versus-spy arms race will continue ad infinitum.

Better to start with the causes of academic dishonesty. If we can mitigate them, students will be far less likely to turn to the dark side – whether that means copying text off the internet, paying a third party for an essay, or using an AI bot to slap 1,500 words together in a matter of seconds. Put simply, students who feel cheated by their institution are more likely to cheat.

We now have a massified higher education system in which investment in the student experience has failed to keep pace with technology and student needs. Why does an improved student learning experience – including student satisfaction, staff wellbeing, and the scaffolding of critical skills – matter so much? Because students who feel their teachers know them and care about them are far less likely to take a shortcut to pass a unit or cheat their way to a degree, especially when faced with enormous financial or societal pressure to simply pass.

It is understandable that universities view the arrival of robot writing as an existential crisis for the sector, and it is quite possible that enormous numbers of students will be tempted to cross the line with a tool like ChatGPT. But the answer lies in neither enforcement nor accommodation alone: the sector must increase its investment in teaching and learning, student wellbeing, and belonging.

About the author: Jack Goodman is founder and chair of Studiosity, based in Australia and London. On February 28, Studiosity is hosting a free online symposium, “UK higher education’s thoughtful response to robot writing”, featuring leaders from Coventry University, Kingston University and the University of Exeter.