
Saviour or siren? Considering the uneasy relationship between academia and ChatGPT and the search for a middle ground

16 March 2023 | Applicable law: England and Wales | 7 minute read

It was with a certain amount of trepidation that I attempted my first prompt using the new chatbot, ChatGPT,1 at the end of last year, to find out whether it could convincingly answer a legal question. I purposely asked something very straightforward and factual – about governor payments – and (perhaps to the relief of all lawyers) the answer was mediocre.

I wouldn't say it was actively wrong, but it could not provide more than a sensible Google search would have produced. A simple factual question is, of course, only the tip of the iceberg, for questions of this nature rarely reach our desks. It was clear that, at least in its current guise, ChatGPT is not capable of the level of complex analysis and, dare I say it, creative thinking, which a lawyer would require. Here was a lesson learnt: if you have expertise in an area, you will probably be able to see the limitations of the results before you, but if you are a novice, there is a risk of relying too heavily on its content.

And so it is that schools and universities have had to turn their attention to the 'problem', or perhaps 'opportunity', presented by ChatGPT. In a world where knowledge and expertise are held in high esteem, could ChatGPT devalue these currencies and lead to a total change in how students are taught and examined? Should teachers and professors across the academic spectrum be nervous that a machine will take their jobs away? And will students graduate with the resilience, ownership and depth of knowledge of previous generations, if any question they may be asked can be answered on their phones? Could ChatGPT deprive younger generations of the satisfaction of hard work and intellectual rigour, or will it save them time and enable them to focus on other, now more useful, tasks? And frankly, is the use of ChatGPT simply cheating?

While AI has a meaningful role in other spheres (it can help pilots fly planes and rockets, assist traders in navigating the stock markets and outplay grandmasters at chess), the worlds of academia, journalism and even poetry have never been threatened in the same way. Perhaps, in part, this is why ChatGPT has caused so much controversy. The result has been polarised views across the academic sphere, with the two main camps, for and against, producing different policies depending on which school or university you look at. It is currently believed that almost 40% of all UK universities have either banned ChatGPT or are in the process of revising their policies to prevent its use for assessed assignments. Some schools have 'cancelled prep', while others may move their curriculum away from coursework in favour of examination conditions.

'Traditionalists' (for want of a better term) would argue that using ChatGPT is cheating and ought to be banned in order to ensure students obtain full value from their courses. Access to ChatGPT is like handing students a potentially inaccurate answer book along with the question. Some learning may follow, but there is a danger of over-reliance on an unreliable source. Traditionalists will point out its flaws and the damage it could do to the education of budding young minds.

On the opposite side are the 'progressives' (if I may call them that), who would argue that, as ChatGPT is here to stay and only likely to get better, we might as well benefit from its uses and integrate it into the way students learn. And there would be little controversy in saying that AI has benefits in education, so perhaps it is better not to bury our heads in the sand. A high-profile example of this approach is the International Baccalaureate, which will permit ChatGPT to be used provided any quotes are correctly attributed. Matt Glanville, Head of Assessment Principles and Practice at the IB, likened ChatGPT to the use of calculators or spell-checking software.

There may be a middle ground, in which conversations are had between students and their teachers to explain the strengths and weaknesses of AI and how best to use it. While it can be an enormous time saver for certain straightforward questions, students need to understand the impact ChatGPT may have on their learning, in terms of both quality and accuracy.

Perhaps its greatest weakness, which the makers of ChatGPT have never denied, is that it can be inaccurate. And given that arguably its greatest strength is natural language processing (i.e. the ability to communicate like a human), the great danger is that a student (or teacher) may rely heavily on its convincing responses, much as they might rely on the opinion of a person who does not fully know their subject but has the arrogance to speak as though they do. In human terms, we call this 'blagging', and ChatGPT is equally capable of it. By way of example, a major criticism of ChatGPT is that it may resist providing source information and has been reported to invent fictitious sources when asked for its references. In addition, one teacher commented that, in attempting to see whether ChatGPT was capable of marking their students' exams, they input the same exam response multiple times and, on each occasion, ChatGPT gave the student a different mark.

OpenAI's CEO, Sam Altman, said: "ChatGPT is incredibly limited, but good enough at some things to create a misleading impression of greatness… it's a mistake to be relying on it for anything important right now." He added: "It's a preview of progress; we have lots of work to do on robustness and truthfulness."

There are also question marks over whether ChatGPT could be considered a biased source. It is only as strong as its algorithm and the materials used to train it. If those materials favour or focus on certain groups or topics over others (something we cannot tell), then anyone relying on its output may inadvertently propagate those biases further. In December, Steven Piantadosi, a UC Berkeley professor, asked ChatGPT to produce code to check whether someone would be a good scientist based on their race and gender, and confirmed on Twitter that it had predicted that good scientists are white and male.

 While ChatGPT will almost certainly continue to strengthen its offering (and thus, any consideration of its use must be a continuing conversation), its current weaknesses are important to factor into any discussion with teachers and students. It may sound like a panacea but at present:

  • it may be a biased source
  • it has the capacity to be inaccurate while appearing convincingly correct
  • we cannot easily tell the source materials it relied upon to generate its responses and some users say it has made up fictitious sources when asked
  • the answer received is only as good as the prompts inputted by the user – the user will need a certain amount of knowledge to ask the right questions in order to obtain the right answer, as well as to discern whether the answer is correct or not
  • answers may be inconsistent; a user can ask ChatGPT the same question multiple times and receive a different answer each time.

And yet, it is unquestionable that ChatGPT is useful. Perhaps university-level scholarship and the legal profession need not yet worry about its abilities, but if a user is looking for a simpler, let's say average GCSE-standard, response, then it has much to offer. It can simplify complex language for a student who may be struggling to understand a difficult piece of text, or summarise an article in a more meaningful way. In that way, it can act as a virtual assistant to smooth the journey for a student who needs a little extra support. It is comparable to Google, but surely no respectable student would dare simply to quote 'Google' in their bibliography. So it is with ChatGPT: we would not recommend using it as a serious source in its own right for academic purposes, not so much to confront the problem of 'cheating' but simply because any student carrying out research would be better off reading a range of texts produced by genuine experts in their fields. ChatGPT may signpost them in the right direction, but ideally it would not be the student's chosen end destination.

Once a reasoned discussion with students has taken place, they may be better able to discern when its use is justified for saving time and acting as a virtual assistant, and when it would be actively detrimental to both their learning and their futures to promote it to the role of 'imposter', where it acts as their mouthpiece. They cannot take it with them into an examination, and they may not be aware that new software such as GPTZero is now available to help educational institutions spot when AI has been used to generate an answer. While these programs will need to evolve, they are already being used within schools and universities to great effect in checking longer documents such as coursework and essays.

Ultimately, students who are determined to cheat will always find a way. Relying on a friend, relative or essay mill to 'do your coursework' comes with no plagiarism-detection software attached; only the suspicions of the teacher stand in the way. While ChatGPT may make cheating more accessible and covert, students who fully understand its uses and limitations are, hopefully, better placed to disregard it for this purpose.

With particular thanks to Charles Wallendahl, Head of Theology, Philosophy & Ethics at St Edward's School, Oxford, and a specialist in the relationship between education and technology.

1 - Chat Generative Pre-trained Transformer, more familiarly known as ChatGPT, is an online artificial intelligence chatbot launched in November 2022 by the artificial intelligence laboratory OpenAI. ChatGPT's inception has created waves in the technology sector due to its incredible versatility and its ability to create content and interactions that appear more authentically "human" than previous artificial intelligence models. For example, ChatGPT can write and debug computer programs, write poems or film scripts, and compose music while emulating the style of famous writers and composers. In its current guise, it relies on data produced up to 2021.

This document (and any information accessed through links in this document) is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from any action as a result of the contents of this document.
