Several cases of suspected chatbot cheating at AU: What you need to know before the summer exams
For now, AU is taking a hard line on the use of new artificial intelligence chatbots, and it will continue to do so throughout the summer exam season. There are already a handful of cases of suspected chatbot cheating at the university. Students think the rules for using chatbots are too restrictive.
Since ChatGPT was launched last November, chatbots – also known as large language models – have been actively debated at AU. What can they be used for, and how can we use them? Should we be using them at all? Are they a threat? Or a tool? Do they even have a place at a university?
A few months after ChatGPT was launched, AU took the provisional decision to regard the use of chatbots and all other forms of artificial intelligence as exam cheating, unless “the course description specifically states that the use of AI software is permitted in the exam”.
This is still the university’s position, now that the summer exams are just around the corner, explains Anna Bak Maigaard, who is deputy director of AU Student Administration and Services.
“It is important that students complete their exam assignments independently and individually. The general rule is that students may not use chatbots or similar technologies that can generate text, code, illustrations, images, graphic overviews or tables, unless it is stated in the course description that they may. This means that the board of studies can choose to allow the use of chatbots or similar technologies when it makes good academic sense to do so. So students should check the course description, and, if there is no mention of using chatbots in the exam, they may not use ChatGPT or similar technologies,” says Anna Bak Maigaard.
Students are using ChatGPT
Several members of teaching staff have started to use artificial intelligence tools, both in their own preparation and as a resource for students. So it’s safe to assume that students are using chatbots themselves as well. Carsten Bergenholtz, associate professor at Aarhus BSS, asked his students how many of them used chatbots; of the 550 students he surveyed, 200 responded, with 25 per cent saying they use chatbots regularly and 50 per cent saying they use them occasionally.
A survey conducted at universities in Sweden shows that 35 per cent of the 5,894 students surveyed are familiar with and regularly use chatbots. 28 per cent know about them but rarely use them.
On the Reddit website, a user recently started a thread asking whether people thought AU would run students’ exam assignments through AI detection software. The thread has received 81 comments.
Although AU has banned the use of artificial intelligence for the time being, it has put together some helpful material on how to use chatbots like ChatGPT, in which it adopts a constructive approach to the technology. For example, on AU Studypedia, it states that “chatbots can be a useful resource for you as a student if you use them intelligently, responsibly and critically”. The same website also contains quotations from students describing how they use chatbots.
The use of chatbots – in particular ChatGPT – in different situations has given rise to several grey areas. For example, when students complete a take-home exam assignment, AU cannot control which tools they use. And is there really a difference between asking ChatGPT a question and asking your roommate? In the same way, some people view ChatGPT as an advanced version of Google, which we must assume students use when completing exam assignments at home.
Anna Bak Maigaard is aware of these grey areas, but she emphasises that using chatbots for exams – unless stated in the course description – will be regarded as cheating.
A handful of cases of suspected cheating
Anna Bak Maigaard confirms that there is “a handful of cases” of suspected chatbot cheating at AU.
“At the moment we are processing a number of cases in which we suspect students have used chatbots to answer some or all of their exam questions. These cases are being handled in the same way as all other cases of suspected cheating; the principle at stake is whether the students have completed the work themselves,” she says.
The deputy director would prefer not to comment on the specific way cases are processed.
Student Council: The rules are too restrictive
Jeanette Kusk, chair of the Student Council at AU, explains that the council understands why the university initially had to adopt a restrictive approach to chatbots. But the Student Council is not concerned about new artificial intelligence technology; on the contrary, it fully expects that the technology will become part of students’ daily lives and, later, of their professional lives in the labour market. For this reason, the council does not completely agree with the university’s current rules. The Student Council is itself drafting a paper in which it addresses the use of artificial intelligence in class and in exams.
“We think the rules are too restrictive, and we think there needs to be a larger conversation around the use of AI. But we understand that it’s difficult to make a decision about it now, and we’re pleased we’re able to have an honest debate about it,” says Jeanette Kusk.
She views ChatGPT as a kind of study mate who can help generate ideas and titles and provide an assignment with direction. Several students have already incorporated ChatGPT into their study routines, which is why it could prove problematic to cut them off from the technology when it comes to the summer exam period, Jeanette Kusk claims. She also points out that there are different types of exam. For take-home assignments, for example, it’s difficult to see how using ChatGPT is different from using Google as a means of acquiring background information.
“It’s interesting that the university is banning something that is actually just a machine that has collected a lot of data. This data is also available in the library or on the internet – it has just been put into a machine that finds the most probable answer. Students are of course not banned from going to the library, and it can be easier to get an overview of information by asking a chatbot,” says Jeanette Kusk.
She is in no doubt that some students will use chatbots in their exams this summer.
“I’m absolutely certain that some students will end up using them even though they’re not allowed. Students shouldn’t intentionally break the rules, but I think people agree that it’s difficult to see why asking a chatbot is different from asking Google or a professor.”
“But using chatbots properly requires a wealth of digital skills, and this is something we need to discuss in detail. It will probably also be incorporated into courses on ethics and methodologies and into teaching more generally, so we can learn to use chatbots and discuss how we’re using them,” says Jeanette Kusk.
AU appoints an AI working group
In a column in Omnibus in February, Pro-rector Berit Eika engaged with the topic of artificial intelligence tools. She recognised that, even though AU has prohibited the use of chatbots in exams, going forward, the university needs to “adopt a much more multi-faceted approach to artificial intelligence”.
“Like all technology, ChatGPT is neither good nor bad in itself. Everything depends on how we use it. It makes no sense to be uncritically optimistic (we rarely are at universities). But we must also refrain from being overly sceptical and overlooking the potential of this new technology,” she wrote in her column.
In March this year, AU’s Committee on Education, which is chaired by Berit Eika, decided to appoint an AI working group. The group will look into four different aspects of artificial intelligence: the technical aspects, the legal aspects, the exam-relevant aspects and the educational aspects. This includes identifying the advantages and disadvantages of chatbots and possible plagiarism detection tools that AU can employ. The working group is due to be appointed before the summer holiday.
Universities Denmark is also drafting a paper to outline its position on the new artificial intelligence technologies.