Omnibus prik

I’M ALLOWED TO USE GAI IN MY STUDIES AND FOR MY EXAMS - BUT SHOULD I?

For newer students at Aarhus University, GAI has been an option since the start of their studies. Various chatbots can summarise lengthy course texts, offer feedback on assignments, and break down complex topics into digestible chunks. But digestible chunks alone don’t make a balanced academic diet, several AU researchers with expertise in generative AI and learning say.

GAI stands for Generative Artificial Intelligence and has been permitted for use by students since the start of the academic year in 2024. Photo: AU Photo/Jens Hartmann

Since the beginning of the autumn semester in 2024, the use of GAI has, in principle, been permitted at Aarhus University as a tool for academic work – including exams, Bachelor’s projects, and Master's theses. 

GAI has been given its own place under the "Learning tools" tab on Studypedia, Aarhus University's reference work for student life. The tab is right next to Brightspace. 

Here, you can find useful advice on using GAI and read about chatbots and their various features.

But just because GAI is allowed, does that make it a good idea? Omnibus posed this question to a number of AU researchers with insight into the interaction between learning and GAI. 

“LEARNING CAN BE TEDIOUS, LEARNING IS DIFFICULT, LEARNING REQUIRES FRICTION.”

According to Carsten Bergenholtz, associate professor at the Department of Management, chatbots can be a useful tool. 

Carsten Bergenholtz teaches the course Scientific methodology to sixth-semester business administration students at Aarhus BSS and is also a member of the working group that contributed to the development of AU's official guidelines for the use of GAI. For his own teaching, he has created a chatbot for the course that answers questions based on the syllabus and provides references to it.

"I have created a chatbot for my students because it provides answers based on the course syllabus and helps them identify what they don't understand," he says. 

“A chatbot can also be useful for providing feedback before oral examinations. There isn't much feedback from teachers on programmes with large classes, so it's effective at posing exam-relevant questions and providing constructive feedback on your responses,” he says. 

Carsten Bergenholtz understands why some students use GAI to save time reading and solving cases.

But that doesn’t teach the student anything, he believes. 

"If you get it to read research articles, solve cases and write half of your assignment, you may get a diploma certifying that you have passed the course, but you are shooting yourself in the foot," Carsten Bergenholtz says. 

“Learning can be tedious, learning is difficult, learning requires friction. It’s hard to accept that it takes blood, sweat and tears to truly understand something,” he says. 

“When you’re presented with something in the syllabus that you don’t understand, you must read it again and work harder to get the information to stick in your mind in the right way. Essentially, it is the neurons in the brain that need to be moved around," he says.

"The students don't know everything. Otherwise, there would be no point in studying. That's why it's incredibly easy to fall into the trap of relying on chatbots to give you easy-to-read answers," Carsten Bergenholtz says.  

ADVANTAGE FOR THE ACADEMICALLY STRONG – THE WEAKER RISK LAGGING BEHIND

Cathrine Hasse, professor of educational anthropology at the Danish School of Education at Aarhus University in Emdrup, also understands the dilemma. 

Together with fellow researchers Maja Hojer Bruun and Jakob Krause Jensen, she has researched students' use of GAI in their studies in the humanities. 

"The students use it in very different ways. Those who use it effectively maintain a professional mindset. They understand how to ask the chatbot questions in the right way so that it can supplement their professional knowledge," she explains. 

"Those who are not as academically strong risk lagging even further behind because their academic skills become dull. They’re simply not as good at prompting it properly and take what it says at face value," Cathrine Hasse says. 

The significant difference in the use of chatbots places students in very unequal positions and reinforces existing academic divides, according to Cathrine Hasse.

"GAI is actually quite unsuitable for mass education, as practised at universities, because the strong students get a boost, while the weak ones fall behind," she says. 

"It's important to know how to use it," Cathrine Hasse says. 

TEACHERS SHOULD GUIDE STUDENTS

Exactly this is also the starting point for Madeline Marie Marello, a PhD student in psychology. In her course Applied Gerontology: Health and Well-being in Older Adulthood, she introduced her students to various chatbots last autumn to demonstrate their flaws and shortcomings. 

As Cathrine Hasse points out, there was also a significant difference among Madeline Marie Marello's students in how they utilised GAI. 

“I wanted to level the playing field a bit,” Madeline Marie Marello says.

“I wanted to demonstrate the downfalls, introduce the newbies to it, and highlight that it’s not going to write your paper for you,” she says.

Each student used different chatbots and compared the results. 

None of the results were particularly good, according to Madeline Marie Marello. 

It was a workshop at AU Library that inspired Madeline Marie Marello to introduce her students to GAI in her teaching. 

“It’s (ChatGPT, ed.) really good at summarising, but it’s not going to do any critical thinking for you,” she says. 

“It’s a good tool to maybe point you in the right direction, but it’s not accurate,” she says.

HOW SHOULD A PRINCIPLED STUDENT RESPOND WHEN AI MAKES EXAMS MORE DEMANDING?

As mentioned, GAI and all its functions are permitted by default, unless otherwise stated in the course description. 

This means that it’s up to the students to decide how and to what extent they use GAI. As long as its use is declared, and generative AI is not explicitly included in the methodology section of the exam assignment, the student's written work will not be assessed based on their use of AI. 

That said, declaring your use of GAI does not, of course, make it permissible to plagiarise or to leave GAI-generated text unreferenced. 

Carsten Bergenholtz believes that this may affect students differently.

“There are the conscientious students who want to do things themselves, and then there are those who take a shortcut through a chatbot – and that’s perfectly legal. It must be frustrating," Carsten Bergenholtz says.

“It’s not for some higher spiritual purpose that students should think for themselves – it's because that's how you learn," he says. 

"If, as a student, I declare that I have used the chatbot for the core learning objectives in a subject, then surely that affects what I have learned? Shouldn't that also affect the grade? I can understand why the students are uncertain about this," Carsten Bergenholtz says. 

Cathrine Hasse understands the dilemma and confusion.

"The students are very much left to their own devices. Very few teachers know anything about this. It’s a skill employers will be looking for,” Cathrine Hasse says. 

However, Cathrine Hasse emphasises that students may be cheating themselves if they consider the chatbot to be a time-saving tool in their studies. 

"Students must actually adhere to academic standards even more strictly," she says. 

She also predicts that the use of GAI will result in increased expenditure for educational institutions: "This can never be a cost-saving practice – quite the contrary. We need more resources to upgrade the students' skills," she says. 

"We need more resources to reintroduce double testing, where you defend a written exam orally," she says. 

However, Carsten Bergenholtz believes that resources for more oral examinations are difficult to find.

“Examinations that require a lot of supervision or oral examinations are very expensive. Not everyone always has the resources to do so," he explains.

GAI SHOULD NOT BE SHOWN THE DOOR, BUT USED WITH CARE

Cathrine Hasse and her colleagues are currently working on guidelines for students of educational anthropology at the Danish School of Education. The guidelines have not yet been adopted. 

Here, students are encouraged to read or write on their own before asking the chatbot. In addition, students should be wary of easy shortcuts, the draft reads. 

In addition, the draft guidelines mention that a GAI-generated summary of a text can’t provide students with anything other than superficial knowledge. 

However, GAI can be a helpful translator, according to both the draft guidelines and Cathrine Hasse herself.

“They are very good at translations. That's how I feel I've benefited from it," she says. 

Carsten Bergenholtz also emphasises that a chatbot can be used in the study process, but this requires students to know how.

"I strongly believe that all students should gain an insight into what they need to be aware of if they want to use AI appropriately," he says. 

"It should be centralised and should be a module that students take in their first semester," Carsten Bergenholtz says. 

"We shouldn't be creating students who are dependent on a chatbot, but rather students who know what they can do with a chatbot," he says. 

WHAT DOES THE CHATBOT ITSELF SAY?

Omnibus has also put the dilemma to Microsoft Copilot – which is, after all, perfectly permitted. The prompt: GAI is, in principle, permitted for student use at my university – but should it be used? Are you cheating yourself? What do you think about that dilemma?

It took about ten seconds for it to list the advantages and disadvantages of GAI in academic work. 

"You're not necessarily cheating yourself – but it depends on how you use GAI," it spits out. 

According to Copilot, the advantages of using GAI in your studies include support for understanding, writing assistance, and time optimisation, with the argument that if GAI summarises the texts and generates ideas, the student has more time for reflection and critical thinking. 

Cathrine Hasse and Carsten Bergenholtz would both disagree with this.

They both express the view that students will not learn anything if they are simply given a GAI-generated summary of the syllabus. 

The chatbot also mentions a number of risks associated with the use of GAI. These include the risk of superficial learning, dependency, and venturing into an ethical grey area, as well as plagiarism and a lack of academic integrity. 

Therefore, according to Microsoft Copilot, it’s not a question of either/or, but rather how GAI is used. It suggests that students can use GAI as a sparring partner and encourages them to reflect on the output they are given. In addition, Copilot encourages students to check their university's guidelines for using GAI. 

This text is machine translated and post-edited by Cecillia Jensen