In New Zealand, lecturers have proposed changing the way assessments are done in universities in reaction to the emergence of AI-enabled generative tools. According to Chris Whelan, Executive Director of Universities New Zealand, universities and schools should reconsider how they evaluate students.
Paul Geertsema, a Senior Lecturer at the University of Auckland, responds similarly, agreeing that universities need to rethink their reliance on essays as a form of assessment. “The reason why we have students write essays should be reevaluated. We need to rethink what it is that we want people to be able to do, because these systems can generate remarkable essays on most subjects.”
New Zealand has rolled out an update to its anti-plagiarism software that can recognise AI-generated material with 98% accuracy, aimed at preventing students from using such platforms to cheat on their assignments. With software that can detect the use of artificial intelligence systems now active, the struggle against plagiarism and non-original content has been given a significant boost.
Nevertheless, Whelan acknowledges that AI tools are powerful and advantageous for society; indeed, US surveys show that 25–50% of college students have used AI for various tasks. He therefore supports initiatives in business and the workplace that use these tools to increase effective competition and create wealth for the community.
He says that accepting the technology and incorporating it into assessment is consistent with international practice. He encourages assigning tasks that allow the use of generative AI under specified conditions, then having students evaluate the first draft as an assignment in itself. In other cases, he noted, assignments that generative AI cannot replicate can be set instead.
According to the Head of the Secondary Principals Association, Vaughan Couillault, it would be challenging for students to get away with using artificial intelligence on homework, because teachers can tell when work is beyond a student's usual standard. He suggested that educators require students to present their rough drafts, or assign work that must be completed in class rather than at home.
Simon McCallum, a Senior Lecturer in software engineering at Victoria University, believes the software update will help catch misuse of AI. However, he expects it to be helpful only for a short period, and only against students who use these tools amateurishly.
“Problematically, if you’re going to report someone for copyright infringement, you need to have solid evidence that they violated the regulations, not simply a hunch that it was probably an AI that produced this,” he said. “There’s a whole bunch of these other AI tools that, when you layer them on top of each other, make any of those identifications much less certain.”
Dr McCallum knows that many students use generative tools and other “large language model” applications. They use them often, and some are becoming quite skilled. “Those that are skilled in prompt engineering and can provide high-quality input to the AI will be able to generate highly realistic results that are far superior to their original work, in much less time.”
He argues that since AI will be widely used across occupations, students should learn how to use it, but he is concerned that some will do so mindlessly. He feels the issue is so crucial that the entire education system should pause so that lecturers and teachers can figure out what to do.