Universities Drop AI Detection Tools, Cite Ineffectiveness

Several New Zealand universities have abandoned the use of software designed to detect artificial intelligence in students’ work, citing unreliability and the need for new approaches to academic assessment, RNZ reports.
Massey University recently confirmed it no longer uses AI detection tools, following earlier moves by the University of Auckland and Victoria University of Wellington. For Massey, the decision came after it also stopped using automated exam monitoring systems last year due to a major technical failure.
A Massey spokesperson told RNZ that detection tools were ineffective, noting students were already allowed to use AI responsibly in much of their coursework.
Dr Angela Feekery, president of Massey’s Tertiary Education Union branch, said academics had been inconsistent with the technology. Some used results as a guideline, while others accused students of cheating if a certain percentage of AI-generated content was flagged.
“Research shows AI detection doesn’t work well,” Feekery said. “Students can easily bypass these systems. Turning them off was the only sensible option.”
She noted academics could still identify AI misuse through methods like checking version histories or relying on professional judgement developed through years of marking. However, she admitted universities were still grappling with how best to assess students in the age of generative AI.
At the University of Auckland, graduate teaching assistant Java Grant said Massey’s stance made sense. “It’s extremely difficult to prove if work is AI-generated unless it includes obvious signs. Many instructors are now moving to in-person, paper-based assessments to reduce risks, though it increases workloads significantly,” he said.
Computer science lecturer Dr Ulrich Speidel added that remote exams remained vulnerable. Students could use second devices or outside help, with an estimated 30–60 percent likely to do so, based on his experience and research. Automated monitoring, he said, was also prone to hacking.
Massey confirmed its online assessments are not invigilated but form part of a broader system ensuring student work is validated at key stages. The university said it no longer relies on unreliable detection, instead focusing on secured assessments where AI use is prohibited, such as labs, oral exams, and studio-based work.
“Turning away from detection does not mean we are handing academic work to AI,” Massey said. “Rather, we recognise the environment is shifting. We are developing AI literacies so students can use these tools ethically and responsibly, while maintaining academic integrity.”
Approaches to AI
How New Zealand’s eight universities approach online exam security and detection of AI in student work:
Auckland
- Uses online invigilation for remote exams.
- Does not endorse AI-detection tools.
AUT
- Does not run remote, online examinations.
- Unclear whether it uses AI detection software for student work.
Waikato
- Conducts some exams online, some of them remotely.
- Uses an AI-writing detection tool.
Massey
- Offers remote, online open-book assessments and tests without automated monitoring.
- Does not use software to check for AI use in student work.
Canterbury
- Uses monitoring tools for online assessments.
Lincoln
- Uses videoconferencing technology to monitor remote online exams.
- Uses software to check for AI use in student work.
Victoria
- Seldom uses digital exams and does not use online proctoring.
- Does not use AI detection.
Otago
- Has very few digital exams.