Use of ChatGPT in universities: 'We're much closer to a crisis than we think' – UJ Professor
If students are relying on AI to complete their assignments, are they truly acquiring the skills their qualifications are meant to certify?
CapeTalk's John Maytham interviews Andy Carolin, Associate Professor in the Department of English at the University of Johannesburg.
As generative Artificial Intelligence (AI) tools like ChatGPT become increasingly accessible, universities are confronting a difficult question: can they still ensure the integrity of a degree?
For some, AI represents a powerful tool for innovation and learning.
For others, it poses a threat to the values at the heart of higher education.
RELATED: 'Artificial Intelligence is being used as shield for incompetence' – World Wide Worx
“We’re much closer to a crisis than I think many of us are willing to acknowledge,” warns Carolin.
He says that the challenge is compounded by the fact that institutions currently lack reliable methods to detect the use of tools like ChatGPT.
"It's not just that students aren't attempting to write their own work, what we're also starting to find is that students aren't even attempting to read the work."
- Andy Carolin, Associate Professor in the Department of English – University of Johannesburg
Carolin says this means students may be graduating without ever having been properly assessed – or taught how to think critically.
He argues that this is a 'crisis' not just for students, but for the economy and the future of higher education itself.
"Of course AI skills are necessary and they should be taught, but there's no reason that AI should erase and force all the other disciplines to almost collapse under it."
- Andy Carolin, Associate Professor in the Department of English – University of Johannesburg