Sara-Jayne Makwala King4 July 2024 | 8:43

Students, here's why you shouldn't use AI to do your homework

'Sometimes it's great, but sometimes it will sell you up the river,' warns MyBroadband editor Jan Vermeulen.

(Image: How ChatGPT visualizes itself)

Clarence Ford speaks to Jan Vermeulen, editor at MyBroadband.

Unisa has been hit by a plagiarism scandal, with hundreds of students' exam scripts marked as 'disciplinary cases'.

The university alleges that some students used artificial intelligence tools such as ChatGPT, which were banned for the exams.

In a note to students last month, the business management department said it was busy preparing 'dishonesty reports' for the assessment.

The advancement of AI in academia is often referred to as a 'double-edged sword', presenting both opportunities and challenges.

Vermeulen explains how AI tools like ChatGPT work:

"They scrape huge bodies of text and other media...and then build a model...you feed it, some kind of prompt and then it tries to guess about how to respond to that prompt."
- Jan Vermeulen, Editor - MyBroadband 
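
For readers curious what "build a model, then guess how to respond to a prompt" means in practice, here is a minimal, hypothetical sketch in Python. It is a toy word-frequency model, not ChatGPT's actual architecture: it counts which words tend to follow which in a small body of text, then continues a prompt by guessing likely next words, one at a time.

```python
import random
from collections import defaultdict

# Toy illustration only: a bigram model built from a tiny body of text.
# Real large language models use neural networks trained on vastly more
# data, but the underlying idea of "predict the next token" is similar.

training_text = (
    "students use ai tools to write essays . "
    "ai tools guess the next word from patterns in text . "
    "ai tools do not know whether something is factual ."
)

# Count which word follows which in the training text.
follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def respond(prompt_word, length=8):
    """Continue the prompt by repeatedly guessing a plausible next word."""
    word = prompt_word
    output = [word]
    for _ in range(length):
        candidates = follows.get(word)
        if not candidates:
            break
        word = random.choice(candidates)  # pick one of the observed follow-ups
        output.append(word)
    return " ".join(output)

print(respond("ai"))  # e.g. "ai tools guess the next word from patterns in"
```

Note that the toy model has no notion of truth: it simply reproduces patterns from whatever text it was fed, which is the point Vermeulen makes next.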

Something to be aware of, says Vermeulen, is that AI tools are information gatherers; they are not programmed to detect falsehoods and misinformation.

"ChatGPT, and tools like it, do not know, whether something is factual or not."
- Jan Vermeulen, Editor - MyBroadband 
"Sometimes it's great, but sometimes it will sell you up the river."
- Jan Vermeulen, Editor - MyBroadband 

So, asks Ford, are tools like ChatGPT capable of producing a postulation?

"It can do a very, very good facsimile of one sometimes depending on the subject matter."
- Jan Vermeulen, Editor - MyBroadband 
