South Australian universities to allow use of artificial intelligence in assignments, if disclosed


Universities should stop panicking and embrace students’ use of artificial intelligence, AI experts say.

South Australia’s three main universities have updated their policies to allow the use of AI as long as it is disclosed.

The advent of ChatGPT, a language processing chatbot that can produce very human-like text, sparked fears students would use it to write essays. Anti-plagiarism software wouldn’t pick it up because ChatGPT isn’t plagiarising anything; it produces new text in response to prompts from users.

Flinders University, the University of Adelaide and the University of South Australia have adjusted their policies to allow AI use under strict controls.

Flinders University’s deputy vice-chancellor, Prof Romy Lawson, said earlier this month they were concerned about “the emergence of increasingly sophisticated text generators … which appear capable of producing very convincing content and increasing the difficulty of detection”.

“[But] instead of banning students from using such programs, we aim to assist academic staff and students to use digital tools to support learning,” she said.

University of SA senior lecturer in education futures Vitomir Kovanovic said all universities should allow AI and teach students how to use it.

“Absolutely. They must. The alternative is the middle ages. Going to pen and paper,” he said.

“You cannot stop it and, even if you could, it’s a temporary solution. The next one you won’t be able to. It’s futile. And you shouldn’t be doing it, you should be teaching them how to use it – they’re going to use it in the workplace anyway.

“It’s like having a driving school but teaching people how to ride horses.”

He said that, in the short term, universities would update their policies, but in the long term they would need to change the way they assess students and integrate AI into the process.

He likened it to the introduction of calculators, which freed maths students from spending time on long division and in turn allowed teachers to set more complicated assignments.

The Group of Eight – Australia’s eight leading universities – said it would make “greater use of pen and paper exams and tests” this year, but would ultimately redesign the way assessments are done to deal with AI.

Charles Darwin University AI expert Stefan Popenici, who has just published Artificial Intelligence and Learning Futures about higher education’s adoption of AI, said accepting the use of AI was “the only way”.

“This is going to be around, like it or not. So banning it is ridiculous,” he said, describing the SA universities’ move as “a step in the right direction”.

“There are many possibilities to use technology for good,” Popenici said. “This is what higher education should be all about. This is in front of us, we can use this to our advantage.

“There’s a crisis of literacy … people don’t know how to read and write, we should use any tool that’s available to us.”

Cheating on university assignments has been a hot topic recently because of the pervasiveness of contract cheating, where students buy bespoke assignments online. AI such as ChatGPT offers something similar, but is more easily accessible, cheaper, and involves no human writer.

The University of Sydney specifically mentions using AI as a form of cheating, although a spokesperson said the university would eventually need to teach students how to use it.

Universities Australia is working on updating its academic integrity guide, and meeting with experts to discuss the rise of AI and how to approach it.

The body’s acting chief executive, Peter Chesworth, said universities were “closely reviewing [policies and procedures] in light of technological advances” and emphasised that cheating was “never the answer”.

Cheating threatens the integrity and reputation of a university degree, and students caught doing the wrong thing can face serious consequences, he said.

Sally Brandon, an associate communications lecturer at Deakin University, has recently detected the use of bots in almost one-fifth of assessments, sparking concerns that the use of artificial intelligence to cheat in exams is widespread.

Last week, singer-songwriter Nick Cave dissected a song produced by ChatGPT “written in the style of Nick Cave”, calling it “bullshit” and “a grotesque mockery of what it is to be human”.
