AI-enabled text-generating programs are here to stay; students should be guided to use such tools responsibly and ethically
ChatGPT and similar tools are here to stay, and it is futile to try to prevent their use. The best course of action is to guide university students to use such AI-enabled text-generation programs responsibly and ethically, according to an internal paper prepared by the American University of Ras Al Khaimah (AURAK).
With ChatGPT and several newer AI writers predicted to disrupt the academic world in particular, the study takes the sober view that it is best to embrace these tools and integrate them into teaching practices.
The study acknowledges that ChatGPT poses a serious threat to academic integrity, but also underlines ChatGPT’s potential to be a helpful instructional tool under faculty supervision, with several do’s and don’ts.
Prof. Stephen Wilhite, Senior Vice President for Academic Affairs and Student Success/Provost, AURAK, said: “It is very obvious that ChatGPT and other AI writers are set to have a profound impact on higher education. But apart from the obvious negative aspects, AI writers also have some intrinsic positive elements which can be used effectively to students’ advantage. The challenge is to find a win-win solution.”
To assess the gravity of the situation, AURAK’s IT Department conducted an experiment. Staff members generated an essay on climate change using OpenAI’s ChatGPT, then ran a plagiarism check using the SafeAssign tool embedded in AURAK’s LMS, Blackboard Learn. The result showed that 25% of the essay content was copied from or matched a source on the Internet. A further test with the demo version of the GPT-2 Output Detector showed a 100% match for the ChatGPT-produced content. However, detection dropped sharply when the ChatGPT content was paraphrased with Quillbot.
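For readers who want to reproduce this kind of detection check, the sketch below shows one possible approach. It assumes the open-source RoBERTa-based model behind the GPT-2 Output Detector demo (published on Hugging Face as openai-community/roberta-base-openai-detector) and the transformers library; the essay text and the way the score is reported are illustrative placeholders, not AURAK’s actual materials or workflow.

```python
# Minimal sketch of an AI-text detection check, assuming the RoBERTa-based
# detector that powers the GPT-2 Output Detector demo. The essay text below
# is a placeholder, not the essay used in AURAK's experiment.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="openai-community/roberta-base-openai-detector",
)

essay = (
    "Climate change refers to long-term shifts in temperatures and weather "
    "patterns, driven largely by human activities such as burning fossil fuels."
)

# The model labels text as "Real" (likely human-written) or "Fake"
# (likely machine-generated) together with a confidence score.
result = detector(essay, truncation=True)[0]
print(f"{result['label']}: {result['score']:.1%}")
```

As the AURAK experiment suggests, such detectors are far from foolproof: paraphrasing tools can push machine-generated text back toward a “human” classification, so scores like this should inform, not replace, an instructor’s judgment.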
The rising popularity of ChatGPT and other AI writers underlines the need to support faculty in maintaining academic integrity while incorporating appropriate use of AI tools.
According to Prof. Wilhite, the Academic Integrity/Statement on Plagiarism sections of the syllabus template need to be revised. He recommends that universities make it explicit that using information generated by ChatGPT and other AI writers without identifying the source would constitute plagiarism, which is a violation of the Honor Code and the Student Academic Integrity Policy.
Says Prof. Wilhite: “Anti-plagiarism software is not good at detecting ChatGPT-generated content. Providers of such software (e.g., SafeAssign, Turnitin) are endeavoring to improve detection. OpenAI, the developer of ChatGPT, has also released its own detection tool, the GPT-2 Output Detector, and other detectors are being marketed (e.g., GPTZero). However, use of paraphrasing software (e.g., Quillbot) greatly reduces detection of ChatGPT-generated content.”
The AURAK study recommends changes in assignments and assessments in educational settings. For example, in written assignments, students could be asked to connect personal experiences or events from the class to course concepts, as the AI writer has no access to personal experiences or class events. Students could also be asked to pair short written submissions with oral, in-class questioning about those submissions.
Another recommendation is to have writing occur in class, with a zero-tolerance policy for possession of any electronic devices during such writing exercises: “flipping” classes so that reading and viewing of lectures, videos, etc., happen at home, while writing about the material happens in class.
Further, if written assignments are to be completed outside class, universities should collect an in-class sample of students’ writing as a “baseline” against which written assignments completed outside class can be compared. However, AI writers will increasingly be able to mimic the writing style of users if provided with a sufficient sample of the user’s writing.
The study recommends greater vigilance when it comes to examinations, such as having multiple trained proctors present based on the number of students being tested, and having all electronic devices turned off and stored at entry to the exam room.
Despite the challenges, ChatGPT offers positive educational benefits: it can be used as a helpful instructional tool under faculty supervision and as a means to promote information literacy. Employers will increasingly expect graduates to use AI writers in the workplace, so gaining familiarity with these tools and learning to use them responsibly and ethically while at university will better prepare students for work after graduation.