Context
- Artificial Intelligence (AI) tools (such as ChatGPT) are now readily available and can simulate human-written text in a variety of contexts (essays, summaries, lists, computer code, etc.), as well as generate images.
- AI tools are rapidly increasing in availability and sophistication.
- Tools that claim to be able to detect AI-generated text or images are unreliable.
- AI detection tools are subject to false positives, particularly if human-written text is first corrected using digital grammar checkers.
- AI detection tools are subject to false negatives, such as when AI-written text is subsequently edited or when AI tools are prompted to produce text that does not read as AI-generated.
- AI detection tools, such as the one used by Turnitin (which is available through Canvas), cannot explain the algorithmic logic behind a determination that a sample of text was written by AI. This is in stark contrast to plagiarism detection tools (such as Turnitin’s), which can match the text of a student assignment to sources available on the internet. Unlike Turnitin’s plagiarism detection score, Turnitin’s AI detection score is only viewable by the instructor; students will not be able to see if their work is flagged.
- The need to uphold the integrity of our degrees and the learning experience of our students must be balanced against the harm caused by false accusations. Falsely accusing a student of an academic integrity violation can damage the student’s mental health as well as the relationship between the instructor and the class. Therefore, given the above-mentioned concerns about AI detection tools, recommendations for handling suspected incidents of academic misconduct involving AI are included below.
Recommendations
- Instructors should communicate the following clearly to students through the course syllabus and, where relevant, reiterate it within individual assignment guidelines:
  - Whether the use of AI tools for class work is permitted. If permitted, describe the contexts or provide examples for students to use as guidelines (such as creating outlines, summarizing concepts, responding to discussion posts, revising titles, generating code, or individual tutoring). The course policy statement should include the instructor’s expectations for the use of grammar checkers and paraphrasing tools (such as Grammarly and Quillbot).
  - If relevant, how the instructor wishes AI tools to be cited or their use to be described within student work.
  - Whether the instructor intends to use AI detection tools to identify work that may be in violation of course policies.
  - Whether the instructor may request that students show evidence of their work process (such as notes or earlier versions of final work) should the instructor suspect inappropriate use of AI tools.
- Instructors should not base accusations that students have used AI tools inappropriately in assignments solely upon the results of any AI detection tool, such as the Turnitin AI score, given the current prevalence of false positives in detection technology. Additional lines of evidence* could include, but are not limited to:
  - Strong similarity of submitted work to the output of AI tools when the assignment prompt is entered by the instructor
  - Inclusion of fake or non-existent references, quotes, or other details in submitted work
  - Results of a discussion with the student in which they were unable to articulate the theme or argument of the work or identify the sources of the information it contains
  - Terms referenced that have no connection to the subject or object of analysis at hand
  - Topic sentences reused across several paragraphs, or reasoning that is circuitous and never reaches a definitive point
  - Awkward repetition of specific phrases or words throughout the text
- We strongly recommend that instructors familiarize themselves with AI tools (e.g., ChatGPT, Microsoft Bing, or Google Bard) and the text they generate when given course assignment prompts.
- Instructors should re-evaluate their course assignments and assessments in the context of the current availability and rapidly increasing sophistication of AI tools; the CTL has resources (see below) to support instructors in doing so.
- Where AI tools can be used to enhance student learning and/or prepare students for effective use of these tools in their future careers, we encourage incorporation of AI tools in coursework.
- Where use of AI tools would undermine or shortcut the learning goals of coursework, we recommend that instructors modify their assessments as needed to minimize the effectiveness of AI in constructing responses to assignment prompts or questions.
*Note: Many of these features are not unique to AI-generated text, and may become less common in AI-generated text as the technology develops.
Resources
The CTL is maintaining and updating online resources on AI tools in teaching and learning, as well as hosting workshops to help instructors address the use of these tools in their courses.