Plagiarism Detection with Generative AI Tools


Applies to: UM-GPT, ChatGPT, any Generative AI tool, Canvas


Audience: Faculty and staff (including GSIs) seeking information about plagiarism detection tools. Examples include iThenticate, Turnitin, and VeriCite.


U-M does not currently provide a campus-wide plagiarism checking system in the teaching and learning context. Use of these tools has not gained broad support on campus, which is why there is no funded campus-wide offering. Typically, instructors educate students about plagiarism and rely on an honor code. In February 2024, the Office of the Vice President for Research (OVPR) announced that it had licensed the iThenticate tool for researchers to assess their own work for originality; this tool is not to be used for evaluating student work. OVPR is working to expand the University of Michigan's institutional license for iThenticate and will share details with the U-M research community once they are finalized.

Detection tools are imperfect, and no software can currently detect AI-generated text with certainty. AI detection tools can help identify situations where further inquiry into the use of AI-generated text may be needed, but they should not be treated as definitive evidence of cheating.

Language models generalize and summarize existing knowledge based on probability predictions of word sequences rather than copying it verbatim, so it may be impossible to identify their use with certainty. While detection tools such as Turnitin or GPTZero may report a probability of AI authorship, they are easily circumvented and cannot provide definitive proof of cheating. False positives and false negatives are possible, and even likely.

U-M does not recommend the use of AI-detection technology at this time, given these tools' high error rates.

Additional Information

Proper citation

Need additional information or assistance? Contact the ITS Service Center.




Article ID: 11803
Created: Thu 3/14/24 2:50 PM
Modified: Thu 3/21/24 3:11 PM