Essay Checkers, AI, and Plagiarism

Destiny Akinode
December 8th, 2025

What is an essay checker?

Traditionally, “essay checker” referred to tools that analyze a text for spelling, grammar, and vocabulary mistakes. Some also flagged problems with citations, as seen in Recite. Turnitin, a popular educational tool, has long been used to check for plagiarism.

Currently, the AI boom has pushed educators to consider checkers that scan for AI-generated content.

Are AI paper checkers accurate?

High false positive rates (FPRs) are a major concern for educators: an accusation of academic dishonesty based on a false positive can be deeply damaging to the teacher-student relationship.

However, Pangram’s FPR of 0.01% makes it a reliable detector for use in higher education. Third-party evaluations have corroborated this, concluding that Pangram is the most accurate commercially available AI detector.

AI Essay Checker vs Plagiarism Checker

An AI essay checker detects the use of AI in a particular text. A plagiarism checker determines whether a text reuses existing external material.

Pangram is a commercial tool that detects both AI-generated and plagiarized content, flagging each as a separate finding when analyzing a text.

Can AI Plagiarize Human Work?

A plagiarism checker compares an essay against previously published texts to find matching passages. An AI essay checker scans a text for language patterns characteristic of current AI models.
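To make the comparison step concrete, here is a toy sketch of the kind of text-matching a plagiarism checker performs. This is an illustrative assumption, not Pangram's or Turnitin's actual method: it measures how many overlapping word trigrams two texts share.

```python
# Toy illustration: compare two texts by the word trigrams they share.
# Real plagiarism checkers are far more sophisticated, but the core idea
# of matching overlapping n-grams against known sources is similar.
def ngrams(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Fraction of shared word trigrams between two texts (0.0 to 1.0)."""
    grams_a, grams_b = ngrams(a, n), ngrams(b, n)
    if not grams_a or not grams_b:
        return 0.0
    return len(grams_a & grams_b) / len(grams_a | grams_b)

source = "the quick brown fox jumps over the lazy dog"
essay = "the quick brown fox leaps over the lazy dog"
score = jaccard_similarity(source, essay)  # 0.4: 4 of 10 trigrams shared
```

A single changed word breaks the trigrams around it, which is why light paraphrasing lowers the raw match score and why production tools layer additional techniques on top of exact matching.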

LLMs like ChatGPT generate text using large datasets and have been known to plagiarize from this data. In a journal article, Amy B. Cyphert, Associate Professor at the West Virginia University College of Law, notes that:

“Research suggests that the more often copyrighted material is included in a dataset, the more likely it is that an LLM will produce an output that includes unaltered text from that work.”

Memorization happens when an LLM reproduces phrases from text in its training set. Intentionally or otherwise, LLMs can be prompted to reproduce large portions of others' work. This means the final result of prompting may be plagiarized in part or in full.
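One rough way to see memorization at work is to look for long verbatim runs shared between a model's output and a known source. The sketch below is a simplifying assumption, not how detectors actually operate: it finds the longest sequence of consecutive words two texts have in common.

```python
# Toy sketch: find the longest run of consecutive words that an output
# shares verbatim with a source text -- a crude proxy for memorized text.
def longest_shared_run(source, output):
    src, out = source.lower().split(), output.lower().split()
    best = []
    for i in range(len(src)):
        for j in range(len(out)):
            k = 0
            # Extend the match while consecutive words keep agreeing.
            while (i + k < len(src) and j + k < len(out)
                   and src[i + k] == out[j + k]):
                k += 1
            if k > len(best):
                best = src[i:i + k]
    return " ".join(best)

run = longest_shared_run(
    "to be or not to be that is the question",
    "the model wrote to be or not to be today",
)  # "to be or not to be"
```

A long shared run (say, ten or more consecutive words) is a strong hint of verbatim reproduction; short runs are usually coincidental common phrases.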

While researchers continue to debate more nuanced definitions of plagiarism, academic integrity policies at universities state that plagiarism is the use of someone else’s work without crediting them. The University of Michigan notes that plagiarism can be intentional or unintentional. Oxford University details several forms of plagiarism, including unacknowledged verbatim quotation, copy-pasting from the internet, unattributed paraphrasing, collusion, and failure to acknowledge assistance.

So in an academic institution:

When a student submits AI-generated work, they may be turning in plagiarized human-written work as well!

Can Humans Plagiarize AI-Generated Work?

By some standards, maybe!

When a student submits work generated by AI without crediting the AI, the question of academic dishonesty arises. The University of South Florida’s policy states:

“Since AI chatbots and other generative AI tools generate new text and images in response to prompts, using material from generative AI could be considered more like ghost writing than plagiarism.”

The Committee on Publication Ethics notes that AI cannot be considered an author in a research paper because “AI tools cannot meet the requirements for authorship as they cannot take responsibility for the submitted work. As non-legal entities, they cannot assert the presence or absence of conflicts of interest nor manage copyright and license agreements.”

Some academic institutions treat uncredited use of AI-generated content as plagiarism outright:

“Plagiarism is the act of presenting ideas, research or writing that is not your own as your own. Examples of plagiarism include: Copying another person’s or an AI tool’s actual words or images without the use of quotation marks and citations attributing the words to their source.” - City University of New York

“Plagiarism is when you take someone else's ideas and portray them as your own, even unintentionally. If you are using AI to summarize an idea or write your work, it's not 100% your work.” - Ohio University

And in other cases, educators shoulder the responsibility of creating AI policies themselves, guided by advice such as:

“Clear guidelines for individual courses or assignments enables students to focus on the learning objectives you want them to achieve. Because students are often not sure when or if they can use Generative AI, policies can be most helpful when they are tailored to your specific course and assignments, since some uses of AI might support student learning, but other uses would interrupt important processes students need to work through on their own.” - UPenn

“Creating clear policies around AI use is essential to maintaining academic integrity and setting student expectations. These policies should outline when and how AI tools like ChatGPT, Perplexity.ai, or Grammarly can be used and ensure students understand the ethical implications of AI in academic work.” - NYU

From this small sample of university policies, we can see the varying ways institutions view AI usage and plagiarism. While AI is a tool and not an author, presenting AI-generated output without acknowledgement is dishonest, and educators should set clear expectations for their students. For more on enforceable course policies, see Professor Christopher Ostro’s nuanced approach to addressing AI cheating.

Pangram takes a pro-transparency approach to AI use and the use of external content, especially in academia. If you are interested in our model, try out Pangram’s AI and plagiarism detection checkers for free!
