A new study published this week by the plagiarism and originality detection service Turnitin looks at high schools and how use of the service has affected the amount of plagiarism detected over time.
They did this by looking at the papers submitted to the service and counting those that had more than 50% unoriginal content. Then, starting with the first “true usage start year”, a full calendar year that accounts for at least 10 percent of the school’s lifetime submissions (rather than a year in which the service was merely being tested out), they created a baseline that they then compared to the years that came after it.
The results are striking. Nationwide, there is an initial increase in the amount of plagiarism detected, rising a few percent over the course of the first two years of use. After that, however, the rates begin to fall quickly and, by the end of the eighth year, there is a 33% drop in the level of unoriginal writing seen.
The results, which were broken down by state, showed that 43 of the 50 states saw a reduction in the amount of plagiarism detected over the time period.
Considering that most evidence indicates plagiarism is on the rise, the study definitely indicates that plagiarism detection software has an impact on student behavior. However, it also shows that not all uses of the software are equal and that, if a school wants to get the maximum benefit from a tool like Turnitin, it needs to be committed to it as more than just a quick fix.
The Basics of the Study
The basic concept of the study is fairly straightforward. Turnitin looked at the first full year of usage of the service and determined the percentage of papers that were unoriginal (papers with more than 50% unoriginal content). Then, using that percentage as a baseline, it compared the expected amount of unoriginal submissions in later years against the actual amount found.
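To make the baseline comparison concrete, here is a minimal sketch of the arithmetic involved. The function names and all of the numbers are hypothetical illustrations of mine, not figures or code from Turnitin’s study:

```python
# Hypothetical sketch of the study's baseline-comparison arithmetic.
# All counts below are invented for illustration.

def unoriginal_rate(flagged, total):
    """Share of submissions flagged as more than 50% unoriginal content."""
    return flagged / total

def change_vs_baseline(baseline_rate, year_rate):
    """Percent change in the unoriginal rate relative to the baseline year.

    Negative values mean less unoriginal work was detected than expected.
    """
    return (year_rate - baseline_rate) / baseline_rate * 100

# Hypothetical school: 12% unoriginal in the first full usage year,
# 8% unoriginal by year eight.
baseline = unoriginal_rate(120, 1000)
year_eight = unoriginal_rate(80, 1000)

print(round(change_vs_baseline(baseline, year_eight), 1))  # -33.3, i.e. a ~33% drop
```

With these invented numbers the result happens to echo the study’s aggregate 33% decline, but the point is only the mechanics: each later year is expressed as a percentage change against the first full usage year, not as an absolute plagiarism rate.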
The study is careful not to provide any figures that directly compare the states to one another. It does this by not providing the baseline amount of plagiarism. Instead, it only provides the change against that baseline. Therefore, there’s no way with this data to determine which states have the highest or lowest amount of plagiarism, which would likely be unfair as Turnitin is not used evenly across the states.
Initially, 21 states saw an increase in unoriginal papers in the second year. This raised the national aggregate average to +3.6%, indicating an initial rise in the detection of unoriginal work. However, after that initial bump, the data began to trend downward, reaching an aggregate reduction of 33.4% below the baseline by the end of year eight.
However, that reduction was not uniform. Some states, like Massachusetts, saw a massive drop in detection rates (83% in Massachusetts’ case). Delaware, however, saw a steady increase in the amount of unoriginal papers detected, culminating in a 285% increase by year eight.
The results also fluctuated wildly in some cases. Alaska, for example, initially saw a 23.9% increase in detection, then dropped to 25.4% below the baseline, only to rocket up to 151% above it two years later and then fall back to 20% below it by year seven, the last year available for the state.
These fluctuations were most common in states with low populations, such as Alaska and, to a lesser degree, Montana, indicating they might be caused by a relatively small number of unoriginal essays throwing off the averages.
Larger states, such as California, Texas and New York, followed a much more consistent arc, either rising initially and then falling or falling steadily through the time period.
In fact, all of the eight most populous states saw drops in the amount of unoriginal writing detected.
However, it’s the aggregate data that presents the clearest picture and, there, schools saw an average reduction in unoriginal work of 5.6% per year.
But while the data is interesting, there are some limitations that do have to be kept in mind.
Concerns and Limitations
The biggest criticism of the study, other than its source and concerns about bias, is that a drop in the unoriginal work Turnitin detects doesn’t necessarily mean students are cheating or plagiarizing less; they may have simply gotten better at hiding their behavior.
But while that may be true in some cases, it seems unlikely that it would account for the entire 33% drop. Though students certainly do seek out ways to circumvent plagiarism detection software, Turnitin is constantly tracking those techniques and building counter-measures to them.
Since the tricks to defeat Turnitin come and go quickly, it seems unlikely that they could cause an overall, steady decline in unoriginal work detected over eight years. Though students could be getting better at rewriting content, defeating the software that way isn’t as simple as it seems. Turnitin, like most originality and plagiarism detection services, requires heavy rewriting before a paper will pass inspection. To do that and still make sure the work is of high enough quality to earn a good grade takes a great deal of effort, often more than would be required to create an original work.
In short, it’s a path few students are likely to take.
Still, it doesn’t mean that students aren’t cheating on their essays in other ways, just that they aren’t using copy/paste plagiarism as much. Plagiarism detection tools can only do so much and it falls on the teachers to both make the best use of those tools and to prevent/detect other forms of unethical behavior.
Another limitation to the study is the lack of a control group, a selection of high schools not using Turnitin (or any other service), to see how their rates of unoriginal submissions changed over the years. This makes it difficult to draw any conclusions about the software’s overall effectiveness, as there is nothing to compare these results to.
But even with those concerns, there’s a great deal that schools and instructors can glean from these numbers, including some very useful information on what to expect when such tools are introduced to the classroom.
Lessons and Takeaways
For schools and instructors who are either looking at implementing plagiarism detection tools or have done so, there are a few things to keep in mind:
- Plagiarism Detection Software is a Long-Term Commitment: The impact these tools have is gradual. They require teachers to learn how to incorporate them into their classrooms and how to use them as teaching tools.
- Not All Use is the Same: Some schools and states seem to see a drastic reduction in unoriginal papers submitted. Others don’t, or even see increases. Without further study it’s difficult to know why, but it’s likely a product of how the software is used and integrated into the classroom.
- It Reduces, But Doesn’t Stop Unoriginal Work: Finally, even in the greatest success stories, such as Massachusetts, there were still unoriginal papers being submitted. While they were far fewer in number, some students still turned in unoriginal work, proving that, even under ideal conditions, there’s no way to completely eliminate the problem.
All in all, teachers need to remember that plagiarism detection tools are just that, tools. They aren’t a replacement for human judgment and they can’t fix the problem for you. They simply help point out similar text and highlight likely cases of plagiarism.
They can be a significant help, both in detecting unethical behavior and in teaching good writing skills, but they can’t do it alone.
In the end, Turnitin’s study is very interesting and it makes a good case for the use of plagiarism detection tools, including their own, when used correctly.
The most important thing, however, is to continue to teach on the subject of plagiarism and citation. Creating a climate of fear around the topic doesn’t help students learn; it does little more than encourage better cheating.
Instead, it’s better to use the tools not as a means to catch cheaters, but as a means to educate and help students understand how to work with outside content and cite sources correctly.
While that may not be the use many think of when considering adopting such software, it’s the use that does the best service to the students.
Disclosure: I am a paid consultant for iThenticate, a company that is owned by iParadigms, which also owns Turnitin.