Copyscape Tops Plagiarism Checker Testing

Dr. Debora Weber-Wulff, a professor at the University of Applied Sciences in Berlin, has announced the results of her 2008 plagiarism software test (in German, with an English translation available) and, in something of an upset, Copyscape Premium has taken the top spot.

Copyscape, which finished third in the previous round of testing, had its premium service rated first this year, with its free service taking the third slot. Both versions handily beat out popular academic plagiarism checkers such as Turnitin and SafeAssign.

Though the results may come as a shock to many in the academic community, especially those who pay for accounts at services targeted at universities, those who have followed Dr. Weber-Wulff’s work will likely not be surprised at all. Many will remember her work exposing iPlagiarismCheck following the previous round of testing.

Nonetheless, her study offers a great deal for those interested in plagiarism to think about.

Methodology

To conduct the tests, Dr. Weber-Wulff and her team compiled 31 papers with known amounts of plagiarism, ran them through each of the applications, and rated each application’s effectiveness on a scale of 0-3. They then rated usability on a similar scale against a set of criteria, including the company’s website and cost transparency, page layout and captions, and workflow and navigation. The points were then totaled and each system was ranked on the following scale (a short sketch of the banding follows the list).

  • Very good: 72-80
  • Good: 60-71
  • Satisfactory: 48-59
  • Sufficient: 40-47
  • Poor: 0-39
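
To make the banding concrete, here is a minimal Python sketch that maps a system’s total score to its rating band. The thresholds come straight from the scale above; the worked example uses Copyscape Premium’s reported 70-point total, and the function itself is purely illustrative, not part of the study.

    # Rating bands as published in the study (total points out of 80).
    RATING_BANDS = [
        (72, "Very good"),
        (60, "Good"),
        (48, "Satisfactory"),
        (40, "Sufficient"),
        (0, "Poor"),
    ]

    def rating_for(total_points):
        """Map a system's total score (0-80) to its rating band."""
        for minimum, label in RATING_BANDS:
            if total_points >= minimum:
                return label
        return "Poor"

    # Copyscape Premium's reported 70 points lands at the top of "Good",
    # two points shy of "Very good".
    print(rating_for(70))  # -> Good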

Also listed in the results were cancelled tests, meaning tests that could not be completed for some reason, and products that could not be easily compared against other plagiarism checkers, such as those designed to check medical databases.

Results

In the end, none of the systems came away with a “Very Good” rating. However, Copyscape Premium came very close, earning 70 points, just two shy of the 72 needed. Copyscape Premium earned the maximum number of points in 23 of the 31 tests.

Plagiarism Detector, another relative unknown, came in second, sandwiched between the two versions of Copyscape. Ephorus, last year’s top application, came in seventh this year, at the very bottom of the “Good” systems.

SafeAssign placed eighth, at the top of the “Satisfactory” systems, and Turnitin, at thirteenth, topped the “Sufficient” list.

However, several systems were not fully tested this year. The tests of Article Checker and CheckforPlagiarism were aborted, the former because it did not fit the criteria for a plagiarism checker and the latter because it did not allow free access for testing purposes.

Also, several commercial services were not slated for testing at all, including Attributor and iCopyright.

Caveats

Though the results of the test certainly give Copyscape’s developers reason to cheer, there are several caveats, beyond the selection of services tested, that are worth mentioning.

First, since the test papers were in German, it remains to be seen how well the results would translate to English and other languages. Though I wouldn’t anticipate too much difference, since the two languages are closely related, it could have a bearing on the results, especially with some of the more complex tools that may be language-specific.

Second, the usability tests were geared more toward teachers than Webmasters. The test is designed for the academic arena, so this makes sense, but for Webmasters it means that the usability results need to be looked at in a slightly different light, as our workflow needs are different.

Third, once again, it is important for Webmasters to realize that the results are geared toward academic plagiarism, where the goal is to trace one plagiarized work back to its source, not to trace one source to its many plagiarized copies. Though the search techniques are largely the same, many of the apps best geared toward finding academic plagiarism don’t always detect all of the potential sources.

Finally, it is important to remember that the study itself is not an endorsement of any one plagiarism checking system, nor is it a recommendation; it is only an analysis of how the various systems performed on these specific tests, nothing more.

Conclusions

Though the results are interesting, what I glean most from this comparison is that, even though the standards were lowered for this round of tests, none of the services were able to achieve “Very Good”. This means that even the best plagiarism detection tools have flaws and need improvement.

Personally, I’ve never relied on just one means of detecting plagiarism, instead pulling my information from multiple sources. This study seems to reinforce that idea and further encourages me to back up any system I choose with alternates.

The simple fact is that no plagiarism system is going to be 100% effective and, as such, it makes sense to double up your net, especially since so many of the services are both free and easy to use.
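
For what it’s worth, this kind of doubling up is easy to automate. The sketch below submits the same passage to more than one checking service and collects whatever each returns; the endpoints and parameter names (checker-a.example.com and so on) are placeholders invented for illustration, not any real service’s API.

    import urllib.parse
    import urllib.request

    # Placeholder services; swap in the real query URLs of whichever
    # checkers you actually use.
    CHECKERS = {
        "checker-a": "https://checker-a.example.com/search?q={query}",
        "checker-b": "https://checker-b.example.com/api?text={query}",
    }

    def check_everywhere(passage):
        """Submit one passage to every configured service and collect the raw replies."""
        results = {}
        query = urllib.parse.quote(passage)
        for name, template in CHECKERS.items():
            url = template.format(query=query)
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    results[name] = response.read().decode("utf-8")
            except OSError as error:  # DNS failure, timeout, HTTP error, etc.
                results[name] = "error: %s" % error
        return results

    for service, reply in check_everywhere("A distinctive sentence to trace.").items():
        print(service, "->", reply[:80])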
