iParadigms Accuses iPlagiarismCheck of Abuse
Update: See footer of article for iPlagiarismCheck’s response.
When Dr. Debora Weber-Wulff, a professor at the University of Applied Sciences in Berlin, set out to test and compare the top plagiarism detection tools available, she never expected to catch one of the programs in a lie.
However, it appears that is exactly what she did.
As she received her results, she noticed that the originality reports from iPlagiarismCheck closely mirrored those from iParadigms’ Turnitin product. When Weber-Wulff approached iThenticate with the similarities, they researched the matter and released a statement (PDF) accusing iPlagiarismCheck of using the Turnitin service illegally.
It is the turning point in a very strange story that has been unfolding behind the scenes for some time, and it has finally reached a point where I can share what happened.
The Story
A few weeks after my first post about iPlagiarismCheck, Weber-Wulff approached me to ask me what I knew about the company. By this point I had already noticed the plagiarized image on the iPlagiarismCheck Web site (See notes in original article), and had some concerns about the company. However, nothing was definite on my end and I certainly didn’t have enough evidence to say anything publicly.
The two of us swapped notes and she told me about her work. She was comparing the various plagiarism checking services by submitting a wide variety of papers known to contain plagiarized content to see how well they detected the copying. At this point, she had just completed her second round of testing and was working on her actual report. However, she had noticed that iPlagiarismCheck and Turnitin produced almost identical results when used on the same papers. Unnerved by this, she had approached me for more information.
To further test the similarities, we took a lengthy paper that I had submitted through iPlagiarismCheck and pushed it through Turnitin. The results were almost identical despite the complexity of the work. However, there were some differences in the order of the sources and the percentages listed; those differences could have been caused either by two similar algorithms pulling from the same sources or by one algorithm interpreting the same results slightly differently.
Though the results were worrisome, they weren’t definitive enough to draw conclusions, at least by themselves, and I was repeatedly reassured by iPlagiarismCheck that their system was separate and that the similarities were the result of the use of the same sources.
With little hard evidence and nothing else to go on, I simply stopped actively promoting iPlagiarismCheck and decided to wait and see what happened.
(Note: You can find much of this story documented in her “Strange History” update about iPlagiarismCheck. The link points to a translated copy; the original is in German.)
iParadigms Responds
Before releasing her results to the public, Weber-Wulff gave all of the companies involved a chance to respond to the findings. Yesterday, iThenticate submitted their response (English PDF), a four-page document mostly focused on issues they had with the actual results.
However, on the last page of the document, iParadigms addressed the similarities between Turnitin and iPlagiarismCheck by saying the following:
“In making a check of suspicious Turnitin activity, it was uncovered that two individuals by the names of Angela Nevarez and Susan Keisler have been using the Turnitin system in an unauthorized manner. These individuals seem to have found a Turnitin account id and join password that had been published on a university website for use by instructors at the university most likely through the use of a search engine.
The individuals used this information to create classes and assignments to which they submitted papers in order to generate Originality Reports. The individuals deleted their submissions shortly after each Originality Report was generated. Deletion of a paper from an assignment does not remove the paper from Turnitin’s student paper database and does leave a method of tracking the submission. The account administrator at the university has been contacted and has confirmed that the individuals in question were not authorized users of the account. The individuals’ false user accounts have been blocked from the Turnitin system and the institution’s account information has been changed to prevent further misuse.”
If this information is to be believed, then the reason for the similarities is that iPlagiarismCheck was using a Turnitin account without permission to produce its results. That, if true, would be a violation of several laws and could, in theory, subject iPlagiarismCheck to both civil and criminal penalties.
It also indicates that, if iParadigms’ changes were successful, iPlagiarismCheck is no longer functional and is unable to process works. I have not tested the service since this announcement was made. I have contacted iPlagiarismCheck at the address I had available but have not heard back from them as of this writing.
I will update this article should I hear anything.
Other Results
Though the revelations regarding iPlagiarismCheck are the most explosive element of the report, the other results bear some fruit as well.
The big surprise in the results is that Turnitin did not fare as well as many of the other tools. Though it was the biggest name and the most popular service tested, it fell behind other services, including Copyscape, which showed it can detect academic plagiarism very accurately, and several foreign companies.
All in all, it was the European services that ruled the day. Ephorus and Docoloc (free demo) took the top two slots, with Ephorus being the only service to get above an “acceptable” rating.
However, several popular services were not tested, including MyDropBox and SafeAssign, which are sister services, due either to problems with the service or to issues in establishing contact.
In the end, it is a very interesting study and one that is a worthy read. Google Translate seems to do a decent job with most of the pages, making them at least intelligible to an English speaker.
A Personal Note
The truth is that I am very upset about this incident. iPlagiarismCheck came highly recommended to me by good friends of mine. If these allegations turn out to be true, then I and many of the people I work with have been tricked and outright lied to by this company.
I have spoken on many occasions with the people at iPlagiarismCheck and have asked them point blank about this issue. They have repeatedly denied it. Though I have had suspicions about them since I noticed the plagiarized image, they were just that: suspicions. Up until yesterday, I never had any hard evidence to back those gut feelings and, since they provided a useful and necessary service, I didn’t feel it appropriate to completely stop promoting them.
The response from iParadigms is the first solid evidence that something is truly amiss at the company.
In that regard, my only regret in this situation is that I promoted a company that was, according to iParadigms, engaging in illegal behavior. I sincerely apologize to anyone who reads this site and may have used iPlagiarismCheck in part due to my reviews and recommendations. Those reviews were, apparently, misguided.
I am sorry.
However, my reviews of iPlagiarismCheck were based upon their ability to provide great results at an unbeatable price. That was certainly true; it just appears that they were able to provide all of that through unlicensed access to Turnitin’s service. I am going back and updating my previous reviews of the service to note this controversy.
On that note, anyone who is interested in another recommendation might want to take a look at getting an individual subscription at MyDropBox. Though it is effectively double the price of iPlagiarismCheck’s unlimited service, providing only six months instead of a year for roughly the same price, it has a very good reputation among academics. Though it was not tested in Weber-Wulff’s report, I have seen the results myself from other professors I know and they have proved to be at least comparable to other services.
Also, Copyscape performed very well in the study and is working to improve its detection of mass plagiarism. Its premium service, at five cents per search, may be the best bang for the buck for those who need to do quick checks and don’t have a large volume of works to inspect.
All in all, there are options and it looks like it is probably time to start seeking them out.
Conclusions
Personal feelings aside, this is still a very dark day for plagiarism detection. iPlagiarismCheck was filling an obvious need by providing a low-cost, easy-to-use, academic-style plagiarism detection service. By offering one-off submissions and reasonable rates for unlimited subscriptions, iPlagiarismCheck was expanding the reach of such services beyond schools, universities and major corporations and into the hands of bloggers and small Website owners.
As legally dubious as the service might have been, iPlagiarismCheck was filling a need, and it would be nice to see another company step forward and fill it again. The question, though, is whether it is possible to provide access to these services, including Internet caches and article databases, at a cost that makes them accessible to everyday Internet users.
Let’s hope that iPlagiarismCheck is just a speed bump on the way to a better tomorrow. With so many companies vying to enter this field, that looks very likely to be the case.
Update:
A few hours after this article was posted, I received a reply from Susan Keisler. I am going to post her entire response below, verbatim, without any commentary.
“Thank you for brining this to our attention, and I apologize for reverting a bit late. Needless to say after reading through TurnItIn’s accusation of us “Stealing” their service we have contacted them on a stern note to resolve the matter. I will let you know more on this as soon as I hear back from them.
As far as ‘HOW’ this came about, we think it’s as follows:
Last two months we had a string of fraudulent orders on our site, which were discovered, suffice it to say we dealt with the submitters’ sternly, instead of just letting it go (as fraud is unfortunately a part & parcel it seems of e-Commerce), simply because running our system, for plagiarism checking without even the compensation of the mediocre amount we charge translates into wasted resources and ultimately jacking up of the prices to cover costs, which we simply do not want to do. We want to remain people-oriented and within the average customers’ budget range.
The two names mentioned by TurnItIn, Susan Keisler and Angela Nevarez are both publicly published. For example, as you may be aware, my name, Susan Keisler, is the name used in all outgoing correspondence, while that of Angela, who registered our domain name, is listed in the WhoIs database, hence both names are easily accessible and subject to abuse. Which is the case here.
Dr. Debora, if you read the report, did her research rather arbitrarily, relying on sources which may or may not be credible, as even the TurnItIn admins point out. She did not give us a chance to relay our side of the story and directly approached TurnItIn, needless to see a one-sided, mis-conception was formed.
As soon as I hear back from TurnItIn and resolve the matter with them, I will update you promptly.
Sincerely,
-Susan keisler”
I will, of course, post any updates as they come.