In a recent blog post, Google announced that it was cracking down on spam and would be updating its algorithms to target not only spammers but also those who merely copy original content (such as plagiarists and other garbage sites), as well as content farms that pump out large amounts of poor-quality but original content.
According to Google, there has been a slight uptick in spam, caused by Google's increased emphasis on freshness and its improved spidering, but the company is working to address it and hopes to have it under control soon.
But while the announcement was generally greeted with open arms, I am much more skeptical.
Others may think me a cynic, especially when I say "I'll believe it when I see it" about the upcoming changes, but I think I have good reason for my skepticism.
Though I sincerely hope Google can make a major dent in this fight, and I would like to be wrong, history is not on Google's side, nor on ours as content creators. Sadly, as serious as Google is in its announcement, the changes aren't likely to alter things in any significant way, and there are many reasons why.
A History Lesson
This isn’t the first time Google’s made a push against spam, plagiarists and other sites that they don’t want in the index. Google has already said that these upcoming changes are extensions of changes made in 2010. This was also the year that saw Google suing spammers (though that involved advertising spam).
But it wasn't just 2010. In the five years I've watched Google deal with spam and plagiarism, there has been something of a cycle to it: every so often, Google makes a major push on this front, sees some progress, and then loses those gains over time, often quickly.
The cycle seems to have begun in earnest in 2006, shortly after Google's contest with Yahoo! over which had the larger index. That is when Google started talking more about duplicate content issues and began purging its database to remove spam, most notably with its "Big Daddy" update.
However, there have been countless other advancements and changes as well. In 2007, for example, Google put more weight on domain age, among other factors, in an effort to stop spam. That was also the year Google received a patent on a system to rank pages by phrase and updated its spam reporting procedures.
Then 2008 brought the Dewey update, which targeted spammy backlinks and prioritized new content. After that came 2009's Google Caffeine update, which, in addition to its other changes, also targeted spam.
I could go on with the tweaks and pushes, but I think I've made my point. I even found a change from 2005, the Jagger update, that was designed to emphasize authority sites and target spam linking (much like the 2008 update).
In short, the process of tweaking the algorithms to fight spam and plagiarism has been an ongoing one. The only thing that has really changed is that Google is now announcing the changes beforehand, rather than simply making them and leaving SEOs to try to decipher what actually happened.
Other Reasons to Be Skeptical
If the nature of the ongoing war between Google and spammers/plagiarists doesn’t make the point, there are other reasons to be wary of a major victory as Google rolls out its changes.
Consider the following:
- They Already Do a Good Job: If someone plagiarizes your content and your site is reasonably established, odds are that at least 99% of the time they either won't rank at all or won't rank well enough to compete with you. It's the remaining 1% or less that's the problem, and improving on that in a significant way will be very difficult.
- Spam is a Numbers Game: Spammers know the above well and focus on quantity, not quality. A better algorithm may simply prompt spammers to put out even more garbage to make up for what gets filtered.