For almost as long as there’s been a YouTube, there’s been spam on it.
Traditionally, this spam has taken the form of garbage accounts uploading misleading videos, often with fake thumbnails, to promote products, services or some cause.
But while that type of spam certainly still exists on YouTube, it’s now being joined by a new kind: automated videos that plagiarize content from blogs, news sites and other text sources.
For the spammer, this is a very easy way to flood YouTube with a large number of low-quality topical videos. For content creators, especially those who produce text or image content, the result is that their hard work is being used to fuel spam video blogs, and those spammers gain an upper hand in search results because of the preference Google’s algorithm shows to YouTube.
This raises two difficult questions: What can YouTube do to battle this problem? And what can creators do to protect their work?
Understanding YouTube’s Plagiarism Bots
While there are slight variations on the theme, the basic elements of the videos are the same:
- Plagiarized Image(s): The videos use either a still image, a small looping clip in the background or a series of images with an automated transition. In every case, the visuals are taken from another source without permission or attribution.
- Plagiarized Text: The “script” of the video is copied and pasted from a third-party site and then fed through an automatic text-to-speech engine that reads it in an awkward, robotic manner.
- Very Active Channels: The channels engaging in this behavior tend to be very active when they are posting, publishing at least several videos a day.
For an example of this, you only need to look at this YouTube video (nofollowed) and compare it to this article on the New York Post, looking in particular at the third and fourth paragraphs.
As you can see in the video, it is basically an automated voice doing a near word-for-word reading of the text article. If you then visit the channel this video was uploaded to (nofollowed), you’ll see that it has hundreds of videos all uploaded at around the same time, about three weeks ago.
The videos themselves are very representative of how these channels work. They are typically very short, usually between 30 seconds and two minutes, focus heavily on current events and copy the headline of the article they are “reading” from.
And there are countless channels like these. In just a few moments of searching, I found this one (nofollowed), this one (nofollowed) and this one (also nofollowed) and I can point to at least a dozen more.
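Taken together, these traits — burst uploads, very short runtimes, dozens of videos per day — are distinctive enough that a simple heuristic could flag a candidate channel for review. As a minimal sketch (not anything YouTube actually runs; the function name, thresholds and input format are all my own assumptions), suppose we already have a channel’s video durations and upload timestamps:

```python
from datetime import datetime, timedelta
from statistics import median

def looks_like_spam_channel(durations_sec, upload_times,
                            max_median_sec=120, burst_count=20,
                            burst_window=timedelta(hours=24)):
    """Illustrative heuristic: many very short videos in one tight burst.

    durations_sec -- length of each video in seconds
    upload_times  -- datetime of each upload
    Thresholds are guesses chosen to match the pattern described above.
    """
    if len(durations_sec) < burst_count:
        return False
    # Spam videos run roughly 30 seconds to two minutes.
    if median(durations_sec) > max_median_sec:
        return False
    # Look for any window of `burst_window` containing `burst_count` uploads.
    times = sorted(upload_times)
    for i in range(len(times) - burst_count + 1):
        if times[i + burst_count - 1] - times[i] <= burst_window:
            return True
    return False

# A channel with 30 ~45-second clips posted minutes apart trips the filter;
# a channel posting ten-minute videos once a week does not.
base = datetime(2015, 8, 1)
burst = [base + timedelta(minutes=5 * i) for i in range(30)]
```

A real system would obviously need far more signals (audio fingerprints of the synthetic voice, text matches against news feeds), but the point is that the upload pattern alone is already a strong tell.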
The question becomes: Why are spammers uploading these videos? The answer isn’t as straightforward as some may believe.
The Why of YouTube Spam
Author Geoleo at Snapzu noticed this issue seven months ago. He hypothesized that, even though most of the videos were very unpopular, generating at most a few dozen views, it was a direct spam effort because some of the videos do take off and get a lot of attention.
And there is some proof to that. After all, the first video I linked to I discovered on Reddit, where it was #5 on the home page. It has received over 180,000 views as of this writing. Some of the previous channels, including one he cites, have received a large number of direct views.
However, it’s unlikely that the aim is to directly profit from the videos, at least not in most cases. The reason is that most of the channels I examined had no ads before, on or around the videos. Furthermore, if these channels joined YouTube’s Partner Program, they would inevitably attract more attention and almost certainly be shut down.
Odds are, most of these channels are part of a much larger and much more complex spam operation.
On many sites on the Web, you can buy YouTube “subscribers” much like you can buy Facebook likes. However, as with the Facebook like scam, these services don’t simply promote your channel until it attracts legitimate subscribers; instead, they have an army of fake accounts that subscribe to you.
However, as the video above explains, those accounts don’t just like (or, in this case, subscribe to) the users who pay. That would make the scam obvious to Facebook and YouTube, so these accounts subscribe to a combination of legitimate accounts, accounts within their own network and a handful of accounts that likely paid.
Take a look at the Subscribed list for this channel (nofollowed) and you’ll see what I mean.
For these types of enterprises, it’s crucial that the accounts they use appear legitimate and stay active (at least to the automated triggers that watch them), and they do so by uploading lots of videos that are relevant to popular keywords.
This subscription network can also be used to boost non-spam channels. Even when no one is paying directly, the operators likely have other channels that they subscribe to and promote this way, channels that are more legitimate and are part of the Partner Program.
Either way, these spam channels are not the end goal but the means to an end, and it’s an end that content creators and YouTube both have an interest in stopping.
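One way a platform could surface such a ring, purely as an illustration, is to look for accounts whose subscription lists overlap far more than organic viewing habits would ever produce: bot farms that subscribe en masse to the same targets leave near-identical lists behind. A toy sketch using Jaccard similarity (all account names here are invented):

```python
def jaccard(a, b):
    """Overlap between two subscription sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def suspicious_pairs(subs_by_account, threshold=0.6):
    """Return pairs of accounts whose subscription lists nearly coincide.

    Real viewers rarely share most of their subscriptions; accounts in
    a paid-subscriber network, which all follow the same mix of decoys
    and paying customers, do.
    """
    accounts = sorted(subs_by_account)
    return [(x, y)
            for i, x in enumerate(accounts)
            for y in accounts[i + 1:]
            if jaccard(subs_by_account[x], subs_by_account[y]) >= threshold]
```

In practice this pairwise comparison wouldn’t scale to YouTube’s account base without clustering tricks, but it captures the core signal: the network’s own camouflage (subscribing to a shared pool of legitimate-looking channels) is what makes its members look alike.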
What Can Be Done?
For content creators, this type of spam poses a very unique challenge. While these videos are clearly garbage and their view counts reflect that (in most cases), Google and other search engines give a great deal of preference to YouTube videos, and that can cause them to rank better than the content they’re “reading” from.
Currently, most of the content is pulled from major news organizations so there’s little for the average blogger to worry about. However, it’s likely still worthwhile to check a few of the titles of your posts to see if they’ve been made into automated videos, especially if you write about current events.
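That check is easy to automate once you have a list of video titles returned for a search on your headline (for example via the YouTube Data API, not shown here). Since these channels typically copy the source headline verbatim or with only trivial changes, plain string similarity from Python’s standard difflib catches most cases. A minimal sketch with invented titles:

```python
from difflib import SequenceMatcher

def likely_copies(my_title, video_titles, threshold=0.85):
    """Return video titles that are near-duplicates of a post title.

    Comparison is case-insensitive; the 0.85 threshold is a guess that
    tolerates minor punctuation or casing changes while skipping
    unrelated titles.
    """
    my = my_title.lower().strip()
    return [t for t in video_titles
            if SequenceMatcher(None, my, t.lower().strip()).ratio() >= threshold]
```

Running your recent headlines through this against the top search results would quickly show whether a channel has been “reading” your work.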
YouTube, however, has a much more difficult challenge ahead of it. Just as Facebook created a “Like” economy, YouTube has created a subscription economy, and people are rushing to game it. In the end, this is likely just one strategy being applied.
YouTube needs to learn to spot these types of channels and shut them down proactively, not just because of the copyright infringement, but because of the flagrant spam.
While most of these videos aren’t getting a lot of views, they are still part of a much larger and more dangerous spam operation that YouTube should be tackling so as to avoid being outright overrun with junk content.
After all, these videos are being automatically generated and uploaded. They can produce far more content in a single day than any legitimate creator (at least one without an army of people).
Considering that I first saw one of these videos on Reddit and Geoleo noticed it on his video home page, it’s clear that the spam is starting to seep through. (Note: The video that Geoleo noticed seven months ago is still active.)
It’s only a matter of time before the dam breaks and this becomes a much bigger problem than it already is, for YouTube and other creators alike.
The problem is very clear: YouTube has a very serious spam problem and it’s already affecting legitimate content creators. It should have been the New York Post, not “Daniel Chan’s” YouTube video, that was featured on Reddit. It should be legitimate uploaded content featured in the video home page and it should be the original news sources featured in search engines.
YouTube, like Facebook, needs to crack down on this issue. But with YouTube, the stakes are much higher, especially for other content creators. While Facebook spam means that legitimate Facebook pages get almost no traffic, YouTube’s problem can be deeply harmful to creators who aren’t anywhere on YouTube’s platform.
YouTube and content creators have been coming to blows a great deal lately over payouts and contract issues, but this seems like a battle where creators and YouTube should be on the same side.
So if YouTube wants to prove it is a friend of the creator, this is a good place to start, both to keep its site from being overrun with spam and to keep legitimate creators from having to compete with automated garbage.