Guest Post: Bad Bots and the Threat They Pose to Digital Publishing
Note: Today’s column is a guest post from Distil Networks, a content delivery network (CDN) that also defends against bad bots, including scrapers and hacking attempts. Click here for previous Distil coverage.
Content theft has always been a threat to publishers, whether print-based or web-based, but digital publishers are far more vulnerable to it. Why? Because digital content is so easy to access: with a little copying and pasting, an article can be stolen, posted on another site and passed off as someone else's work in a matter of minutes.
However, there’s an even bigger threat at large in the online publishing world, one that’s even more prevalent than intentional content theft: bad bots.
Malicious bots and web scrapers threaten the landscape of digital publishing on a daily basis. They scour the Internet, looking for content they can copy, sell or pass off as their own. These crawlers can damage a publisher’s reputation, revenue, SEO rankings, and just about everything in between.
The Problem of Bad Bots
The obvious problem at hand is copyright infringement. When bad bots and scrapers steal a publisher’s proprietary content and pass it off as their own, they’re committing intellectual property theft. Unfortunately, unless you have a lawyer on staff, there’s really no way to police this effectively.
On top of the infringement itself, content theft causes a host of other problems for a publisher. First, it can bog down your site. Bots consume a substantial amount of bandwidth as they scrape your pages and harvest data, which slows page load times significantly. That frustrates your real human visitors (many of whom will abandon a site that takes more than 2-3 seconds to load), and it can also hurt your search engine rankings.
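To get a sense of how much of your traffic is automated, one first step a publisher can take on its own is to scan the web server access log for self-identified crawlers. The sketch below is illustrative only: it assumes a standard combined-format nginx log at a hypothetical path and uses an assumed keyword list, and sophisticated bad bots spoof browser user agents, so this only surfaces the obvious offenders.

```python
# Rough estimate of bot traffic share from a combined-format access log.
# Assumptions: the log path and keyword list are illustrative; real bad bots
# often spoof browser user agents, so this undercounts them.
import re

LOG_PATH = "/var/log/nginx/access.log"  # assumed path
BOT_KEYWORDS = ("bot", "crawler", "spider", "scrapy", "curl", "wget", "python-requests")

# Combined log format ends with: "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(r'"[^"]*" \d{3} (\d+|-) "[^"]*" "([^"]*)"')

total_requests = 0
bot_requests = 0
bot_bytes = 0

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        total_requests += 1
        size, user_agent = match.groups()
        if any(keyword in user_agent.lower() for keyword in BOT_KEYWORDS):
            bot_requests += 1
            if size.isdigit():
                bot_bytes += int(size)

if total_requests:
    share = 100.0 * bot_requests / total_requests
    print(f"{bot_requests} of {total_requests} requests ({share:.1f}%) "
          f"look automated, ~{bot_bytes / 1_048_576:.1f} MiB of bandwidth")
```

Even a rough count like this can show whether bots are a meaningful share of your bandwidth bill before you invest in anything more sophisticated.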
When content is stolen and posted elsewhere, it also creates what Google considers “duplicate content.” This is a serious offense in the eyes of search engines, and they will penalize a site for it. Your rankings will drop, and you may see a significant decrease in traffic because of it.
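If you suspect that a particular site has republished your work, a quick text comparison can help back up a takedown notice. As a rough, hypothetical sketch (the file names and the 50% threshold are assumptions, and search engines use far more sophisticated signals than this), you could measure how much of your article's wording appears verbatim on the other page:

```python
# Illustrative near-duplicate check between your article and a suspected copy,
# using word 5-gram "shingles" and Jaccard overlap. The threshold is an
# assumption; this is a heuristic, not how search engines score duplicates.
def shingles(text, n=5):
    words = text.lower().split()
    if len(words) < n:
        return {" ".join(words)} if words else set()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_overlap(original, suspect):
    a, b = shingles(original), shingles(suspect)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

original_article = open("my_article.txt").read()    # assumed local copies
suspected_copy = open("suspected_copy.txt").read()

overlap = jaccard_overlap(original_article, suspected_copy)
if overlap > 0.5:  # assumed threshold for "substantially duplicated"
    print(f"High overlap ({overlap:.0%}): likely a scraped copy")
else:
    print(f"Overlap {overlap:.0%}: probably not a direct copy")
```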
Bad bots can also cost digital publishers quite a bit of money. If you want to fight copyright infringement, you'll need to hire an attorney or keep one on retainer. You may also have to foot the bill for higher server and bandwidth costs, since bots and scrapers consume a significant amount of both. To top it all off, you'll see lower sales and revenue: much of your traffic isn't real, your reputation is sullied, your site doesn't perform as it should, and, most importantly, you've all but disappeared from search engine results.
How Bad Bots Affected One Digital Publisher
Let’s look at an example of just how serious the bad bot problem is. Distil Networks recently had a client called CanadaOne, an online publication and resource center for businesses and professionals in Canada. The site is home to thousands of articles, news items, press releases and business listings, as well as helpful information on franchising, innovating and running your own business.
Though CanadaOne had been around for more than a decade, the company was suffering. For years, they had fallen victim to web scrapers and content thieves, and they were spending tons of money trying to track down copyright offenders and bad bots. Unfortunately, when Google released the Panda update in 2011, things got even worse.
Some of CanadaOne's original content was effectively flagged as the stolen copy. This not only meant that other websites were receiving credit (and search engine rankings) for CanadaOne's proprietary content; it also caused Google to penalize CanadaOne, dropping its rankings significantly.
On top of all this, web scrapers were slowing down CanadaOne's website and accounting for a whopping 20% of all visits, both of which dragged down the site's SEO rankings. CanadaOne turned to Distil Networks for help, and we were able to use our bad bot technology to detect incoming bots, prevent attacks and protect their content from future theft.
Almost immediately, CanadaOne saw results. Those bad bot visits stopped completely, speeding up page load times and improving SEO rankings significantly. Unique visitor traffic increased by 48%, and the company was able to cut server requests and bandwidth usage by 70% and 42%, respectively.
CanadaOne is just one of many examples of how seriously bad bots can threaten digital publishers. Malicious crawlers can hurt web performance, drag down SEO rankings, steal proprietary content and drain a publisher's money and resources over time.
Fighting the Problem
When it comes to problems in the digital publishing world, bad bots are about as bad as they come. They can hurt your business on all levels, and they can take away a serious amount of cash in the process. Fortunately, there are ways to fight back. Companies like Distil Networks have the technology and resources to detect bots, prevent them from accessing your site, and ensure your content is never stolen or scraped again. To learn more, visit www.DistilNetworks.com.
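For publishers who want a stopgap while evaluating a dedicated service, a crude first line of defense can live in the application itself. The sketch below is not Distil's technology; it is a minimal, assumption-laden example for a Flask site that blocks obvious scraping clients and applies a simple per-IP rate limit (the blocklist and thresholds are illustrative, and determined bad bots will evade checks this naive).

```python
# Minimal do-it-yourself bot filtering for a Flask app: block obvious scraper
# user agents and apply a crude per-IP rate limit. The blocklist and thresholds
# are illustrative assumptions, not a substitute for real bot mitigation.
import time
from collections import defaultdict, deque

from flask import Flask, abort, request

app = Flask(__name__)

BLOCKED_AGENT_KEYWORDS = ("scrapy", "python-requests", "curl", "wget", "httpclient")
WINDOW_SECONDS = 60            # look at the last minute of requests per IP
MAX_REQUESTS_PER_WINDOW = 120  # assumed ceiling for a human reader

recent_requests = defaultdict(deque)  # ip -> timestamps of recent requests


@app.before_request
def filter_bad_bots():
    user_agent = request.headers.get("User-Agent", "").lower()
    if any(keyword in user_agent for keyword in BLOCKED_AGENT_KEYWORDS):
        abort(403)  # refuse obvious scraping clients outright

    ip = request.remote_addr or "unknown"
    now = time.time()
    timestamps = recent_requests[ip]
    timestamps.append(now)
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    if len(timestamps) > MAX_REQUESTS_PER_WINDOW:
        abort(429)  # too many requests from this address


@app.route("/")
def index():
    return "Article content here"
```

Checks like these rely on bots announcing themselves or hammering from a single address; a dedicated bot-mitigation service instead fingerprints clients and analyzes behavior, which is why the do-it-yourself approach is best treated as a stopgap.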
Want to Reuse or Republish this Content?
If you want to feature this article on your site, in your classroom or elsewhere, just let us know! We usually grant permission within 24 hours.