Six Apart/Rojo: Now Spam Bloggers?

– Article Updated – See Below –

Six Apart was one of the first rock stars of the blogging world. Propelled to fame on the back of its Movable Type blogging platform, it quickly became one of the most recognized names in the industry.

Though Movable Type has largely been replaced by newer blogging applications, including WordPress, Six Apart has remained very active in the blogging world, not only offering TypePad, a popular blogging service, but also purchasing several other blogging companies, including LiveJournal and Rojo.

However, some of these subsidiaries have begun engaging in practices that many bloggers consider unethical. One of the sites under Six Apart’s control even engages in behavior akin to Bitacle.

This has led some to wonder why Six Apart, a company largely respected in the blogging world, has begun to play fast and loose with RSS feeds and copyrighted content. Worse still, why have they begun using tactics largely reserved for spam bloggers?

Sadly, the answers are not very clear.

LiveJournal Syndication

The least worrisome of Six Apart’s scraping activities revolves around their LiveJournal service. There, paid members can take advantage of their “Syndication” feature. It allows users to select an RSS feed and LiveJournal then creates a specialized page for the feed. The feed can then be added as a “friend”, the same as if it were an actual LiveJournal member, and can appear in friend lists.

The Syndication feature is worrisome because it creates an “account” with duplicate content from the feed. The site displays the entire contents of the feed (see sample using Neil Gaiman’s Journal) and allows users to post comments without returning to the original site.
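To put in perspective just how little effort it takes to republish a feed in full, below is a minimal sketch, in Python, of the kind of full-feed scraping these services perform. It is purely illustrative and is not Six Apart’s actual code; the feed URL is hypothetical, and the sketch assumes the third-party feedparser library and a feed that includes full-text entries.

    import feedparser  # third-party RSS/Atom parsing library

    # Hypothetical feed address, used only for illustration.
    FEED_URL = "https://example.com/feed"

    feed = feedparser.parse(FEED_URL)

    for entry in feed.entries:
        title = entry.get("title", "Untitled")
        link = entry.get("link", "")
        # Full-text feeds expose the entire post body under "content";
        # excerpt-only feeds provide just a "summary".
        if "content" in entry:
            body = entry.content[0].value
        else:
            body = entry.get("summary", "")
        # A scraping service simply stores this HTML and redisplays it on
        # its own pages, which is what creates the duplicate copy.
        print(f"{title} ({link}): {len(body)} characters of post body")

A few lines of code like these are all it takes to turn a full-text feed into a complete, comment-ready duplicate of a blog on someone else’s domain.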

However, with the LiveJournal Syndication service, attribution is very clear and all syndicated accounts are on a separate subdomain (syndicated.livejournal.com). Also, the LiveJournal team has, historically, been very responsive about removing feeds whose owners do not want them scraped. Furthermore, results from the Syndication service do not appear in Google, eliminating most of the major concerns one has with scraping.

Still, many bloggers are likely to be concerned that a duplicate of their blog exists, that users can and do comment on it and that LiveJournal users no longer need to subscribe to the feed directly or visit their site.

Rojo Front Page

When Six Apart acquired RSS reader Rojo in September 2006, it also acquired some of Rojo’s bad habits.

Rojo’s home page functions almost exactly like a rapidly-updating spam blog. It features the full content of the most popular feed items of the day, all next to Google AdSense ads (see screenshot above). The site is then further subdivided into categories, including “Politics”, “Web 2.0” and so on. It is also possible to view the original feed on Rojo without visiting the original site (see PT’s feed on Rojo), and those feeds are also surrounded by ads.

Attribution on Rojo is prominent and the headlines do link back to the original story. However, a “Rojolink” feature encourages others to use the Rojo permalink for the article rather than link to the original site.

At the very least, Jason Calacanis will likely be upset by this. He has repeatedly stated that he will not allow his full feeds to be placed next to ads, something that Rojo does.

Though most people expect RSS readers to make money off of other people’s content, generally it is also expected that they will add value to the feed by making it easier for people to subscribe. Instead, Rojo has just created a valueless duplicate of the feeds, and surrounded the content with ads.

All The Nooz

The worst of all Six Apart’s properties, though, is the Rojo-owned site Nooz.com. Nooz is designed to function like a Digg for Myspace. Nooz users pick articles from the Web, vote on them and add them to special Nooz widgets that they place on their Myspace profiles.

The problem with Nooz, however, is not the widgets but the way the content is obtained. Rather than letting users select their own articles from the Web, as Digg or Reddit do, Nooz forces users to select from versions of blogs that it has scraped and reposted on its own site (see Plagiarism Today on Nooz). Once again, as with Rojo itself, Nooz offers “Noozlinks” to encourage people to link to Nooz’s scraped copy rather than the original.

Though no ads appear on Nooz at this time, Nooz.com is accessible to search engines; Google estimates that about 150,000 of its pages have already been indexed. Even worse, all of the contact addresses for Nooz, including the copyright agent’s, bounced back.

Nooz is not only scraping and reposting feeds without permission, but it is being irresponsible in doing so. There is no means to ask Nooz to stop reusing the content.

If you don’t like the way Nooz uses your content, quite frankly, you are out of luck at the moment.

A Murmured Outcry

Six Apart is no stranger to blogging; as discussed above, they helped ignite the blogging movement with their software. They are not unfamiliar with the etiquette of blogging and should realize, at least on some level, that some bloggers will not be happy to see their feeds scraped and republished on someone else’s site, all the while surrounded by ads.

The reasons Six Apart allows this to continue are dubious at best. Legal scholars have already agreed that there is no implied license with RSS feeds; this use, so long as it is carried out without permission, is basically copyright infringement. Unless a CC license or a direct agreement permits the use, what Six Apart is doing in all three cases is, most likely, illegal.

To my knowledge, no one has complained about these three uses. Why is something of a mystery, but the reasons may include the following:

  1. Very few people seem to be affected by the LiveJournal Syndication feature. Since only paid members can take advantage of it, severely limiting the pool, only very large blogs are scraped. Also, LiveJournal has been very cooperative in removing people that don’t want to participate. Furthermore, since the Syndicated blogs are not picked up by search engines, it’s unlikely most bloggers know that they exist.
  2. Few bloggers want to upset Rojo since many readers use the feed reader service to subscribe to blogs. Currently, about 5% of all Plagiarism Today subscribers use Rojo.
  3. Nooz seems to have flown under the radar. It is targeted mostly at Myspace users, generally a separate group from bloggers, and is still a relatively new creation (its current incarnation launched some time this year).

No matter the reasons though, these issues are not going away. RSS scraping and reuse issues will likely be around for a very long time, that is, until a licensing scheme emerges that resolves the issue once and for all.

Conclusions

What Six Apart is doing is wrong. Though I have no major issues with their use of my content, save perhaps on Rojo where the use is more commercial (and thus a violation of my Creative Commons License), Six Apart is taking content from thousands of blogs, without permission, and reposting it on various sites. That is copyright infringement and there is little way around that.

Some might argue that Six Apart’s scraping would qualify for protection under the DMCA’s section 512(b) safe harbor for caching services. However, as discussed earlier, that is not likely the case.

All of Six Apart’s sites modify the content and create permanent files, both violations of the caching provision. The copying also does not follow accepted practices (as there are no accepted practices for scraping and republishing RSS feeds) and it is not automated, seemingly relying at every step on users to submit the original feed.

It is unlikely, at best, that Six Apart would obtain the same kind of protection that was afforded the Google Cache, especially considering both the commercial nature of the use and the apparent intent of setting up the copy as a substitute for the original. The latter is shown by the new permalinks and location of cached material (placed before the link to the original).

Six Apart desperately needs to look at its policy for reusing others’ content. In that regard, it should look toward sites such as Digg and Reddit that have built great communities without infringing on copyright.

In short, there’s no reason for a social news site to scrape and repost content like Rojo and Nooz currently do. Links and snippets are perfectly adequate.

When it’s all said and done, Six Apart seems to have nothing to gain by scraping and reposting content as it does. Successful news sites have, for a very long time, worked well with content creators and there seems to be no reason for Six Apart to try and change that, especially in a way that is both legally dubious and likely to cause outrage.

Hopefully, Six Apart will reevaluate its policies soon and come up with a fairer approach for its sites. In the meantime, the company is treading on very thin legal ice and dealing with a very wary public.

Hat tip: Thanks to Cybele of Typetive for the heads up about Nooz.com

Note: During the course of writing this article, which started Thursday, I made several attempts to contact Six Apart by both email and phone. I was able to get in touch with Jane Anderson, Six Apart’s press contact. We scheduled a time for an interview on Monday but, when I called in, there was no answer. Subsequent attempts to contact Six Apart via both office phone and cell phone have produced no answer. I will update this article when and if I get further information from them.

Update: I’ve gotten back in touch with Jane Anderson; she is speaking with her counterparts at Six Apart and will be back in touch with me soon. They have scheduled a meeting for tomorrow to discuss these issues. I will report back after I hear from them.
