According to Calacanis, Usenet is a case study in what is currently happening to the Web. In his speech, he stated that Usenet, ten to fifteen years ago, was a very useful tool but has since become so inundated with spam that it is almost never used for serious communication.
The same problem, he says, is now happening to the Web itself, and it is occurring largely because platform providers are turning a blind eye to abuse of their services. He specifically points to spam and content theft on Squidoo, abuse of Blogger and the gaming of search engine results as examples.
According to Calacanis, in order to save the Web, the providers of these platforms are going to have to actively police their services to prevent abuse. He feels that it is not enough for a host to “hide behind” the DMCA and only remove works when notified; instead, hosts have to work to prevent abuse of their systems, even if it means making less money.
Calacanis goes on to offer his current venture, the human-powered search engine Mahalo, as an example of such a human layer and a spam-free service.
Though not specifically mentioned, it also seems logical that Calacanis would support the proactive steps of Revver, which manually approves videos before posting, and WordPress.com, which has remained relatively spam-free due to a combination of enforcement and user restrictions.
Overall, I agree with Calacanis’ point that there is a need for greater enforcement and a willingness to make less money in order to keep the Web clean of pollution. However, I do have a few minor criticisms of the presentation.
First, I think the example of Usenet is somewhat flawed. Yes, Usenet was hurt by the spammers that eventually took it over, but much of the downfall of the protocol was due to the Web itself. It shouldn’t come as a surprise that Usenet started tapering off about ten years ago, just as the Web was starting to gain traction.
Much of the functionality of Usenet was replaced by message boards, forums and classified ad sites. Since these sites could provide the same services in a way much more easily understood by the new crop of Internet users, they were the ones that reached critical mass.
However, the bigger problem I have is the notion that a human layer can resolve the problem. Human-edited services have been around for over a decade, the most famous being DMOZ. Though relatively spam-free, DMOZ, much like Wikipedia, introduced us to the backroom bickering and corruption that can infest such services.
In short, if a human-powered service is too centralized, you have to contend with power struggles and a general lack of transparency. If the service is too open, there is little to prevent the spammers from simply submitting the junk themselves, the same as they do now.
If a balance can be found, such a system can work, but there is little doubt that, historically at least, most human-powered engines have solved the spam issue at the expense of injecting other, more human, flaws into the service.
Overall, I like the thrust of the Calacanis speech and I find little to disagree with. It is nice to see an entrepreneur taking a stand against Web garbage and it is something that can benefit copyright holders, users and future business leaders alike. A healthy ecosystem reduces spam, content theft and makes for good business over the long haul.
Yes, it might mean companies making less money in the short run, and some companies might die, namely those whose business models have become so intertwined with spammers that they cannot be separated, but it is a small price to pay.
However, in the end, this speech will likely have little effect on the Web at large. The selfish interests of companies and spammers alike keep everyone focused on the short term and all but guarantee that we will be staying the course for the foreseeable future.
It is only after spam begins to hurt the immediate bottom line that we are going to see a real change in course. However, by that time, it might be too late.
Let us hope that isn’t the case.