Why Repealing Section 230 is Shortsighted
Section 230 of the Communications Decency Act (CDA) is probably one of the simplest and most direct laws governing the Internet.
It says, quite plainly:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
That distinction might seem trivial, but its effect is to shield a service, whether it’s Facebook, a forum or even the comments on this site, from liability for content posted by users. This includes content that is libelous, harassing, obscene or otherwise illegal.
While there are exceptions to the protection, such as for intellectual property law, it provides pretty broad legal protection for sites and services that host content uploaded by third parties.
The law has been in effect since 1996, two years before the Digital Millennium Copyright Act was passed, but it’s recently become the subject of controversy after Arthur Chu published an article on TechCrunch calling for it to be repealed.
According to Chu, the CDA has become a tool that enables online harassment and libel. Because the law gives sites blanket protection from liability for what their users post, he believes many sites have, either through malice or neglect, become havens for trolls, libelers and worse.
Since it’s impossible to target these sites, only the end users who post the material, the result, he argues, is an Internet where people can and do post horrible things, including doxxing, stalking and more.
But is repealing it the right answer? That’s a tough question, but it’s one that is clearly not black and white.
The Response
Chu’s post has been met with widespread criticism from a variety of sources, including Breitbart, Ken White (AKA Popehat) and others.
The gist of the counter-argument is that repealing Section 230 outright would make every website just as responsible for content posted on it as a magazine is for every article it runs. For example, I would have to take editorial responsibility for every comment posted on this site. Furthermore, sites would be forced to remove any and all content reported to them because it would be untenable not to. Negative reviews, even accurate ones, could be easily silenced.
Basically, repealing the CDA would treat Facebook and YouTube as if they were newspapers or TV stations, two things they are decidedly not. That creates serious problems, both pragmatic and ethical. Pragmatically, it would be impossible for sites like Facebook and YouTube to take editorial control over the content they host; ethically, it would hand a new weapon to the very trolls Chu hopes to defeat by making it easy for them to demand the removal of content.
For myself and the other critics, this makes the repeal of the CDA a non-starter. Not only would it fail to make the Internet safer from trolls and harassers, it would make anything even remotely resembling free speech on the web impossible.
Instead, there has to be a better way.
Adding Nuance to the Law
Depending on how you look at it, the best or the worst part of the law is its simplicity. It takes an “all or nothing” approach to the issue, making web publishers immune from liability for nearly all forms of third-party communications.
However, there is an interesting exception: Copyright law.
Host liability for copyright is governed by the Digital Millennium Copyright Act (DMCA), which states that a host is not liable so long as it works expeditiously to remove infringing material after proper notification. This is known as the notice-and-takedown system.
To be clear, this system is highly controversial, criticized both by copyright holders who say it doesn’t go far enough and leads to a game of Whack-A-Mole with infringements, and by others who say it’s frequently abused to remove unwanted but non-infringing content.
Still, Facebook, YouTube, etc. have all been able to survive and thrive under such a system. While I don’t necessarily think a notice-and-takedown system is right for other types of unlawful content, such as harassment and libel, it shows that the “safe harbor” doesn’t have to be all or nothing.
For example, it could take other formats:
- Inducement: We could treat it like the Grokster ruling and look at whether sites induce unlawful behavior. For example, a site dedicated to revenge porn would be treated differently than one that just happens to have some revenge porn uploaded to it.
- Policy Based: Under the DMCA, a site is also required to have a policy for dealing with repeat infringers. A similar set of rules could be applied to harassment, libel and the like, requiring sites to have and follow policies, thus weeding out the deliberately negligent.
- Notice-and-Takedown: Finally, though it would be fraught with problems, a notice-and-takedown system similar to the DMCA’s could be used, but one that provides protections against false notices and a process for restoring erroneously-removed work.
To be clear, I’m not condoning, supporting or advocating for any of these. I’m merely showing that there are steps in between no liability and complete liability, ways we can encourage sites to be better actors without completely stripping them of their protection.
Backpage and Peeple
The timing of all of this discussion is interesting because, in early September, the advertising site Backpage suffered a defeat in the Washington State Supreme Court that directly addresses its protections under the CDA.
The case deals with a series of Jane Doe plaintiffs who sued Backpage for its role in the trafficking of underage women. Backpage claimed protection under the CDA, saying that it was merely a neutral party. However, the court ruled that Backpage was less than neutral, finding that it intentionally developed its site for illegal purposes, wrote its content requirements accordingly and had a “substantial role” in creating the content.
As such, the court denied a motion to dismiss on CDA grounds, setting the matter up for a trial in the lower courts.
Still, the very intent of the CDA was to prevent these kinds of cases from getting past a motion to dismiss. The law was meant to be a quick way to end such litigation and, after this, that role is much less certain, as Eric Goldman pointed out in his teardown of the ruling.
The ruling also comes just as a new company, Peeple, is creating what it describes as a Yelp for people. Fears of abuse and ethical concerns surround the site, though most are confident that Section 230 will protect it and its operations.
The site has drawn broad condemnation, including from some of Section 230’s biggest supporters, who fear that sites like it will motivate legislators to weaken the law.
These stories are starting a conversation about just how far the protections of Section 230 can and should go. And while Chu most certainly takes things too far, it’s easy to understand why many are seeking reform.
Bottom Line
Are there horrible things on the Internet? Absolutely. Online harassment, swatting, doxxing, abuse, libel: the list goes on. Are people’s lives ruined by it? Absolutely. Should we be doing more to fight it? Once again, absolutely.
However, outright repealing Section 230 does more to cripple the Internet than it does to fix these issues. Facebook, Twitter, YouTube, Tumblr, etc. are not magazines or TV stations. The laws that we drafted for print cannot and should not apply to online companies.
If we are going to repeal the CDA, we need new laws to replace it, something Chu was expressly against. Trying to govern a web host as if it were a newspaper makes no sense.
To be clear, in many ways I would benefit from the repeal of Section 230. I could trivially remove the comments from PT, and my work fighting revenge pornography would get much easier: I could simply notify sites that the content violates state law, which it often does, and have another avenue for removal.
However, those gains would be far outweighed by what is lost. If we decide we need a legislative solution to these online problems, we need something more nuanced than a simple repeal. Fortunately, we’ve shown that such laws can function without killing the Internet we love; it’s just going to require more thought and care than simply hitting “delete” on a law that’s nearly 20 years old.
Special Thanks to Patrick O’Keefe and his write-up, which helped inspire this post.