Reddit’s Non-Stand Against Revenge Porn
Earlier this week, social news and networking site Reddit announced that it was changing its site’s policies and would begin removing “involuntary pornography”: nude and sexual images uploaded to the site without the subject’s permission.
On the surface, the policy sounds great and serves as a complete about-face from the September debacle over the celebrity nude photo leaks.
In that case, citing free speech and a hands-off administration policy, Reddit flatly refused to remove nude and sexual images of celebrities unless it was served with copyright notices, even though it knew the photos had been uploaded without permission. Reddit only removed the main community dedicated to the leaks after that community generated so many legal demands that handling them became burdensome to Reddit’s operations.
However, many are not convinced by the new policy, including my friends and allies at the Cyber Civil Rights Initiative, who told Motherboard that the policy, by itself, is almost meaningless without both cooperation from Imgur, the site’s predominant image host, and a policy of banning people who post such images.
Reddit did little to calm such fears when it answered questions about the policy change on BuzzFeed. Not only did it reiterate that there would be no penalty beyond post removal and that it would not be policing its site, it also deflected questions about how it would ensure images aren’t reuploaded and about consent in general.
For me, the biggest issue is staring Reddit right in the face, and it is declining to act. That issue is the communities that already exist on the site and have thrived under its policies.
A Culture of Involuntary Porn
Right now on Reddit, there are several communities that exist primarily to post nude and sexual images of people who did not consent to having them posted. Several were mentioned in an article on Vice, and I touched on one serious offender in my post about how photo sharing sites can prevent involuntary porn.
When a Reddit community, more commonly referred to as a subreddit, can be founded and operated with the express or primary purpose of distributing involuntary porn without fear of reprisal, simply removing individual images when notified accomplishes little, if anything.
In fact, given that Reddit is already forced to remove images of and links to copyright-infringing material, this policy change only helps the subset of victims who don’t hold the copyright to the posted images; those who do hold it could already compel removal with a copyright notice.
In short, almost no one will really benefit from this.
In the past, Reddit and others have argued that shutting down unwanted subreddits is nearly impossible because they routinely crop up under new names. As the article above notes, creepy subreddits that were closed have routinely reemerged, often thinly disguised as something else.
But as Patrick O’Keefe is fond of saying when asked why he bans users who can simply come back: just because they can doesn’t mean you don’t try. “Banning isn’t just about preventing someone from accessing a community. It’s about creating a consequence for people who demonstrate that they don’t want to treat the community with respect. That consequence is greater than just being blocked,” O’Keefe said.
Basically, banning a person, or in this case a subreddit, destroys the work and effort that had been put into it, forcing the members to start over. Some will eventually tire of rebuilding the sandcastle.
Applying this policy not just to individual photos, but to communities would be a much better start. However, Reddit has made it clear that it will not do that.
Bottom Line
In the end, Reddit is probably right: it doesn’t need to hire new staff and probably doesn’t need to do anything special to enforce this new policy. It’s a policy that only pays lip service to the serious involuntary porn problem the site has.
While I’m sure some people will be helped by the policy, they are limited to victims who both know their images are on Reddit and lack the copyright to those photos, a very small group.
An effective involuntary porn policy would do more than simply remove reported images. It would ban users who post them, repeat offenders in particular; shutter subreddits that specialize in involuntary porn; and work closely with third-party hosts, like Imgur, to ensure that the images are removed from all relevant places.
Reddit’s policy does none of these things. Instead, it tries to straddle the line between championing hands-off administration and championing the rights of involuntary porn victims, and it fails at both. It is a legally unnecessary intrusion into its users’ activity that still falls short of its key objective.
When March 10 comes around, revenge and involuntary porn victims will, at best, only be marginally safer on Reddit.
Now, to be clear, I don’t think Reddit has an obligation to vet and approve every piece of content posted; that would be unreasonable. But it certainly has the ability, and the responsibility, to do what it can to protect victims, and its current approach falls far short of that.