Breaking Down the New EU Copyright Bill: Article 13

Filtering the message on filtering.

With the controversy surrounding the new EU Copyright Directive, one article has received the lion’s share of attention: Article 13.

Whether it’s being called a “war on memes” or the “end of all that’s good and pure about the internet,” there’s been a great deal of focus on the article and what it may or may not do to the internet.

However, as we did last time with Article 11, we will take a dive into the actual language of the bill and attempt to interpret what it actually says. To do this, we’re going to look at the most recent version of the bill as of this writing, as provided by MEP Julia Reda, a staunch opponent of the directive, and try to determine both what the article says and who it will impact.

The answer, as with most things dealing with legislation, is complex. But one thing that is clear is that the narrative around the article is oversimplified, especially as the article has gone through so many changes since it was first proposed in 2016.

Note: For more information about the history of the new EU Copyright Directive and where things sit today, please see the previous article, which covers that and looks at Article 11, the “link tax” article. 

Who Article 13 Affects

When compared to the original proposal, Article 13 has undergone a major overhaul, to the point that almost none of the original text remains. One of the most important changes is the title of the article, which also declares who it governs.

In the original draft, the article was entitled:

“Use of protected content by information society service providers storing and giving access to large amounts of works and other subject-matter uploaded by their users.”

As we discussed in the previous article, “information society service providers” is a broad term that encompasses almost any online service from a blog to a major social network. This would have made it so that any site or service, once deemed sufficiently “large”, would be beholden to the article.

In the latest draft, the title of the article is changed to read:

“Use of protected content by online content sharing service providers”

The act also defines an “online content sharing service provider” as:

“A provider of an information society service one of the main purposes of which is to store and give access to the public to copyright protected works or other protected subject-matter uploaded by its users, which the service optimises.”

Though this new definition removes the need for a provider to be sufficiently large, it does provide other, crucial limitations.

For a service to be affected, it has to have the “main purpose” of giving access to copyright-protected works uploaded by users. This means that individual blogs don’t need to take action just because they have a comments section, but Facebook likely would.

Similarly, it requires that a service somehow optimize the content, which is another important distinction. Hosts like GoDaddy, where a user pays to host websites, don’t optimize or alter user content. As such, they would not likely be affected, at least not for their hosting service.

The definition then goes on to carve out a list of other exemptions, which include:

  • “Services acting in a non-commercial purpose capacity” (specifically citing online encyclopedias).
  • Providers where the content is uploaded with permission from all rightsholders (specifically citing “educational or scientific repositories”).
  • Providers of cloud services that don’t provide direct access to the public.
  • Online marketplaces where the main activity is selling physical goods.

The latter is especially interesting. Many would assume that an online storefront that actively sells copyrighted works would have a higher level of responsibility than a social network, but the definition says otherwise.

This is especially odd in the face of widespread complaints of copyright infringing photos and artwork being sold on Amazon, which Amazon is often slow to remove.

Regardless, the law makes it very clear who would be directly impacted by this: commercial providers whose “main purpose” (or one of their main purposes) is “to store and give access to the public” copyright-protected works that they somehow optimize or alter. There is no exemption for size in the current draft, so even startups could be taken to task.

But what exactly is that task? To determine that, we have to look at the act itself.

Obligations Under Article 13

The article begins by saying that such providers should, when possible, obtain licenses from rightsholders to cover the use of their content by their users. However, when such a license cannot be obtained (and this is where the article is at its most controversial), service providers are ordered to:

“Take, in cooperation with rightholders, appropriate and proportionate measures leading to the non-availability of copyright or related-right infringing works or other subject-matter on those services, while non-infringing works and other subject matter shall remain available.”

The article never uses the word “filter” or “filtering” anywhere in it. However, since the provider has to ensure the non-availability of infringing works, it is pretty clear that a filtering system would be required.

The article doesn’t delve into the technical details of how the system would work. Instead, it focuses more on transparency, saying that the provider must:

“Be transparent towards rightholders and shall inform rightholders of the measures employed, their implementation, as well as when relevant, shall periodically report on the use of the works and other subject-matter.”

Most importantly, the article puts in checks and balances to ensure that the system won’t be abused, requiring providers to:

“Put in place effective and expeditious complaints and redress mechanisms that are available to users in case of disputes over the application of the measures.”

The article goes on to state that complaints filed under those mechanisms “shall be processed without undue delay” and that rightsholders must justify their decisions.

The act also states that member states should facilitate cooperation between providers, rightsholders and users to find “best practices” for applying these tools.

The Likely Impact of Article 13

Unlike Article 11, the steps in Article 13 have never been tried by individual member states. This would be the first time such a proactive approach has been legislated anywhere.

However, even without legislation, we’ve already seen widespread use of filtering tools. Currently, YouTube, Facebook, Instagram, SoundCloud, Twitch and more all use some form of filtering technology.

Even on services without such tools, rightsholders often use automated systems to track infringement and send takedown notices, a practice that has already been sanctioned by the Ninth Circuit. In short, copyright enforcement is already largely in the hands of automated tools.

But, while Article 13 would not represent a genesis for filtering tools, it would represent an expansion.

Currently, filtering technology is used almost exclusively for audio and visual content. Article 13 doesn’t make any exceptions for the type of content, meaning we would likely see it applied to images and text as well, so long as the provider “optimized” the content in some way.

This is where the “war on memes” barb comes from. The theory is that, since many hosts will have to filter images, the photographers and artists who created the original meme images can simply have those images filtered out.

However, the artists behind memes have, in general, not expressed much interest in stopping the use of their work. Memes have not become the targets of mass copyright complaints and artists typically don’t avail themselves of the tools they have today. There’s little reason to think that many would do so should filtering become mandated.

In short, a “war on memes” is not a likely outcome of Article 13 simply because rightsholders aren’t looking for one. However, this doesn’t mean that images won’t be at the forefront of the expansion of filtering. Stock photo agencies, despite being generally permissive of non-commercial uses, have a long history of aggressively pursuing commercial use of their work. Exactly how they will respond to Article 13 is unclear.

The larger impact is the shift from filtering as a tool of expediency to an obligation.

Right now, YouTube, Facebook and the like use filtering technology out of expediency. If they wanted to, they could drop it tomorrow. While they’d undoubtedly be hit with a tidal wave of copyright notices, they could do it.

Article 13 not only mandates that filtering, but also sets standards for the transparency around it. That, in turn, would force YouTube to negotiate for a license the same way Spotify or Apple Music does. If YouTube doesn’t negotiate a license, there’s no fallback of letting users upload a song and then removing it after a notice. Instead, it either has a license or it doesn’t have the song.

MEP Axel Voss, who is Parliament’s rapporteur on the directive, has said that addressing the “value gap” is not the goal of the law. Rather, he says that many large companies earn a great deal of money off user uploads, including infringing ones, and that those companies need to take greater responsibility for fighting copyright infringement.

Regardless of intent, though, the “value gap” will be one of the things this law impacts if it passes. Changing the legal obligations will change the balance of power when negotiating licenses.

In short, Article 13 shifts the balance of power from providers to rightsholders on these issues, and that will have a profound impact on the internet. While it might not be as catchy as a “war on memes,” it’s definitely something that needs to be discussed.

Unfortunately, that’s not the conversation that most of the internet is having when it comes to Article 13.

Bottom Line

Much of the fear and concern about Article 13 has centered around the idea that the filters can be abused, that the technology isn’t perfect and that they will inevitably prevent at least some non-infringing speech.

All of that is true.

However, that argument misses an important detail: Filtering is already a fact of life and its use is only going to continue to grow, regardless of whether the directive is passed. The largest sites for social networking, sharing video and sharing audio all have filters in place. Once a site or service reaches a certain size, it’s the only practical way to handle copyright issues.

The issues around filtering aren’t a dystopian future if Article 13 becomes law, but a reality today. While Article 13 will certainly accelerate the spread of filtering, requiring it on content sharing services of all sizes and all types, that expansion is likely happening anyway, simply through the growth of the technology.

If it becomes law, the biggest change won’t be creating a filtering nightmare; it will be the way it shifts power away from online services and toward rightsholders. Though the law will certainly change who filters and what they filter, it’s the “why” question that is much more important.

In the end, a war on memes isn’t a likely outcome and it is equally unlikely you’ll see massive censorship on Facebook, Twitter, Reddit, etc. Unfortunately, the hype and rhetoric around the directive have made it nearly impossible to discuss the more probable changes it will bring.

This, in turn, is a problem we’ve seen before and will likely see again in the future.
