Most Web users are at least somewhat familiar with the DMCA safe harbor provisions and the notice and takedown system it provides. In short, hosts and other service providers are granted a “safe harbor” from copyright infringement claims so long as they are unaware of the infringement and work to “expeditiously” remove infringing content after receiving a proper DMCA notice.
However, there is a second, much lesser-known case in which hosts can be required to remove allegedly infringing material. Under § 512(c)(1)(A)(ii), the law requires hosts to remove content where they are “aware of facts or circumstances from which infringing activity is apparent.” These are often known as “red flag” infringements, and Ben Sheffner highlighted them in a blog post on the recent Veoh ruling.
They are an interesting element of the safe harbor protections that gets very little attention: a situation where a host can be forced to remove content without receiving a takedown notice. However, when one looks at the thin case law on this particular kind of takedown, it’s easy to see why you probably haven’t heard of it. It has never been used effectively.
However, perhaps most discouraging of all is that, even if it did become a regular part of our legal climate, it would do almost nothing to protect the vast majority of copyright holders.
Red Flags and Bulls
According to the DMCA (page 53) the test for whether an infringement raises a “red flag” or not is “whether infringing activity would have been apparent to a reasonable person operating under the same or similar circumstances.” In short, similar to the “ordinary observer” test when looking at whether a work is a derivative or not, the statute is both subjective and objective, leaving a lot of room for interpretation.
However, the interpretations have been almost universally in favor of hosts. The most important case for such takedowns, also cited by Sheffner, is Perfect 10 v. CCBill, in which the pornography company Perfect 10 sued CCBill for providing hosting and credit card services to companies and sites misusing its content. Perfect 10 argued that CCBill had ignored red flags, but the court didn’t agree.
This came despite the fact that some of the sites had names like “Illegal Dot Net” and “Stolen Celebrity Pictures”. The court found that these names did not necessarily raise red flags, as they could have been “an attempt to increase their salacious appeal, rather than an admission that the photographs are actually illegal or stolen.”
This has led many to wonder what exactly constitutes a “red flag”. In that regard, the Veoh case may offer some clues, as it references Veoh employees seeing likely infringing videos on the site and removing them without the need for a copyright notice. As Law Geek puts it when discussing the Perfect 10 case, providers must have concrete or actual knowledge of the infringement, something that is difficult to obtain and even more difficult to prove they had. That seems to be reinforced in the Veoh case.
Imagine trying to prove, based upon hard evidence, that a host had a reasonable level of knowledge about an infringement and failed to act. If sites with names containing words such as “illegal” and “stolen” are not enough to raise red flags, then there is almost no way to show what is, and that is a big part of why this particular element has never, to my knowledge, been successfully used.
However, this is not something to fret over. Even if it were useful, it’s hard to see how it would protect the majority of copyright holders.
The Problem with Red Flags
The issue with red flags is pretty simple. Most copyright holders aren’t recognizable enough to trigger a red flag for a host. Your average photographer, blogger, etc. doesn’t have enough notoriety to either prompt a layperson working for a host to remove the content or to sue the company for failing to act.
Where an episode of “Heroes” appearing on YouTube almost certainly would raise a red flag (not to mention be caught in the content filter) a recycled post from a videoblogger would not.
There are two reasons for this: the first is notoriety, the second is licensing.
Even with a popular site, the odds of one’s work being immediately recognized are fairly slim. And even if it is recognized, the host cannot judge whether the use is licensed. Where everyone knows where NBC/Universal stands on its works appearing on YouTube, it is much less clear where a layblogger stands.
The DMCA expressly states that hosts cannot be required to research infringements, gather new information or actively patrol their content to remain DMCA compliant (though some use content filtering to keep major content providers happy and reduce enforcement costs). As such, it is almost impossible to imagine a situation where an average blogger would be protected by a “red flag” case, unless their work was part of a larger site that was removed.
Unfortunately, this doesn’t mean that a user could simply report an infringement and be done. That would likely be viewed as an end-run around the actual notice and takedown system, which requires either the copyright holder or an authorized agent to file the notice. However, it is unclear whether such a report would be valid for a red flag takedown (I was unable to locate any cases) or whether one could file other abuse complaints (e.g., spam) to draw attention to such a site, essentially waving a red flag without filing an actual notice.
However, for most copyright holders, this would be a tremendous waste of time. It is much easier, faster and more direct to simply file a takedown notice than to try to trigger red flags through various other means. The only cases where this might be useful would be in protecting others’ rights, but there it would be just as easy to become an agent authorized to act on their behalf and help with the case through more traditional and reliable channels.
The good news is that hosts are routinely removing content from their servers that is infringing. The bad news is that it probably isn’t yours.
Even apart from the “Red Flag” rules of the DMCA, hosts have a lot of motivation to keep infringing material off of their servers, the biggest being that it does not support an atmosphere of creativity and originality if much of the content is just illegal copies of others’ work.
However, there is incentive even for hosts that don’t worry about their community: sites that infringe egregiously also tend to engage in other illegal behaviors, including spam (Web and email), distributing malware and so forth. These sites tend to be resource hogs and hurt a host’s reputation both with potential customers and with the search engines.
Legitimate hosts have tons of practical motivation to keep obvious infringements off their services. But, as I’ve said, that won’t help most copyright holders as the infringements of their work are, for the most part, less than obvious.
This serves to highlight why smaller copyright holders need to learn how to properly use the DMCA system to remove infringing content that the infringer themselves will not act on. It is, in many cases, the best and most powerful tool for dealing with plagiarists and spammers.
As flawed as it is and as challenging as it is to use at times, its usefulness makes it impossible to ignore.