BY JULIA REDA, MEP
Last spring, 200,000 Europeans took to the streets to protest against a new EU copyright law that risks restricting online culture and blocking vast numbers of legal online communications such as memes, reaction GIFs, video game reviews, or remixes. It is the latest clash between a generation that has grown up with the Internet as a means of cultural expression and a much older generation of lawmakers who prioritize the interests of entertainment companies over online culture. Although the protests were sparked by EU legislation, US academics and activists should be paying close attention.
Ever since the adoption of the General Data Protection Regulation, EU regulation of online platform companies has become a topic of global interest. European policy-makers are keener than their US counterparts to regulate the mostly American tech companies that have gained significant market power over the last two decades. And, for better or for worse, the European Union has become increasingly capable of setting global regulatory standards, through the inclusion of its internet legislation in trade agreements, or by making compliance with these rules a precondition for accessing the vast EU market of over 500 million consumers.
Copyright Maximalism Exported
The entertainment industry on both sides of the Atlantic has recognized these conditions and chosen the EU as the ideal venue for pushing through extensions of copyright law, by presenting these demands as instruments to curb platform power. This is how the new EU copyright directive’s most controversial provision, Article 17 (previously known as Article 13), came about. Despite widespread protests, the public interest in free cultural exchange online has largely been drowned out in what has been presented as a conflict between the entertainment and tech industries. European lawmakers sided with the entertainment sector amid an atmosphere of ever louder criticism of US tech companies’ business practices in Brussels. Google and Facebook in particular have grown so unpopular with EU lawmakers over legitimate data protection, antitrust, and taxation concerns that any proposal that promises to curb their power is likely to gain traction. There is widespread academic consensus that Article 17 fails to meet its objectives and poses a significant danger to freedom of speech online. Nevertheless, the provision was narrowly adopted by the European Parliament and the Council, and similar rules may soon be pushed in the US.
There is a tried-and-tested tradition of Hollywood companies lobbying for stricter copyright in Europe, only to turn around and demand that US policy-makers enact the same extensions in domestic law in order to “stay competitive”. This strategy was successful in the case of the Sonny Bono Copyright Term Extension Act of 1998, which extended copyright terms to 70 years after the death of the author, following a similar term extension in Europe in 1993. The purpose of the term extension, according to the Senate report, was to “provide significant trade benefits by substantially harmonizing U.S. copyright law to that of the European Union”. Of course, the EU would have been unlikely to extend its copyright terms to begin with, had it not been for the lobbying by American entertainment companies. It should therefore not come as a surprise that the US Copyright Office has repeatedly referred to the work on the new EU copyright directive as a reason for considering an overhaul of the US framework for copyright liability of online platforms, enshrined in Section 512 of the Digital Millennium Copyright Act.
New Platform Obligations
The liability rules for online platforms regarding copyright infringements by their users have been very similar in the US and the EU up until now. Both legal frameworks exempt online platforms from liability for copyright infringements of which they have no knowledge, and require them to immediately remove any copyright infringements that are brought to their attention. In the EU, this notice-and-takedown regime is going to change significantly in the summer of 2021, when the newly adopted Directive on Copyright in the Digital Single Market must be incorporated into all national copyright laws of EU member states.
Article 17 outright removes the liability limitations, the so-called safe harbor, for certain for-profit online platforms and makes them directly liable for copyright infringements by their users. These platforms are then offered a new safe harbor with much more onerous obligations. It is no longer enough to remove copyright infringements that are brought to their attention. In addition, they have to:
1. Make best efforts to get a license from the rightsholders;
2. Make best efforts to block uploads of copyrighted material that the rightsholders have registered with the platform; and
3. When they have received notice of a copyright infringement and removed it, make best efforts to ensure that the same copyrighted content can’t be infringed again.
Only platforms that fulfill all three criteria will be shielded from liability for the copyright infringements that they miss. Anybody who is familiar with copyright and online platforms will realize that mistakes will invariably happen.
Safe No More
It’s impossible for platform operators to predict which copyrighted works by which rightsholders will be uploaded to their platform. Since copyright arises at the moment that a work is created, nowadays every person with a smartphone is a copyright holder, and most people frequently upload their works to the web (mostly in the form of pictures). It is therefore absolutely impossible for platforms to contact all rightsholders in the world and ask them for a license, on the off chance that one of their works might be uploaded to the platform at some point in the future. The directive is also silent on whether platforms would be forced to accept unfair license conditions at any cost to their own business.
The real effects of Article 17 will depend on the interpretation of “best efforts”. What level of diligence in blocking content, identifying rightsholders and negotiating license agreements can be expected of platforms depends on many factors, including their size, profitability and the types of content they host. Article 17 exposes platform operators to a lot of legal uncertainty and it will take many years for courts to hash out what these rules will mean in practice. The outcome is likely to privilege large, commercial rightsholders like the entertainment industry giants over independent artists.
Filters Fail Frequently
The obligation to make best efforts to automatically block copyright infringement at the point of upload is even more controversial than the obligation to seek licenses and has drawn justified criticism as a censorship instrument. Although the directive says that legal uses of copyrighted content such as parodies, quotations or educational uses, which are allowed under European copyright law, must not be deleted in the process, no available technology can reliably tell the difference between a copyright infringement and a legal parody. Until artificial intelligence develops a sense of humor, that is not going to change, either. There are already scores of examples of upload filters such as YouTube’s Content ID or Facebook’s Rights Manager blocking perfectly legal content.
With the enforcement of Article 17, these types of mistakes are only going to become more common. Before mandating the use of such automated tools, policy-makers should draw lessons from the scientific evidence indicating significant chilling effects on free speech arising from the existing US notice-and-takedown system. Notice-and-takedown has become almost fully automated over the course of the last decade, with dedicated copyright enforcement companies automatically scraping the web for potential copyright infringements and sending thousands of automated takedown notices to online platforms every day. Confronted with huge numbers of notices, the platforms are more and more likely to automate the takedown process as well. It is obvious that a system where content is deleted without a human in the loop can easily be misused to make all kinds of unwelcome content silently disappear from the web.
An Evidence Base for Copyright Takedowns
The Lumen Project, a database of takedown notices run by the Berkman Klein Center, gives us an idea of the scope of the phenomenon. Lumen relies on platforms and rightsholders voluntarily sharing the takedown notices they send and receive, so the true numbers are significantly higher, yet Lumen alone has cataloged about 12 million takedown notices so far. In a recent statement to the European Commission, Lumen gave an overview of the research that has been done using the data on these mostly automated takedowns. Studies show that about ten percent of copyright takedowns likely target legal uses under exceptions and limitations to copyright or fair use, that companies frequently claim copyrights that they do not own, that copyright claims were often used to address completely different issues such as defamation or allegedly inaccurate information, that a significant number of libel-related takedown notices relied on falsified court orders, and that the chilling effects on legal speech arising from receiving takedown notices disproportionately affect women. These academic findings substantiate anecdotal evidence of automated filtering systems censoring legal speech by marginalized groups, small independent artists and human rights activists. The Electronic Frontier Foundation collects the most egregious examples of wrongful takedowns in its Takedown Hall of Shame.
When platforms are forced to use fully automated upload filters, the problems are likely to be worse than with the existing notice-and-takedown system, because there is no public accountability through databases such as Lumen. Making these filtering systems mandatory is also going to further entrench the powerful position of the largest tech companies, who are capable of building the most sophisticated proprietary filtering systems. The benefits of Article 17 for artists are questionable, because automated filters tend to privilege large rightsholders and limit the ability of authors to creatively build upon the works of others. The law’s main economic effect is instead likely to be a boost for the content filtering technology industry — hardly what the EU lawmakers had been hoping for.
A Law Full of Contradictions
The protests against Article 17, mostly driven by young people and organized online, were not in vain. In the final days of the negotiations, the European institutions hastily introduced some safeguards into the text aimed at limiting the negative impact of Article 17 on regular internet users, and published opinion pieces trying to convince the enraged public that their memes are not in danger. The exceptions for caricature, parody, and quotation, which were previously voluntary for EU countries to implement in their national law, were turned into mandatory users’ rights. Platforms were required to respect these exceptions and limitations and were barred from overriding them through their terms of service. The ban on general obligations for platforms to monitor all user activity, a mainstay of the old notice-and-takedown regime, was introduced into Article 17. And perhaps most importantly, the requirement was added that the new safe harbor must not lead to the deletion of legal content.
These safeguards, as important as they are, have only led to more confusion among the national lawmakers who are now tasked with implementing the new rules into their national copyright laws by the summer of 2021. How can platforms be forced to filter out infringements without monitoring all user uploads? That is like asking them to find a needle in a haystack without looking at all the hay. How are platforms supposed to ensure that legal content stays online while doing everything they can to delete infringing content? Wrongful copyright claims are commonplace, and legal uses are governed by a maze of national copyright exceptions that often differ from one European country to the next.
Where the obligations to protect rightsholders and those to protect users are in clear conflict, national legislators have to make a choice. France has already presented its proposal for national implementation of Article 17, which simply ignores all the mandatory user rights included in the directive. Such a radical approach is likely to violate European law, but until these questions are litigated, other countries may follow France’s example. And across the pond, entertainment companies are invariably going to present this extreme reading of the new rules as a blueprint for the future of US copyright enforcement, “to stay competitive with the EU”. By the time Article 17 is in force and its devastating effects on online freedom become more visible, it may be too late to stop its proliferation.