On December 22, 2020, Senator Thom Tillis released the first discussion draft of the Digital Copyright Act of 2021 (“DCA” or the “Act”), a proposed amendment to the Digital Millennium Copyright Act (“DMCA”). Senator Tillis also provided a one-page summary of the background of the Act and its significant revisions to the DMCA. It is important to note that the discussion draft is just that: a platform for discussion.

Senator Tillis’ significant revisions include a three-part change to the DMCA’s safe harbor provision. First, “lowering the specificity with which copyright owners must identify infringing material in certain circumstances.” This revision likely refers to the Act’s proposed Section 2(b)(3), which creates an option for online service providers to enter into private agreements with copyright holders establishing an “alternative noticing process” to supplement the existing notice system. Second, implementing a notice-and-stay-down system for “complete and near complete” copies or (when the online service provider primarily deals in short-form media) “any portion of a copyrighted work already identified” in a prior notice or list of unauthorized works. Third, allowing notice to be generated by automated rights management systems. The Act thus appears to legitimize automated rights management systems, such as YouTube’s Content ID system, as valid notice generators under the law, and requires that content previously taken down not be reuploaded.

Content ID is the product of over $100 million in development and plays an industry-leading role among automated rights management systems. It is, therefore, an appropriate exemplar of the current state of the automated rights management systems that would be permitted under the Act. The Content ID system works by scanning uploaded content for matches against a database of files provided by copyright holders. Once a match is established, the uploaded content is flagged with a Content ID claim, and the copyright holder may block the video, monetize it (sharing in its ad revenue), or track its viewership statistics. The content creator may then accept the copyright holder’s decision or dispute the claim. If the creator disputes the claim, the copyright holder may either release the claim or reject the dispute. If no resolution is reached at the end of this back-and-forth, the copyright holder may send a DMCA notice and the online service provider will take the content down.
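To make that flow concrete, the sketch below models the match-and-dispute exchange described above. It is a minimal, hypothetical illustration, not YouTube’s actual implementation: the names (Claim, scan_upload, resolve_claim, HolderPolicy) are invented for this example, and the matching step is reduced to an exact fingerprint comparison where real systems use fuzzy audio and video matching.

```python
# Hypothetical sketch of the Content ID claim-and-dispute flow described above.
# All names are illustrative; this is not YouTube's API.

from dataclasses import dataclass
from enum import Enum, auto


class HolderPolicy(Enum):
    BLOCK = auto()      # remove the video from view
    MONETIZE = auto()   # run ads and redirect revenue to the copyright holder
    TRACK = auto()      # leave the video up and collect viewership statistics


@dataclass
class Claim:
    video_id: str
    reference_id: str
    policy: HolderPolicy
    resolved: bool = False


def scan_upload(video_fingerprint: str, reference_db: dict[str, str]) -> str | None:
    """Return the matching reference ID if the upload matches a registered work."""
    for reference_id, fingerprint in reference_db.items():
        if fingerprint == video_fingerprint:  # real systems use fuzzy matching
            return reference_id
    return None


def resolve_claim(claim: Claim, creator_disputes: bool, holder_releases: bool) -> str:
    """Walk the dispute exchange: accept, release, or escalate to a DMCA notice."""
    if not creator_disputes:
        claim.resolved = True
        return f"creator accepts; holder policy {claim.policy.name} applies"
    if holder_releases:
        claim.resolved = True
        return "holder releases the claim; video stays up unencumbered"
    # No resolution: the holder may fall back on a formal DMCA notice,
    # and the online service provider takes the content down.
    return "dispute unresolved; DMCA notice sent and content taken down"


if __name__ == "__main__":
    reference_db = {"ref-001": "abc123"}         # fingerprints supplied by copyright holders
    match = scan_upload("abc123", reference_db)  # fingerprint of the uploaded video
    if match:
        claim = Claim("video-42", match, HolderPolicy.MONETIZE)
        print(resolve_claim(claim, creator_disputes=True, holder_releases=False))
```

The point of the sketch is the shape of the exchange: the claim sits between the parties and can be resolved several ways before any formal DMCA notice is ever generated.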

Just before Senator Tillis’ discussion draft was released, Katharine Trendacosta’s article commented on the unreliability of YouTube’s Content ID system, highlighting its inability to sort out copyright ownership when licensed media enters the database multiple times (resulting in multiple flags on the same content); its disregard for potential fair use; and anecdotal evidence of the opaque rules by which the system operates. Trendacosta’s article is just one example of criticism of the Content ID system. The YouTube channel littlescale uploaded a ten-hour video of white noise generated by the user’s own program, and it triggered five Content ID claims. A German professor of music theory received Content ID claims for public-domain recordings uploaded to his YouTube channels. The YouTube channel WatchMojo uploaded a three-part series criticizing the Content ID system and estimating that over $2 billion was unlawfully claimed via the system between 2014 and 2019 (while a 2018 report from Google found that Content ID had redirected over $3 billion in ad revenue to copyright holders in total). While a smattering of examples is not proof that the Content ID system is flawed, the persistent criticism suggests a need for further research into the accuracy of any automated rights management system before it serves as an “alternative noticing process.”

Beyond notice, the criticisms of automated rights management systems are equally present in the DCA’s “stay down” provision. “Stay down” is accomplished by adding a further requirement to the DMCA’s takedown approach: all content is filtered through a database of copyrighted works prior to upload and is prevented from being uploaded if a match is found. Unlike Content ID’s role in the notice system discussed above, the draft Act does not allow for an exchange between the copyright holder and the content creator; the uploaded content is either preemptively blocked or the content creator must challenge the “stay down” rejection.
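The contrast with the claim-and-dispute flow above is that a stay-down filter operates before publication and offers no intermediate claim to negotiate. The short sketch below illustrates that difference under the same assumptions as before; the function name and the exact-match comparison are hypothetical stand-ins.

```python
# Hypothetical sketch of a pre-upload "stay down" filter: content matching a
# previously noticed work is rejected before it ever goes live, leaving a formal
# challenge (rather than a claim negotiation) as the creator's only recourse.

def stay_down_filter(video_fingerprint: str, noticed_works: set[str]) -> str:
    if video_fingerprint in noticed_works:  # real systems would use fuzzy matching
        return "upload rejected: matches a work identified in a prior notice"
    return "upload published"


if __name__ == "__main__":
    noticed_works = {"abc123"}                        # fingerprints from prior notices
    print(stay_down_filter("abc123", noticed_works))  # rejected before publication
    print(stay_down_filter("xyz789", noticed_works))  # published
```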

The dichotomy of this block-or-challenge setup implicates another change to the DMCA: its provision conditioning an online service provider’s eligibility for the safe harbor on a policy for repeat offenders. Trendacosta’s article suggests the DMCA’s repeat offender provision is the threat underlying Content ID. Content creators often refrain from disputing Content ID claims because of the burden of navigating a complicated dispute process and the fear that their channels will be terminated under the DMCA’s repeat infringer policy if multiple disputes are lost. However, the DCA’s draft form places a greater burden on content creators by removing “repeat infringers” and replacing it with “persons that, on multiple occasions, were the subject of notifications . . . that were not successfully challenged,” creating a presumption of infringement whenever a notification is made.

The current Content ID system does not penalize content creators who are repeatedly flagged but accept the copyright holder’s decision, because no DMCA notice is generated. The DCA, however, would allow an automated rights management system’s claim to supplant the notice, skipping the dispute process in which a content creator may concede to monetization or tracking and, much like the “stay down” provision, leaving the content either blocked or subject to challenge. And because a notification is presumed infringing and may lead to termination of the content creator’s account under the revised repeat infringer provision, the content creator is now forced to challenge each claim.
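Read this way, the incentive shift can be illustrated with a short, hypothetical counting sketch. The three-notification threshold is an assumption made only for illustration (the draft does not fix a number); the point is that unchallenged notifications accrue toward termination, so even quietly accepting a claim carries a cost.

```python
# Hypothetical illustration of the shift in the repeat-offender standard:
# the draft DCA counts notifications that were not successfully challenged,
# rather than adjudicated acts of infringement, toward account termination.

TERMINATION_THRESHOLD = 3  # assumed value for illustration only


def account_terminated(notifications: list[dict]) -> bool:
    """A notification counts against the creator unless it was challenged and won."""
    unchallenged = sum(
        1 for n in notifications
        if not (n["challenged"] and n["challenge_succeeded"])
    )
    return unchallenged >= TERMINATION_THRESHOLD


if __name__ == "__main__":
    history = [
        {"challenged": False, "challenge_succeeded": False},  # creator accepted the claim
        {"challenged": True,  "challenge_succeeded": False},  # challenged and lost
        {"challenged": False, "challenge_succeeded": False},  # accepted again
    ]
    # Under this reading, even quietly accepting claims accrues toward termination,
    # which is why the draft effectively forces creators to challenge each one.
    print(account_terminated(history))  # True
```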

As of February 2020, more than 500 hours of video were uploaded to YouTube every minute, and, even in light of criticism over its effectiveness, the Content ID system is very likely essential to performing an initial sorting of potentially infringing material. This initial sorting function, however, is balanced by the ability of copyright holders to step in and allow content to remain available despite being flagged. The DCA, in its draft form, does not account for the inaccuracies of current automated rights management systems and removes a built-in safeguard by instituting a block-or-challenge regime.
