Notice and take down

Notice and take down is a process operated by online hosts in response to court orders or allegations that content is illegal. Content is removed by the host following notice. Notice and take down is widely operated in relation to copyright infringement, as well as for libel and other illegal content. In United States and European Union law, notice and takedown is mandated as part of limited liability, or safe harbour, provisions for online hosts (see the Digital Millennium Copyright Act 1998 and the Electronic Commerce Directive 2000). As a condition for limited liability, online hosts must expeditiously remove or disable access to content they host when they are notified of its alleged illegality.

United States

The Online Copyright Infringement Liability Limitation Act, passed into law in 1998 as part of the Digital Millennium Copyright Act, provides safe harbour protection to "online service providers" for "online storage" in section 512(c). Section 512(c) applies to online service providers that store copyright-infringing material. In addition to the two general requirements that online service providers accommodate standard technical measures and adopt a policy of terminating repeat infringers, section 512(c) also requires that the online service providers: 1) do not receive a financial benefit directly attributable to the infringing activity, 2) are not aware of the presence of infringing material or of any facts or circumstances that would make infringing material apparent, and 3) upon receiving notice from copyright owners or their agents, act expeditiously to remove the allegedly infringing material.

An online service provider can be notified through the copyright owner's written notification of claimed infringement. Section 512(c) lists a number of requirements the notification must comply with, including:

  • Identification of the copyrighted work claimed to have been infringed and information reasonably sufficient to permit the service provider to locate the material.
  • Information reasonably sufficient to permit the service provider to contact the complaining party, such as an address, telephone number, and email address.
  • A statement that the complaining party has a good-faith belief that use of the material in the manner complained of is not authorized by the copyright owner, its agent, or the law.
  • A statement that the information in the notification is accurate, and under penalty of perjury, that the complaining party is authorized to act on behalf of the owner of an exclusive right that is allegedly infringed.
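The required elements above amount to a completeness check on an incoming notice. The following is a minimal sketch of such a check; the class and field names are illustrative, not statutory terms.

```python
from dataclasses import dataclass

# Illustrative model of a section 512(c) takedown notice.
# Field names are hypothetical; they mirror the elements listed above.
@dataclass
class TakedownNotice:
    work_identified: str = ""          # the copyrighted work claimed infringed
    material_location: str = ""        # info sufficient to locate the material
    contact_info: str = ""             # address, phone number, email address
    good_faith_statement: bool = False # good-faith belief the use is unauthorized
    accuracy_and_authority: bool = False  # accuracy + authority, under penalty of perjury

def is_substantially_complete(notice: TakedownNotice) -> bool:
    """Return True only if every element listed above is present."""
    return all([
        notice.work_identified,
        notice.material_location,
        notice.contact_info,
        notice.good_faith_statement,
        notice.accuracy_and_authority,
    ])
```

A service provider receiving a notice that fails such a check may treat it as non-compliant with section 512(c)(3), though the statute also contains cure provisions not modeled here.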

Provided the notification complies with the requirements of Section 512, the online service provider must expeditiously remove or disable access to the allegedly infringing material, otherwise the provider loses its safe harbour and is exposed to possible liability.

The online service provider may additionally limit its liability for the removal of the material itself, as well as for restoring the removed material, by complying with a counter notification process. In this process, the service provider must promptly inform the subscriber of the removal of the content. If the subscriber then objects via a counter notification, the service provider must notify the party which filed the original notice. If that party does not notify the service provider that it has filed a lawsuit against the subscriber, the service provider must restore the material to its location on its network no less than 10 and no more than 14 business days after receiving the counter notification.

Like the original notification, the counter notification must include specific elements:

  • The subscriber's name, address, phone number and physical or electronic signature.
  • Identification of the material and its location before removal.
  • A statement under penalty of perjury that the material was removed by mistake or misidentification.
  • Subscriber consent to local federal court jurisdiction, or if overseas, to an appropriate judicial body.
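Under section 512(g), restoration after a valid counter notification happens no less than 10 and no more than 14 business days after its receipt, absent notice of a lawsuit. That window can be sketched as a small date calculation; the business-day logic here ignores public holidays for simplicity.

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance `start` by `days` business days.
    Weekends are skipped; holidays are ignored for simplicity."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday-Friday
            remaining -= 1
    return current

def restoration_window(counter_notice_received: date) -> tuple[date, date]:
    """Earliest and latest restoration dates under section 512(g):
    no less than 10 and no more than 14 business days after receipt
    of the counter notification, absent notice of a lawsuit."""
    return (add_business_days(counter_notice_received, 10),
            add_business_days(counter_notice_received, 14))
```

For a counter notification received on a Monday, the earliest restoration date falls two calendar weeks later, and the latest at the end of that week.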

Implementing a counter notification process is not a requirement for the safe harbour protections. A service provider may decline to restore the allegedly infringing material, or decline to notify the subscriber at all, limiting the recourse available to the subscriber.

If a court determines that the copyright owner misrepresented the claim of copyright infringement, the copyright owner becomes liable for any damages that resulted to the online service provider from the improper removal of the material. The online service provider is also required to respond appropriately to "repeat infringers", including by terminating their online accounts. On this basis online service providers may insert clauses into user service agreements which allow them to terminate or disable user accounts following repeat infringement of copyright. Some online service providers identify a "repeat infringer" through repeated notice and takedown requests, while others require a determination by a court.
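A provider that counts takedown requests against accounts might implement its repeat-infringer policy along the lines of the sketch below. The three-strike threshold is a hypothetical provider policy, not a statutory requirement; the DMCA leaves "repeat infringer" undefined.

```python
from collections import defaultdict

class RepeatInfringerPolicy:
    """Minimal strike-counting sketch of a repeat-infringer policy.
    The termination threshold is a hypothetical provider choice."""

    def __init__(self, strikes_to_terminate: int = 3):
        self.strikes_to_terminate = strikes_to_terminate
        self.strikes = defaultdict(int)
        self.terminated = set()

    def record_valid_takedown(self, account_id: str) -> bool:
        """Count a takedown against an account; return True if the
        account is (or becomes) terminated under the policy."""
        if account_id in self.terminated:
            return True
        self.strikes[account_id] += 1
        if self.strikes[account_id] >= self.strikes_to_terminate:
            self.terminated.add(account_id)
            return True
        return False
```

Providers requiring a court determination instead would increment the counter only on an adjudicated finding of infringement rather than on every notice received.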

European Union

The basis for notice and takedown procedures under EU law is article 14 of the Electronic Commerce Directive, adopted in 2000. Article 14 applies to content hosts in relation to all "illegal activity or information". Online hosts are not liable for illegal activity or information placed on their systems by a user, so long as they do not have "actual knowledge" of the activity or information. Upon obtaining such knowledge, the online host must act expeditiously to remove or disable access to the information. The Directive does not set out notice and takedown procedures, but it envisaged the development of such a process, because online hosts who fail to act expeditiously upon notification lose limited liability protection. The Directive suggests that voluntary agreements between trade bodies and consumer associations could specify notice and takedown processes, and that such initiatives should be encouraged by member states.

In most EU countries there are no explicit national rules regarding notice of infringement, the take-down process, or counter notice and put back (statutory rules exist in smaller countries such as Hungary and Finland). Where explicit rules do not exist (e.g. in Germany), some aspects of the notice requirements can be derived from general principles of law. This lack of explicit rules results in a lack of clarity and legal certainty compared to legal regimes with statutory rules (e.g. the United States).

In October 2013, the European Court of Human Rights ruled in the Delfi AS v. Estonia case that the Estonian news website Delfi was liable for defamatory user comments on an article. The court stated that the company "should have expected offensive posts, and exercised an extra degree of caution so as to avoid being held liable for damage to an individual’s reputation" and that its notice and take down comments moderation system was "insufficient for preventing harm being caused to third parties".

India

In India, takedown requests can be made under Section 69A of the Information Technology Act, 2000.

Criticism

Notice and takedown has been criticised for over-blocking and the take down of non-infringing content. In 2001 the Electronic Frontier Foundation launched a collaborative clearinghouse for notice and takedown requests, known as Chilling Effects. Researchers have used the clearinghouse to study the use of cease-and-desist demands, primarily looking at DMCA 512 takedown notices, but also non-DMCA copyright issues and trademark claims. A 2005 study of the DMCA notice and take down process by Jennifer Urban and Laura Quilter of the Samuelson Law, Technology and Public Policy Clinic concluded that "some notices are sent in order to accomplish the paradigmatic goal of 512 – the inexpensive takedown of clearly infringing hosted content or links to infringing web sites". However, on the basis of data on such notices, the study concluded that the DMCA notice and take down process "is commonly used for other purposes: to create leverage in a competitive marketplace, to protect rights not given by copyright (or perhaps any other law), and to stifle criticism, commentary and fair use". It would nonetheless be misleading to conclude that these problems do not arise under the E-Commerce Directive merely because it provides no statutory notice and take-down procedure: such chilling effects stem from provider liability as such, and so arise under that regime too.

In 2007 numerous US-based online service providers hosting user generated content implemented content recognition technology to screen uploaded content for possible copyright infringement. These content identification systems, such as the one operated by YouTube, fall outside the notice and takedown process mandated by the Digital Millennium Copyright Act. The Electronic Frontier Foundation, along with other civil society organisations, published principles on user generated content, calling for the protection of legitimate uses of copyright-protected works, prior notification of the uploader before removal or the placement of ads on the content, and use of the DMCA counter notice system, including reinstatement upon counter notice where the copyright owner fails to bring a lawsuit.

The Electronic Commerce Directive, unlike the Digital Millennium Copyright Act, did not define so-called notice and action procedures under article 14 of the Directive. Member states implemented diverging approaches on the duty to act expeditiously and on when an online host obtains "actual knowledge" of a notification. Inconsistent approaches also developed across the EU as to whether online service providers, such as search engines or social media networks, fall within the article 14 definition of an online host. As a result, notice and takedown procedures are fragmented across EU member states and online hosts face considerable legal uncertainty. The European Commission consulted on notice and action procedures under article 14 in 2010, and launched a new initiative in June 2012. The European Commission observed that "Online intermediaries face high compliance costs and legal uncertainty because they typically have operations across Europe, but the basic rules of Article 14 are interpreted in different ways by different national courts (sometimes even within the same member state)." As part of the initiative the European Commission intended to clarify which online service providers fall within the article 14 definition of online hosts, and to assess whether different categories of illegal content require different notice and action approaches. In 2013, however, the notice and action initiative appears to have come to a halt, for unclear reasons. One possibility is the avoidance of bad publicity, since notice and take down is associated with chilling effects on free speech as described above. Another may be the following problem: the European Commission has made clear that it does not want to amend the Electronic Commerce Directive, yet it seems impossible to provide legal certainty in the take down process without a binding legal underpinning.

Notice and stay down

The term notice and stay down refers to the concept of additionally requiring that a service, after it has received a request to take down a certain copyrighted work, must also prevent the same work from becoming available on the service again in the future. Proposals for such concepts typically prescribe the implementation of automatic content recognition, similar to YouTube's "Content ID" system, that would proactively filter identified works and prevent them from being re-uploaded. Proposals for notice and stay down rules have been made in the United States by pro-copyright lobbyists, and the concept forms the basis of Article 17 of the EU's Directive on Copyright in the Digital Single Market.
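The stay-down mechanism can be illustrated with a toy filter that fingerprints taken-down works and blocks identical re-uploads. Real systems such as YouTube's Content ID use perceptual fingerprints that survive re-encoding; the exact-match hashing below is only a simplified sketch of the idea.

```python
import hashlib

class StayDownFilter:
    """Toy sketch of 'notice and stay down': once a work is taken
    down, its fingerprint blocks identical future uploads.
    Exact SHA-256 matching stands in for the perceptual
    fingerprinting real content recognition systems use."""

    def __init__(self):
        self.blocked_fingerprints = set()

    def fingerprint(self, content: bytes) -> str:
        return hashlib.sha256(content).hexdigest()

    def take_down(self, content: bytes) -> None:
        """Process a takedown: remember the work's fingerprint."""
        self.blocked_fingerprints.add(self.fingerprint(content))

    def allow_upload(self, content: bytes) -> bool:
        """Reject any upload matching a previously taken-down work."""
        return self.fingerprint(content) not in self.blocked_fingerprints
```

The sketch also illustrates the criticism discussed below: a filter of this kind cannot tell a lawful use of a work (such as fair use) apart from an infringing one, since both match the same fingerprint.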

The concept of notice and stay down has faced criticism; it has been noted that the only way to reliably enforce such an obligation would be through automatic filtering, which is prone to false positives and unable to detect lawful uses of an affected work (such as fair use). The Electronic Frontier Foundation argued that requiring proactive monitoring of user content would place the burden of copyright enforcement on service providers (thus defeating the purpose of safe harbours), and would be too costly for newly established companies (thus bolstering incumbents and stifling innovation).

The implementation of Article 17 adopted by the German parliament includes safe harbour provisions intended to prevent false positives in situations "presumably authorised by law" (such as fair dealing rights), including that filters should not be applied automatically if an upload's use of copyrighted material is "minor" (defined as 160 characters of text, 125 kilobytes of image data, or video clips up to 15 seconds), in combination with other content, and using less than 50% of the original work. However, copyright holders may still oppose such use and issue takedowns, and providers must still provide "appropriate remuneration" to the copyright holder.
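The German implementation's quantitative "minor use" thresholds lend themselves to a direct check. The sketch below encodes the limits as described above; the function and parameter names are illustrative, and the statute's full test contains further conditions not modeled here.

```python
def presumably_minor_use(text_chars: int = 0,
                         image_bytes: int = 0,
                         video_seconds: float = 0.0,
                         fraction_of_original: float = 1.0,
                         combined_with_other_content: bool = False) -> bool:
    """Sketch of the German Article 17 implementation's 'minor use'
    thresholds: at most 160 characters of text, 125 KB of image
    data, or 15 seconds of video, in combination with other
    content, and using less than 50% of the original work.
    Parameter names are illustrative, not statutory terms."""
    within_media_limits = (text_chars <= 160 and
                          image_bytes <= 125 * 1024 and
                          video_seconds <= 15)
    return (combined_with_other_content and
            fraction_of_original < 0.5 and
            within_media_limits)
```

An upload passing this check would not be filtered automatically under the German rules, although, as noted above, copyright holders may still oppose the use and issue a takedown.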
