
Notice and Action Mechanisms in the DSA – Balancing the Removal of Illegal Content and the Freedom of Expression

What You Need to Know

  • Key takeaway #1

    Online platforms and other hosting providers will have to adapt the design of their services and their terms of service to allow users to report the presence of “illegal content.”

  • Key takeaway #2

    Users whose content is restricted can challenge the platform’s decision.

  • Key takeaway #3

    Platforms can take measures against the misuse of the notice and action mechanism.

Client Alert | 5 min read | 02.16.24

On February 17, 2024, the Digital Services Act (DSA) will become applicable, introducing a new regulatory framework for providers of intermediary services. The DSA will apply to providers offering their services to users located in the EU, regardless of the providers’ place of establishment. We discussed the new obligations in our previous client alert when the DSA was adopted. In this alert, we focus on the notice and action mechanisms and on the positions of users, intermediaries and the general public.

The limited liability principle is well known: a “hosting provider” is not liable for the information stored at the request of the user of the hosting service, as long as it does not have actual knowledge of illegal activity or illegal content, it is not aware of facts or circumstances from which such illegal activity or content is apparent and, if it does obtain such knowledge or awareness, it acts to remove or disable access to the illegal content (art. 6 DSA). Content is “illegal” when the information does not comply with EU law or national law, or when it relates to unlawful activities (art. 3(h) DSA). “Illegal content” may therefore cover a wide variety of information, such as ads for the sale of counterfeit products, racist statements, defamatory expressions or the unlawful disclosure of trade secrets.

This principle of limited liability was already well established in the E-commerce Directive, but the DSA imposes new obligations on intermediaries. Hosting providers (such as cloud storage solutions), including online platforms (such as social networks), must provide a “notice and action” mechanism allowing persons (predominantly their users) to report illegal content on the service (art. 16 DSA). Illegal content includes content that is strictly illegal (e.g. defamation, incitement to violence, child pornography, racist or xenophobic content), but the hosting provider may also remove content that violates the platform’s own terms and conditions.

The design of the service must be adapted so that the notice mechanism is easy for users to access and use. Specifically, the (electronic) notice form that the service provider makes available should require the user to explain, in sufficient detail, why they believe the content is “illegal.” The user should also provide the exact location of the content (e.g. by means of a URL), their name and email address, and a statement confirming that the notice is submitted in good faith.
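For illustration only, the elements above could translate into a notice form capturing data along the following lines. This is a minimal sketch based on the description in this alert; all type and field names are hypothetical and do not come from the DSA itself.

```typescript
// Hypothetical shape of a notice submitted through an art. 16 DSA notice form (illustrative only).
interface IllegalContentNotice {
  explanation: string;            // why the notifier considers the content illegal, in sufficient detail
  contentLocations: string[];     // exact location(s) of the content, e.g. one or more URLs
  notifierName: string;           // name of the individual or entity submitting the notice
  notifierEmail: string;          // contact address of the notifier
  goodFaithConfirmation: boolean; // confirmation that the notice is submitted in good faith
  submittedAt: Date;              // timestamp recorded by the platform
}

// Example submission as the platform's notice form might capture it.
const notice: IllegalContentNotice = {
  explanation: "This listing offers counterfeit goods bearing our registered trademark.",
  contentLocations: ["https://example-platform.eu/listing/12345"],
  notifierName: "Jane Doe",
  notifierEmail: "jane.doe@example.com",
  goodFaithConfirmation: true,
  submittedAt: new Date(),
};
```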

The hosting provider must then decide whether the content is illegal and, where applicable, which restrictions it will impose (i.e. restrictions on the visibility of the content, restrictions on monetary payments, restrictions on the provision of the hosting service, or suspension or termination of the user’s account). The same restrictions may be imposed for non-compliance with the terms of service.

The transparency reports of the Very Large Online Platforms (“VLOPs”, i.e. platforms with at least 45 million average monthly active users in the EU) show some interesting data on the notice and action mechanisms. Meta typically responds to a notification for Facebook within 27.7 hours (median figure). Most notices relate to infringements of intellectual property rights (roughly 30% of which lead to a measure), but Meta also receives notices of defamation (15% of which result in a restriction) and of privacy violations (19% of which result in a restriction). The Slovak Council for Media Services’ briefing on the DSA transparency reports and the European Commission’s dashboard show that VLOPs tend to limit the visibility of content rather than remove it entirely, and that they mainly target hate speech, nudity and sexual content, and violence and incitement.

These restrictions have a considerable impact on the freedom of expression and information of the user whose content is affected, as well as of the general public. The hosting provider must therefore notify the user of its decision by providing a statement of reasons that includes certain information (e.g. the type of restriction and its territorial scope, the factual and circumstantial motivation, and the legal grounds).
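Purely as an illustration, the statement of reasons (and the types of restrictions mentioned above) could be represented as a structured record along the following lines. This is a sketch based on the elements listed in this alert; the field names are hypothetical and the authoritative list of required elements is set out in art. 17 DSA.

```typescript
// Hypothetical structure for a statement of reasons sent to the affected user (illustrative only).
type RestrictionType =
  | "visibility_restriction" // removal, disabling of access or demotion of the content
  | "monetary_restriction"   // restriction of monetary payments
  | "service_restriction"    // restriction of the provision of the hosting service
  | "account_restriction";   // suspension or termination of the user's account

interface StatementOfReasons {
  restriction: RestrictionType;
  territorialScope: string[];    // e.g. ["EU"] or a list of Member States
  factsAndCircumstances: string; // factual and circumstantial motivation for the decision
  legalGround?: string;          // legal provision relied on, where the content is considered illegal
  termsOfServiceGround?: string; // contractual clause relied on, where applicable
  redressOptions: string[];      // e.g. internal complaint handling, out-of-court settlement, court action
}
```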

Notably, if the hosting provider fails to take appropriate measures upon receiving a notice from which it is apparent, without a detailed legal examination, that the content is illegal, it may lose the benefit of the limitation of liability (rec. 53 DSA).

The power of the service providers must not be underestimated: they can unilaterally restrict the expression of their users. Platforms that are used to share information with other users, in particular, undoubtedly have a significant impact on the freedom of expression and information. The DSA therefore grants the user the right to challenge the platform provider’s decision, using the internal complaint-handling mechanism or out-of-court dispute settlement (art. 20-21 DSA).

The affected user has at least six months after becoming aware of the restriction to file a complaint through the platform’s internal complaint-handling system. The platform must then handle the complaint in a “timely, non-discriminatory, diligent and non-arbitrary manner” (the timeline depending on the complexity of the complaint) and may reverse or revoke its decision based on the arguments presented. Based on the reporting of some VLOPs, the Slovak Council for Media Services found that, on average, VLOPs decide on a complaint within 24 hours. Facebook reported that it reinstates about 30% of removed content when the user objects.

The user may also challenge the platform’s decision before a certified out-of-court dispute resolution body. The settlement decisions are not binding, and users may challenge them in court.

Finally, platform providers can sanction users who abuse the notice and action mechanisms: they must suspend users who “frequently” provide “manifestly illegal content” and users who “frequently” submit “manifestly unfounded” notices or complaints (art. 23 DSA).

The DSA attempts to strike a balance between the efficient removal of illegal content and the freedom of expression and information of platform users and the general public. It remains to be seen, however, how platforms will deal, in a year with many important elections worldwide, with disinformation and AI-generated deepfakes that may be deceptive and harmful but not (prior to the adoption of the AI Act) blatantly “illegal”. Much will depend on the robustness of the various transparency instruments, the accessibility and effectiveness of the redress mechanisms, and the ability of platforms to adapt their decision-making processes to achieve a fair balance between the rights and interests of the various stakeholders.
