Article 9

Orders to act against illegal content

1. National judicial or administrative authorities shall have the power to order providers of intermediary services to act against one or more specific items of illegal content or to provide specific information about one or more specific individual recipients of the service.

2. Orders shall:

(a) refer to the specific illegal content item(s) by providing the exact URL and, where necessary, additional information;

(b) include a statement of reasons explaining why the content is illegal;

(c) where applicable, include information about redress available to both the provider and the affected recipient;

(d) be sent to the electronic point of contact designated pursuant to Article 11 or, in the absence of such designation, by other means.

3. Orders shall be clear, precise and duly reasoned. Where an order is addressed to providers offering services in several Member States, the issuing authority shall transmit a copy to Digital Services Coordinators in all relevant Member States.

Understanding This Article

Article 9 of the Digital Services Act establishes the legal framework enabling national judicial and administrative authorities to issue binding orders requiring intermediary service providers to act against specific illegal content or provide information about specific service recipients. This provision creates a critical enforcement mechanism: where voluntary cooperation fails, or where a legal determination of illegality has been made, authorities possess effective tools to require provider action regardless of the liability exemptions under Articles 4, 5, and 6.

The constitutional and procedural framework of Article 9 carefully balances enforcement effectiveness with fundamental rights protections. Orders must meet rigorous requirements: they must identify specific illegal content with precision (exact URLs and necessary additional information), include comprehensive statements of reasons explaining why content is illegal under applicable law, provide information about available redress mechanisms for both providers and affected users, and be transmitted through designated points of contact (Article 11) or by other means where no contact has been designated. These requirements ensure orders are transparent, legally justified, and subject to challenge, preventing arbitrary or overly broad censorship.
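The checklist above can be made concrete as a simple validation sketch. This is purely illustrative: the `Article9Order` type, its field names, and the check logic are hypothetical, not part of any official schema; the sketch only mirrors the four requirements listed in Article 9(2) as reproduced in this article.

```python
import hashlib  # unused here; kept out -- see below
from dataclasses import dataclass

# Hypothetical representation of an Article 9 order. Field names are
# illustrative only; they track points (a)-(d) of Article 9(2).
@dataclass
class Article9Order:
    content_urls: list          # (a) exact URL(s) of the specific item(s)
    statement_of_reasons: str   # (b) why the content is illegal
    redress_information: str    # (c) redress for provider and recipient
    contact_point: str          # (d) Article 11 electronic contact point

def missing_requirements(order: Article9Order) -> list:
    """Return the Article 9(2) points the order fails to satisfy."""
    problems = []
    if not order.content_urls or not all(
            u.startswith(("http://", "https://")) for u in order.content_urls):
        problems.append("(a) specific item(s) not identified by exact URL")
    if not order.statement_of_reasons.strip():
        problems.append("(b) no statement of reasons")
    if not order.redress_information.strip():
        problems.append("(c) no redress information")
    if not order.contact_point.strip():
        problems.append("(d) no transmission channel specified")
    return problems
```

A provider-side intake system could run such a check before acting, flagging orders that omit a statement of reasons or identify content only by description rather than URL.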

The specificity requirement serves multiple functions. First, it operationalizes the Article 8 prohibition on general monitoring by ensuring orders target identified illegal content rather than requiring platforms to scan for potential illegality. An order stating 'remove all hate speech' would violate Article 9's specificity requirement; an order identifying specific posts by URL with legal explanations is compliant. Second, specificity enables providers to assess legality and compliance obligations. Third, it facilitates affected users' ability to understand and challenge removal of their content. Fourth, it creates accountability by requiring authorities to do the work of identifying illegal content rather than outsourcing that determination to private platforms.

Article 9 applies to all categories of intermediary services, reflecting that even services benefiting from liability exemptions must comply with lawful orders from competent authorities. A hosting provider with Article 6 protections cannot refuse a valid court order by claiming the exemption shields it from action requirements. The exemptions protect against liability for user content absent knowledge or awareness; they don't protect against compliance with specific judicial or administrative determinations of illegality. This distinction preserves the liability framework while ensuring legal orders can be enforced.

The cross-border dimension adds complexity and importance. Recital 29 emphasizes that when an order is addressed to providers offering services in several Member States, a copy must reach the Digital Services Coordinators in all relevant Member States. This notification ensures coordinated oversight, prevents conflicting orders, enables affected Member States to monitor compliance with fundamental rights and EU law principles, and creates transparency in cross-border enforcement. For example, if an Irish court issues an order affecting content on a platform serving all EU Member States, Ireland's DSC must notify the other Member States' DSCs, enabling them to assess whether the order respects freedom of expression, proportionality, and EU law requirements applicable in their jurisdictions.

The statement of reasons requirement is particularly significant. Authorities must explain not just that content is illegal, but why it violates specific legal provisions. This forces legal analysis, enables judicial review, allows providers to assess legality (important if providers consider contesting orders), and protects against pretextual or politically motivated orders disguised as legal enforcement. For instance, an order must cite specific defamation law provisions, explain how content meets legal elements of defamation, and demonstrate why removal is proportionate rather than simply asserting 'this is illegal defamation, remove it.'

Redress information requirements ensure procedural justice. Both providers and affected content creators must be informed of how to challenge orders: which courts have jurisdiction, the applicable appeal procedures, and the time limits for challenges. This enables meaningful judicial review and protects against irreversible rights violations. If a platform believes an order violates EU law, or a user believes their content was wrongfully ordered removed, both must have practical means to seek legal remedies.

The relationship between Articles 9 and 10 is complementary. Article 9 addresses orders to act against content (removal, disabling access); Article 10 addresses orders to provide information about users. Both establish frameworks for authority orders but with different scope and safeguards reflecting the different nature of the required actions. Together, they create a comprehensive legal architecture for authority-provider interaction in content governance.

Key Points

  • National judicial and administrative authorities have power to order providers to act against specific illegal content
  • Orders must identify exact content through precise URLs and additional information necessary for identification
  • Must include comprehensive statement of reasons explaining why content violates applicable law
  • Must provide information about redress mechanisms available to providers and affected users
  • Orders must be sent to the point of contact designated under Article 11, or by other means where none has been designated
  • Cross-border orders require notification to Digital Services Coordinators in all relevant Member States (Recital 29)
  • Applies to all intermediary service categories regardless of liability exemption status
  • Specificity requirement operationalizes Article 8's prohibition on general monitoring
  • Statement of reasons enables judicial review, provider assessment, and protection against arbitrary orders
  • Providers can challenge orders believed to violate EU law, proportionality, or fundamental rights
  • Affected content creators have rights to information and legal remedies against removal orders
  • Framework adaptable to different content urgency levels (immediate CSAM removal vs. standard defamation procedures)

Practical Application

For Defamation Cases: A German court determines a Facebook post contains illegal defamation under Section 186 of the German Criminal Code (StGB). The court issues an Article 9 order requiring Facebook to remove the post. The order must include: (1) the exact URL of the defamatory post; (2) citation of StGB § 186 and an explanation of how the post's content meets the legal definition of defamation; (3) findings that the post is not protected by opinion privileges or public-figure defamation standards; (4) information that Facebook can appeal to [specified appellate court] within [specified timeframe]; (5) information that the affected user can challenge the order through [specified procedures]. Facebook must comply with this valid order regardless of Article 6 hosting protections. If Facebook believes the order violates EU law (e.g., disproportionately restricts freedom of expression), it can appeal, though it is typically required to comply pending appeal.

For Cross-Border Order Coordination: A French authority issues an order requiring YouTube to remove videos containing illegal Holocaust denial under French Penal Code provisions. Because YouTube operates across the EU, France's Digital Services Coordinator must transmit copies of the order to DSCs in all other Member States. The German DSC receives notification and reviews the order. Germany criminalizes Holocaust denial under Section 130 StGB, though the offence elements differ from the French provisions. The German DSC assesses whether the order's territorial scope (EU-wide removal) complies with fundamental rights principles, given that content legal in some Member States might be removed. This coordination enables oversight of cross-border orders' compliance with EU law principles.

For Illegal Drug Sales: Belgian authorities identify specific Instagram posts advertising illegal drug sales with delivery in Brussels. Police obtain a court order requiring Instagram to remove the specific posts (identified by post IDs/URLs). The order includes: (1) precise identification of each illegal post; (2) citation of Belgian drug laws violated; (3) evidence showing the posts constitute drug trafficking advertisement; (4) redress information. Instagram receives the order at its Article 11 point of contact, must comply, and should inform the posting user of the removal and appeal rights. This is straightforward Article 9 application: specific illegal content, legal determination, precise identification, procedural safeguards.

For CSAM Removal Orders: Child Sexual Abuse Material creates urgent removal requirements. When authorities identify specific CSAM on a hosting platform, they can issue immediate Article 9 orders requiring removal. For example, Irish police identify CSAM images on a file-sharing service operating in Ireland. They issue an order requiring immediate removal, providing: (1) hash values or URLs identifying specific illegal images; (2) citation of criminal laws prohibiting CSAM; (3) expedited procedures reflecting content urgency. The hosting provider must comply immediately. Due to CSAM's clear illegality and severe harm, challenges to such orders are rare, but procedural protections remain available if a provider believed material was wrongly classified or hash values misidentified content.
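The hash-value identification mentioned above can be sketched briefly. This is a minimal illustration, assuming the order supplies SHA-256 digests of the specific illegal files; the function names are hypothetical, not any real platform API. Note that exact cryptographic hashes only match byte-identical copies; real deployments typically also use perceptual hashing tools.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """SHA-256 digest of a file's raw bytes, as a hex string."""
    return hashlib.sha256(data).hexdigest()

def files_matching_order(stored_files: dict, ordered_hashes: set) -> list:
    """Return IDs of stored files whose hash appears in the removal order.

    stored_files maps a file ID to the file's bytes; ordered_hashes is the
    set of hex digests supplied by the authority in the order.
    """
    return [file_id for file_id, blob in stored_files.items()
            if sha256_of(blob) in ordered_hashes]
```

Because the order names the hashes, the provider removes only the identified items; it is not asked to judge legality itself or to scan for anything beyond the listed digests.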

For Terrorist Content: The Terrorist Content Online Regulation creates one-hour removal obligations for terrorist content identified by competent authorities. These orders are issued under a separate instrument but embody the same core safeguards as Article 9: specific identification, legal basis, redress information. For example, Austrian authorities identify a terrorism recruitment video on Telegram. They issue a TCO Regulation removal order containing those elements, and Telegram must act within one hour rather than the longer timeframes typical for other content types. This demonstrates how the Article 9 model adapts to the urgency levels of different illegal content categories.

For Intellectual Property Infringement: A copyright holder obtains a court determination that specific videos on Vimeo infringe their exclusive rights. The court issues an Article 9 order requiring Vimeo to remove the infringing videos and prevent their re-upload. The order must: (1) identify each infringing video by URL; (2) cite copyright laws violated; (3) explain why videos infringe (e.g., unauthorized reproduction of protected cinematographic work); (4) specify measures to prevent re-upload (e.g., hash-based filtering for identical copies); (5) provide redress information. The order cannot require general scanning of all uploads for potential infringement (that would violate Article 8) but can require specific measures preventing the identified infringing content's re-appearance.
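The distinction drawn above, specific measures against identified content versus prohibited general scanning, can be illustrated with a hypothetical upload-time gate. The names below are invented for illustration: the gate blocks only byte-identical re-uploads of videos already identified in the order, rather than scanning uploads for potential infringement generally.

```python
import hashlib

# Hashes of the specific videos identified in the court order.
# Populated only from the order, never from speculative scanning.
BLOCKED_HASHES: set = set()

def register_removed_video(video_bytes: bytes) -> None:
    """Record an identified infringing video so identical copies are blocked."""
    BLOCKED_HASHES.add(hashlib.sha256(video_bytes).hexdigest())

def admit_upload(video_bytes: bytes) -> bool:
    """True if the upload may proceed; False if it is a byte-identical
    re-upload of a video named in the order."""
    return hashlib.sha256(video_bytes).hexdigest() not in BLOCKED_HASHES
```

The design choice matters legally: the gate compares each upload against a closed list derived from the order, so it implements a "specific measure preventing re-appearance" without amounting to general monitoring of all uploads.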

For Misleading Advertising: A national consumer protection authority determines a series of TikTok posts by an influencer contain illegal misleading advertising under Consumer Protection Laws. The authority issues an administrative order requiring TikTok to remove the specific posts. The order includes: (1) identification of each misleading post; (2) citation of consumer protection law provisions; (3) explanation of how posts mislead consumers (e.g., undisclosed sponsored content, false claims); (4) administrative appeal procedures. TikTok can challenge the illegality determination while typically required to comply pending resolution. The affected influencer can also challenge, arguing the content wasn't misleading or was protected commercial speech.

For Privacy Violations: A data protection authority determines that specific posts on a forum contain unlawful processing of personal data (e.g., doxxing: posting someone's address, phone number, or other personal information without consent). The DPA issues an order under GDPR enforcement powers, which can be structured as a DSA Article 9 order requiring the forum to remove the specific privacy-violating posts. The order must identify the specific posts, cite the GDPR provisions engaged (e.g., lack of a lawful basis under GDPR Article 6, or special-category data violations under GDPR Article 9 if applicable), explain why the processing is unlawful, and provide redress mechanisms. This demonstrates Article 9's function across different legal regimes: copyright, criminal law, consumer protection, and data protection can all utilize the Article 9 order framework.

Real-World Challenge - Territorial Scope Disputes: In 2019, an Austrian court ordered Facebook to remove defamatory content and prevent access to it globally (Glawischnig-Piesczek v Facebook Ireland, C-18/18, predating the DSA but relevant to Article 9's framework). The CJEU held that EU law does not preclude worldwide removal orders, provided they are proportionate under national law and consistent with fundamental rights. Under the DSA's Article 9, such orders must likewise comply with proportionality and fundamental rights principles. If an Italian court orders Twitter to remove content globally, Twitter could challenge whether the global scope is proportionate, arguing that territorial limitation to Italy or the EU would suffice. Article 9's requirements for reasoned orders and redress information enable such challenges, ensuring territorial scope does not exceed what is necessary and lawful.