Chapter 4: Competent Authorities and National Digital Services Coordinators
ARTICLE 51 FULL TEXT - Powers of Digital Services Coordinators
1. Where needed in order to carry out their tasks under this Regulation, Digital Services Coordinators shall have the following powers of investigation, in respect of conduct by providers of intermediary services falling within the competence of their Member State:
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Article 37 and Article 75(2), to provide such information without undue delay;
(b) the power to carry out, or to request a judicial authority in their Member State to order, inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium;
(c) the power to ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers with their consent by any technical means.
2. Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall have the following enforcement powers, in respect of providers of intermediary services falling within the competence of their Member State:
(a) the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding;
(b) the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end, or to request a judicial authority in their Member State to do so;
(c) the power to impose fines, or to request a judicial authority in their Member State to do so, in accordance with Article 52 for failure to comply with this Regulation, including with any of the investigative orders issued pursuant to paragraph 1 of this Article;
(d) the power to impose a periodic penalty payment, or to request a judicial authority in their Member State to do so, in accordance with Article 52 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this subparagraph or for failure to comply with any of the investigative orders issued pursuant to paragraph 1 of this Article;
(e) the power to adopt interim measures or to request the competent national judicial authority in their Member State to do so, to avoid the risk of serious harm.
As regards the first subparagraph, points (c) and (d), Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after providing those other persons in good time with all relevant information relating to such orders, including of the applicable penalties, the possibility of an effective judicial remedy and the right to be heard.
3. Where the Digital Services Coordinator considers that: (a) a provider of intermediary services has not sufficiently complied with an order referred to in paragraph 2, point (b), or that the infringement which gave rise to the order has not been remedied; (b) the infringement entails a criminal offence involving a threat to the life or safety of persons; and (c) given the urgency of the situation, the exercise of the powers set out in paragraphs 1 and 2 would not be effective in bringing the infringement to an end,
it may request the competent judicial authority of its Member State to order: (a) the service provider's management body to examine the situation, to adopt and submit an action plan, setting out the measures necessary to bring the infringement to an end and setting out the timeline for implementing those measures, and to adopt and submit regular reports on the implementation of those measures; or (b) the temporary restriction of access by recipients of the service to the specific content affected by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
The Digital Services Coordinator shall, except where it acts upon the Commission's request referred to in Article 82, prior to submitting the request referred to in the first subparagraph, point (b), of this paragraph, invite interested parties to submit written observations within a period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof. The provider of intermediary services, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned.
The restriction of access shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same length, subject to a maximum number of extensions set by that judicial authority. At the end of each period, the judicial authority shall review the situation and may, at the request of an interested party demonstrating a legitimate interest, or at the request of the competent judicial authority of another Member State, order the revocation of the measure if the conditions for it are no longer met or if that measure is not properly justified.
Where the Digital Services Coordinator considers that the conditions set out in the third subparagraph, points (a) and (b), have been met but it cannot further extend the period pursuant to the fifth subparagraph, it shall submit a new request to the competent judicial authority, as referred to in the first subparagraph, point (b).
4. The powers listed in paragraphs 1, 2 and 3 shall be without prejudice to Section 3.
5. The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary services concerned where relevant.
6. Member States shall lay down specific rules and procedures for the exercise of the powers pursuant to paragraphs 1, 2 and 3 and shall ensure that any exercise of those powers is subject to adequate safeguards laid down in the applicable national law in compliance with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties.
Understanding This Article
Article 51 constitutes the enforcement backbone of the DSA at the national level, establishing comprehensive investigative and enforcement powers for Digital Services Coordinators. The provision creates a graduated enforcement framework that escalates from information requests through fines and periodic penalty payments to the extraordinary measure of temporary access restriction where infringements threaten life or safety. This power structure mirrors competition law enforcement (EU Merger Regulation, antitrust investigations) and data protection enforcement (GDPR Article 58), while incorporating novel elements specific to digital services, including the exceptional power to request judicial orders restricting platform access. That measure is unprecedented in EU digital regulation and requires careful balancing of enforcement effectiveness against fundamental rights protection.
Investigation Powers (Paragraph 1): DSCs possess three core investigative powers exercisable 'where needed in order to carry out their tasks': (1) Information requests (paragraph 1(a)) - DSCs can 'require' providers and 'any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement' to provide information 'without undue delay.' Scope includes: providers of intermediary services (platforms, hosting services, caching services, mere conduit providers), third parties with relevant knowledge (advertisers, payment processors, cloud infrastructure providers serving platforms, business partners), audit organizations conducting Article 37 (VLOP audits) and Article 75(2) audits. Information must be provided 'without undue delay'—typically interpreted as days to weeks depending on complexity, not months. Providers cannot refuse citing business confidentiality, proprietary information, or competitive sensitivity, though procedural safeguards (paragraph 6) protect against unreasonable requests.
(2) Premises inspections (paragraph 1(b)) - DSCs can 'carry out, or request a judicial authority in their Member State to order, inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession.' This encompasses: offices, data centers, operational facilities used for platform services, contractors' premises if relevant to suspected infringement. Inspections enable DSCs to: 'examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium'—paper documents, computer files, databases, emails, internal communications, algorithmic code, content moderation guidelines, training materials. The 'or to request other public authorities to do so' language acknowledges some Member States may assign inspection powers to specialized authorities (police, tax inspectors, regulatory investigators) who act on DSC request.
(3) Staff interviews (paragraph 1(c)) - DSCs can 'ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers with their consent by any technical means.' Critically, consent is required—DSCs cannot compel testimony, but refusal to cooperate may constitute obstruction triggering enforcement powers. Interviews might involve: content moderators explaining moderation procedures, algorithm designers explaining recommender system operation, compliance officers explaining internal DSA implementation, executives explaining corporate decision-making about controversial content policies.
These investigation powers parallel GDPR Article 58(1) data protection authority investigative powers and EU competition law investigation powers (Regulation 1/2003 Article 20). However, DSA investigation powers are broader than GDPR in some respects—covering third parties 'reasonably aware of information' not just data controllers/processors—reflecting the distributed nature of platform ecosystems where multiple actors contribute to DSA compliance or violations.
Enforcement Powers (Paragraph 2): DSCs possess five graduated enforcement powers: (1) Binding commitments (paragraph 2(a)) - 'power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding.' Modeled on competition law commitment decisions (Regulation 1/2003 Article 9), this enables negotiated settlements where providers voluntarily commit to specific remedial actions avoiding formal infringement findings and penalties. Example: Platform suspected of inadequate notice-and-action mechanisms (Article 16) might commit to: implementing specific technical improvements to reporting interface, hiring additional content moderation staff, providing monthly reporting to DSC on notice processing times, maintaining improvements for specified period (e.g., 2 years). Commitments become legally binding; violation constitutes separate infringement triggering fines/periodic penalties. Advantages: faster resolution than contested proceedings, tailored remedies addressing specific concerns, provider buy-in improving implementation. Risks: potential under-enforcement if commitments are too weak, reduced transparency compared to formal decisions, negotiated compliance potentially creating disparate treatment.
(2) Cessation orders and remedies (paragraph 2(b)) - 'power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end, or to request a judicial authority in their Member State to do so.' This core enforcement power requires providers stop violating DSA and implement measures ensuring non-recurrence. Cessation orders might require: implementing missing transparency reporting (Article 15), establishing compliant complaint mechanisms (Article 20), designating required points of contact (Article 11), modifying terms of service violating Article 14 requirements, implementing risk mitigation measures for VLOPs (Article 35). Remedies go beyond mere cessation, imposing affirmative obligations: 'necessary to bring the infringement effectively to an end'—if cessation requires structural changes (new systems, procedures, policies), DSCs can order their implementation. The 'or to request a judicial authority' alternative recognizes some Member States constitutionally require judicial orders for certain enforcement measures, protecting separation of powers and judicial review rights.
(3) Fines (paragraph 2(c)) - 'power to impose fines, or to request a judicial authority in their Member State to do so, in accordance with Article 52 for failure to comply with this Regulation, including with any of the investigative orders issued pursuant to paragraph 1.' Article 52 caps fines at 6% of global annual turnover for DSA violations—substantial deterrent for even largest platforms. Fines apply to: substantive DSA violations (non-compliance with Articles 11-47 obligations), non-compliance with investigative orders (refusing information requests, obstructing inspections, false or misleading information). Fines are retrospective penalties for past violations, distinct from periodic penalties (forward-looking compliance incentives).
(4) Periodic penalty payments (paragraph 2(d)) - 'power to impose a periodic penalty payment, or to request a judicial authority in their Member State to do so, in accordance with Article 52 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this subparagraph or for failure to comply with any of the investigative orders issued pursuant to paragraph 1.' Article 52 caps periodic penalties at 5% of average daily worldwide turnover per day of non-compliance. These forward-looking penalties accumulate daily until compliance, creating powerful incentive: platform ignoring cessation order faces mounting penalties potentially reaching massive sums. Example: Platform ordered to implement compliant reporting mechanism (Article 16) by specific date but fails to comply faces €X daily penalty until implementation. Periodic penalties also apply to investigative order non-compliance—refusing information requests or access for inspections triggers daily accumulating penalties.
(5) Interim measures (paragraph 2(e)) - 'power to adopt interim measures or to request the competent national judicial authority in their Member State to do so, to avoid the risk of serious harm.' Interim measures are temporary precautionary actions taken before final enforcement decision when 'risk of serious harm' requires immediate action. Standard for interim measures: prima facie infringement (reasonable grounds to suspect DSA violation), risk of serious harm (significant damage to users, public interest, or rights if enforcement delayed), urgency (harm cannot wait for full investigation and final decision), proportionality (interim measures proportionate to suspected harm). Example: Platform suspected of systematically failing to remove child sexual abuse material (CSAM) might face interim order requiring: immediate implementation of PhotoDNA hash-matching, mandatory human review of all suspected CSAM reports within 2 hours, daily reporting to DSC on CSAM removals. Interim measures remain until final decision or until circumstances change eliminating serious harm risk.
Paragraph 2 second subparagraph extends fine and periodic penalty powers to third parties (auditors, business partners) failing to comply with information orders, but with procedural safeguards: 'They shall only exercise those enforcement powers after providing those other persons in good time with all relevant information relating to such orders, including of the applicable penalties, the possibility of an effective judicial remedy and the right to be heard.' This prevents surprise enforcement against third parties, ensuring fair process.
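The Article 52 ceilings cited in points (3) and (4) above are simple percentages, and a minimal Python sketch makes the arithmetic concrete. The turnover figure and the 30-day non-compliance period below are hypothetical; only the 6% and 5% caps come from the Regulation, and real penalties are set case by case below these maximums.

```python
# Illustrative sketch of the Article 52 penalty ceilings described above.
# The turnover figures are hypothetical; these percentages are only the
# statutory maximums, not amounts a DSC would automatically impose.

def max_fine(annual_worldwide_turnover: float) -> float:
    """Article 52: fines capped at 6% of annual worldwide turnover."""
    return 0.06 * annual_worldwide_turnover

def max_periodic_penalty(annual_worldwide_turnover: float,
                         days_noncompliant: int) -> float:
    """Article 52: periodic penalty payments capped at 5% of the
    average daily worldwide turnover per day of non-compliance."""
    avg_daily_turnover = annual_worldwide_turnover / 365
    return 0.05 * avg_daily_turnover * days_noncompliant

turnover = 2_000_000_000  # hypothetical: EUR 2 billion annual worldwide turnover
print(f"Maximum fine: EUR {max_fine(turnover):,.0f}")  # EUR 120,000,000
print(f"Maximum periodic penalties after 30 days: "
      f"EUR {max_periodic_penalty(turnover, 30):,.0f}")
```

The asymmetry is deliberate: the fine is a one-off retrospective ceiling, while the periodic penalty grows linearly with each day of continued non-compliance, so delay becomes steadily more expensive.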
Access Restriction - The Nuclear Option (Paragraph 3): Paragraph 3 establishes the most extraordinary DSA enforcement power: requesting judicial orders for temporary platform access restriction. This 'nuclear option' reflects recognition that in extreme cases (terrorism, child exploitation, imminent violence), normal enforcement may be too slow to prevent grave harm, requiring immediate platform blocking. However, given fundamental rights implications (freedom of expression, information access), Article 51 imposes stringent conditions and safeguards.
Conditions for access restriction (paragraph 3 first subparagraph): All three conditions must be met:
(a) Insufficient compliance with a prior cessation order or an ongoing unremedied infringement - access restriction is only available after normal enforcement has failed. The DSC must have issued a paragraph 2(b) cessation order with which the provider insufficiently complied, or the infringement giving rise to the order must continue unremedied. This ensures access restriction is a last resort, not a first response.
(b) Criminal offence involving a threat to life or safety - the infringement must 'entail a criminal offence involving a threat to the life or safety of persons.' Examples: terrorist content (recruitment, attack planning, glorification), incitement to imminent violence, child sexual abuse material, human trafficking content, content facilitating violent crimes. Not mere illegality - a criminal offence with a life/safety threat. This excludes: copyright infringement (illegal but no life threat), defamation (harmful but typically not life-threatening), and most commercial violations.
(c) Urgency and ineffectiveness of normal powers - 'given the urgency of the situation, the exercise of the powers set out in paragraphs 1 and 2 would not be effective in bringing the infringement to an end.' Normal enforcement (fines, periodic penalties, cessation orders) takes weeks or months - too slow when lives are at stake. Access restriction addresses situations where immediate action is necessary and normal enforcement cannot act fast enough.
Two types of access restriction orders (paragraph 3 first subparagraph):
(a) Management action plan requirement - the judicial authority orders the 'service provider's management body to examine the situation, to adopt and submit an action plan, setting out the measures necessary to bring the infringement to an end and setting out the timeline for implementing those measures, and to adopt and submit regular reports on the implementation of those measures.' This compels senior management attention and accountability. It is less intrusive than blocking, but may be insufficient for imminent threats.
(b) Temporary access restriction - the judicial authority orders the 'temporary restriction of access by recipients of the service to the specific content affected by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.' Hierarchy of restrictions: First preference: block the specific illegal content (e.g., specific terrorist videos, CSAM images, pages inciting violence) - the least restrictive option, targeting the harm precisely. Second preference (only where content-specific blocking is 'not technically feasible'): block the entire platform interface - an extraordinary measure given the collateral impact on lawful expression and access. 'Not technically feasible' is a narrow exception - most modern platforms can block specific content. Whole-platform blocking might apply to: platforms refusing to implement content-specific blocking capabilities, distributed or decentralized services where content-specific blocking is technologically impossible, or situations where identifying the specific harmful content is impossible but a platform-wide pattern clearly endangers lives.
Procedural safeguards for access restriction (paragraph 3 second to fifth subparagraphs):
(1) Prior consultation (second subparagraph) - 'except where it acts upon the Commission's request referred to in Article 82, prior to submitting the request...invite interested parties to submit written observations within a period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof.' The DSC must give affected parties a minimum of two weeks' notice describing the proposed restriction before requesting a judicial order. Exceptions: Commission-requested restrictions under Article 82 (cross-border enforcement); Article 51 does not explicitly provide an emergency exception, though national constitutional law might allow one in extreme urgency where a two-week delay would cause irreparable harm.
(2) Judicial proceeding participation - the 'provider of intermediary services, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority.' Affected parties can present arguments, evidence, and alternatives to the judicial authority. This ensures adversarial proceedings, not ex parte orders.
(3) Proportionality assessment - 'Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned.' The judicial authority must assess: Is access restriction necessary and appropriate? Are less restrictive alternatives available? Does blocking achieve legitimate aims without excessive collateral harm to lawful expression?
(4) Time limits and review - 'restriction of access shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same length, subject to a maximum number of extensions set by that judicial authority.' Initial order limited to 4 weeks. Extensions possible in 4-week increments, but judicial authority sets maximum extension number (e.g., max 2 extensions = 12 weeks total). Prevents indefinite blocking. 'At the end of each period, the judicial authority shall review the situation and may, at the request of an interested party demonstrating a legitimate interest, or at the request of the competent judicial authority of another Member State, order the revocation of the measure if the conditions for it are no longer met or if that measure is not properly justified.' Periodic judicial review ensures continued necessity. Affected parties or other Member State courts can request revocation.
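The four-week cycle described above lends itself to a short worked example. The Python sketch below computes the judicial review dates; the start date and the extension cap of 2 are hypothetical, since the real maximum number of extensions is fixed by the judicial authority in its order.

```python
# Sketch of the Article 51(3) restriction timeline described above: an
# initial four-week period, extendable in four-week increments up to a
# maximum number of extensions set by the judicial authority, with a
# review at the end of each period. Dates and the cap are hypothetical.

from datetime import date, timedelta

def restriction_schedule(start: date, max_extensions: int) -> list[date]:
    """Return the judicial review dates: one at the end of each 4-week period."""
    period = timedelta(weeks=4)
    return [start + period * (i + 1) for i in range(max_extensions + 1)]

reviews = restriction_schedule(date(2024, 3, 1), max_extensions=2)
for i, d in enumerate(reviews, 1):
    print(f"End of period {i}: {d.isoformat()}")
# reviews fall on 2024-03-29, 2024-04-26 and 2024-05-24

total = timedelta(weeks=4) * len(reviews)
print(f"Maximum total restriction: {total.days // 7} weeks")  # 12 weeks
```

With a cap of two extensions, the maximum total restriction is three periods of four weeks, i.e. twelve weeks, after which the DSC must return to the judicial authority with a fresh request if it considers the conditions still met.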
Relationship to Commission Powers (Paragraph 4): 'The powers listed in paragraphs 1, 2 and 3 shall be without prejudice to Section 3.' The referenced section of Chapter IV establishes the Commission's powers to supervise and enforce the DSA against VLOPs and VLOSEs. Article 51 DSC powers coexist with the Commission's powers: DSCs enforce against all intermediary services falling within their Member State's competence, including VLOPs, while the Commission has exclusive competence for certain VLOP/VLOSE matters and shares others with DSCs. This creates potential for dual enforcement requiring coordination (Articles 56-60).
Proportionality Requirements (Paragraph 5): 'The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary services concerned where relevant.' Enforcement must balance: Effectiveness - measures actually achieve compliance and remedy harms. Symbolic enforcement insufficient. Dissuasiveness - measures deter future violations by this provider and others. Deterrence requires penalties and remedies substantial enough that non-compliance is economically irrational. Proportionality - measures proportionate to violation severity. Factors: Nature (what DSA obligation violated? How fundamental?), Gravity (how serious was harm? Minor technical violation vs. widespread systemic failure?), Recurrence (first offense vs. repeated violations showing deliberate non-compliance?), Duration (one-time incident vs. ongoing systematic violation?), Provider capacity (small startup vs. tech giant? Resource constraints legitimately limiting compliance vs. deliberate non-compliance?). Proportionality prevents: crushing fines for minor technical violations by small providers, identical penalties for vastly different violations, enforcement disregarding provider circumstances.
Procedural Safeguards (Paragraph 6): 'Member States shall lay down specific rules and procedures for the exercise of the powers pursuant to paragraphs 1, 2 and 3 and shall ensure that any exercise of those powers is subject to adequate safeguards laid down in the applicable national law in compliance with the Charter and with the general principles of Union law.' National procedural frameworks must implement:
Privacy protection - the 'right to respect for private life' requires: warrants or judicial authorization for intrusive investigations, confidentiality protection for sensitive business information, data minimization in information requests (requesting only necessary information), and secure handling of confidential provider data.
Rights of defence - 'including the rights to be heard and of access to the file': providers must receive notice of suspected violations with sufficient detail to respond, the opportunity to submit written and oral observations, access to the DSC investigation file (subject to confidentiality protecting third parties and investigation integrity), and the right to legal representation.
Effective judicial remedy - the 'right to an effective judicial remedy of all affected parties': providers can judicially challenge DSC decisions (cessation orders, fines, periodic penalties), investigative orders (information requests, inspection orders), interim measures, and any enforcement action affecting their rights. Courts can review legality (did the DSC act within its powers, follow procedures, and respect fundamental rights?), assess proportionality, annul unlawful decisions, and reduce excessive fines.
General EU law principles: legality (enforcement only on a legal basis, not arbitrary), legal certainty (clear rules enabling providers to know their obligations), non-retroactivity (penalties only for violations of rules in force when the conduct occurred), and ne bis in idem (no double jeopardy - a provider cannot be punished twice for the same violation).
Key Points
DSCs have power to require providers and third parties to provide information without undue delay
DSCs can conduct or request judicial authority to order premises inspections
Staff interviews permitted with consent to give explanations about suspected infringements
DSCs can accept binding commitments from providers for compliance
Power to order cessation of infringements and impose proportionate remedies
Fines up to 6% of global annual turnover for DSA violations (Article 52)
Periodic penalty payments up to 5% of average daily worldwide turnover per day of non-compliance (Article 52)
Interim measures available to avoid risk of serious harm pending final decision
Extraordinary power to request judicial orders for temporary access restriction
Two types: management action plans or temporary content/platform blocking
Access restriction for 4 weeks with possible extensions, subject to judicial review
All measures must be effective, dissuasive and proportionate
National procedural laws must ensure safeguards respecting fundamental rights
Rights of defence include: right to be heard, access to file, effective judicial remedy
Graduated enforcement approach from cooperation to nuclear option of access blocking
Practical Application
Investigation Power Exercise - German Bundesnetzagentur Example: BNetzA (the German DSC) investigates suspected Article 16 notice-and-action violations by a mid-sized social media platform:
(1) Initial information request - BNetzA issues a written request under Article 51(1)(a) requiring the platform to provide within 14 days: statistics on illegal content notices received over the past 12 months (number, categories, languages), average response times for notice processing, content removal rates by illegality category, documentation of notice-and-action procedures and training materials, and an explanation of the technical systems for notice intake and processing. The platform's legal team coordinates the response, gathering data from the content moderation, legal, and engineering departments.
(2) Follow-up information requests - The initial response reveals potential issues: Turkish-language notices show 3x longer processing times than German notices, and, concerningly, 95% of terrorist content notices result in no action citing 'insufficient evidence.' BNetzA issues follow-up requests for: a detailed explanation of Turkish-language moderation capacity (number of Turkish-speaking moderators, training levels, workload), terrorist content moderation policies and training materials, a sample of 50 terrorist content notices with decisions and reasoning, and information about detection tools and hash-sharing participation.
(3) Premises inspection - The information responses raise further concerns about inadequate procedures. BNetzA schedules an on-site inspection under Article 51(1)(b) of the platform's Berlin office, giving 2 weeks' notice per national procedural law. The inspection team (3 BNetzA investigators and an IT forensics specialist) examines: the content moderation workspace and tools, moderator training documentation, algorithm code for notice prioritization, internal emails discussing terrorist content policy, and the database of flagged content with moderator decisions. Investigators take copies of relevant documents and code.
(4) Staff interviews - BNetzA conducts Article 51(1)(c) interviews with consent: content policy director explaining policy decisions, head of trust & safety describing moderation workflows, three Turkish-language moderators discussing workload and decision-making, engineering lead explaining algorithmic systems. Interviews recorded (with consent) and transcribed.
Investigation findings: The platform has inadequate Turkish-language moderation (2 moderators handling 500+ daily Turkish notices), an excessively narrow terrorist content policy (removing only content explicitly praising terrorist organizations, not recruitment or glorification material), and no participation in the GIFCT hash-sharing database, limiting detection of known terrorist content. BNetzA proceeds to enforcement.
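The disparities BNetzA spotted (Turkish notices taking roughly 3x longer, high no-action rates) are simple aggregates over the notice log the platform had to produce. A minimal sketch of that aggregation, with hypothetical records and invented field names:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical notice log entries: (language, processing_hours, action_taken)
notices = [
    ("de", 4, True), ("de", 6, True), ("de", 5, False),
    ("tr", 14, False), ("tr", 16, False), ("tr", 15, True),
]

def notice_stats(records):
    """Per-language average processing time and no-action rate."""
    by_lang = defaultdict(list)
    for lang, hours, acted in records:
        by_lang[lang].append((hours, acted))
    return {
        lang: {
            "avg_hours": mean(h for h, _ in rows),
            "no_action_rate": sum(1 for _, a in rows if not a) / len(rows),
        }
        for lang, rows in by_lang.items()
    }

stats = notice_stats(notices)
# In this toy data, Turkish notices average 3x the German processing time.
print(stats["tr"]["avg_hours"] / stats["de"]["avg_hours"])  # 3.0
```

In practice a regulator would ask for these figures per category and language over the full 12-month log; the point is only that the requested "statistics" are straightforward aggregations over records the platform already holds.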
Enforcement Powers - Graduated Response Example: Based on the above investigation, BNetzA employs graduated enforcement: (1) Preliminary findings letter - BNetzA sends preliminary findings identifying apparent violations: insufficient notice-and-action capacity for Turkish content violating Article 16, inadequate terrorist content policies failing to comply with German criminal law (NetzDG), and systematic failure to act on manifestly illegal content. The platform is given 4 weeks to respond. (2) Commitment procedure - Rather than contested proceedings, the platform offers commitments under Article 51(2)(a): hire a minimum of 10 additional Turkish-language moderators within 60 days, revise its terrorist content policy within 30 days to align with German law and the EU terrorist content regulation, join the GIFCT hash-sharing database within 45 days, provide quarterly reports to BNetzA on Turkish-language notice processing for 24 months, and undergo an independent audit of notice-and-action compliance after 12 months. BNetzA evaluates the commitments, holds a public consultation (30 days), and accepts the commitments as a binding decision. The platform implements the commitments; BNetzA monitors through the quarterly reports. Advantages: faster resolution (3 months vs. 12+ months for contested proceedings), tailored remedies addressing specific deficiencies, and platform cooperation improving implementation quality. (3) Alternative: formal infringement decision - Had the platform refused commitments or offered inadequate ones, BNetzA would issue a formal cessation order under Article 51(2)(b): finding Article 16 violations based on inadequate capacity and policies, and ordering the platform, within 90 days, to hire sufficient Turkish-language moderators (at least 10), revise its policies to comply with German law, implement hash-matching, and provide transparency reporting.
The order is accompanied by a fine under Article 51(2)(c), calculated on violation severity (high - terrorist content), duration (12+ months), recurrence (ongoing systematic failure), and provider capacity (significant revenue enabling compliance). Example fine: €2.5 million (0.5% of the platform's annual turnover, well within the DSA's 6% ceiling). The platform can judicially challenge the order and fine; if these are upheld but the platform still fails to comply, BNetzA imposes periodic penalty payments under Article 51(2)(d): €25,000 per day until full compliance (the cap of 5% of average daily worldwide turnover would be roughly €68,000/day for this platform, so €25,000 is comfortably proportionate). After 100 days of non-compliance that adds €2.5 million in penalties; the mounting cost forces compliance.
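The penalty arithmetic can be checked against the DSA's ceilings (Article 52: fines capped at 6% of annual worldwide turnover, periodic penalty payments at 5% of average daily worldwide turnover). A minimal sketch, assuming the €500 million annual turnover implied by the example's "€2.5 million = 0.5%" figure:

```python
# Assumed figure: EUR 500M annual turnover is implied by the example's
# "EUR 2.5 million = 0.5% of annual turnover".
ANNUAL_TURNOVER = 500_000_000  # EUR

# DSA Article 52 ceilings: fines up to 6% of annual worldwide turnover;
# periodic penalty payments up to 5% of average daily worldwide turnover.
fine_cap = 0.06 * ANNUAL_TURNOVER                  # EUR 30M
daily_penalty_cap = 0.05 * ANNUAL_TURNOVER / 365   # ~EUR 68,500 per day

fine = 2_500_000        # 0.5% of turnover, well under the 6% cap
daily_penalty = 25_000  # well under the daily cap
days_noncompliant = 100

assert fine <= fine_cap and daily_penalty <= daily_penalty_cap
total_penalties = daily_penalty * days_noncompliant
print(total_penalties)  # 2500000 -> another EUR 2.5M after 100 days
```

The same arithmetic explains why a seemingly modest daily penalty becomes coercive: it compounds linearly and without limit until compliance, unlike the one-off fine.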
Interim Measures - CSAM Emergency Example: The Irish DSC (Coimisiún na Meán) receives reports from Europol that a specific platform is systematically failing to remove child sexual abuse material (CSAM), with 200+ instances remaining accessible weeks after being reported to the platform. An investigation is commenced but will take months; meanwhile children continue to be victimized. The DSC adopts interim measures under Article 51(2)(e): (1) Urgency assessment - risk of serious harm: ongoing child victimization, CSAM distribution, and potential creation of new CSAM if victims are identified by predators. Normal enforcement is too slow (investigation + decision = 6-12 months). There is prima facie evidence of Article 16 violations (manifestly illegal content not removed). (2) Interim order content - the DSC orders the platform (effective immediately, pending final investigation) to: implement PhotoDNA/hash-matching technology within 7 days to detect known CSAM, reduce CSAM notice review time to a maximum of 2 hours, provide the DSC daily reporting on CSAM notices and removals, allow the DSC remote access to an anonymized CSAM moderation queue for monitoring, and designate a 24/7 crisis contact point for CSAM issues. (3) Platform response - the platform implements immediately to avoid a potential access restriction and to demonstrate good faith during the ongoing investigation. The DSC monitors compliance daily through the required reporting. (4) Final decision - the investigation completes 8 months later, confirming violations. The DSC issues a cessation order requiring permanent implementation of the interim measures plus additional safeguards, and imposes a €5 million fine for past violations. The interim measures prevented ongoing harm while full proceedings progressed.
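Hash-matching of the kind the interim order requires amounts to a set-membership check against a shared list of known-content hashes. Real deployments such as PhotoDNA use proprietary perceptual hashes that survive resizing and re-encoding, so the exact SHA-256 match below, with hypothetical data, is a deliberate simplification:

```python
import hashlib

# Hypothetical known-hash list, as distributed by a hash-sharing database.
# Production systems (e.g. PhotoDNA, GIFCT lists) use perceptual hashes
# robust to re-encoding; exact cryptographic matching misses near-duplicates.
KNOWN_HASHES = {
    hashlib.sha256(b"known-illegal-image-bytes").hexdigest(),
}

def should_block(upload: bytes) -> bool:
    """Check an upload against the known-hash set before it is published."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_HASHES

print(should_block(b"known-illegal-image-bytes"))  # True
print(should_block(b"ordinary-holiday-photo"))     # False
```

This is why such measures can plausibly be ordered on a 7-day deadline: the matching itself is a cheap lookup at upload time; the operational work is integrating the shared hash list and the review pipeline around it.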
Access Restriction - Terrorism Content Example (Hypothetical): Following a terrorist attack in an EU Member State, French ARCOM (DSC) identifies a messaging platform systematically hosting terrorist recruitment and attack-planning content. Conditions for an Article 51(3) access restriction: (1) Prior enforcement failure - ARCOM issued multiple cessation orders over 18 months requiring the platform to remove terrorist content and implement content moderation. The platform consistently failed to comply, removing specific content when explicitly identified but never implementing proactive moderation. Thousands of pieces of terrorist content remain accessible. (2) Criminal offence threatening life - the content includes: recruitment for terrorist organizations (French Criminal Code Article 421-2-6), provocation of terrorist acts (Article 421-2-5), attack-planning materials, and instructions for fabricating explosives. A clear threat to life and safety. (3) Urgency and ineffectiveness of normal enforcement - mounting evidence that the platform is actively used to plan imminent attacks. Fines and periodic penalties have not achieved compliance (the platform is potentially judgment-proof or willing to absorb penalties). Immediate action is necessary to prevent loss of life. (4) Consultation process - ARCOM initiates the Article 51(3) procedure: publishes notice of its intent to request an access restriction, provides a 2-week consultation period (the minimum), and invites submissions from the platform, affected users, civil society, and other stakeholders. The platform argues: content moderation efforts are ongoing, an access restriction violates freedom of expression, technical challenges prevent comprehensive removal, and more time is needed to improve its systems. Civil society raises concerns: an access restriction sets a dangerous precedent, risks censorship creep, may drive users to less regulated platforms, and has collateral impact on lawful expression.
ARCOM considers the submissions but determines an access restriction is necessary given the imminent threat to life and more than a year of failed compliance.
(5) Judicial application - ARCOM petitions the Paris judicial authority for a temporary access restriction order under Article 51(3)(b): targeting specific terrorist content (the preferred option) or, if the platform refuses content-specific blocking, the entire platform interface. Evidence submitted: documentation of 18 months of non-compliance, expert reports on the extent of terrorist content, intelligence assessments of attack-planning activity, and evidence that the platform is technically capable of content-specific blocking but refuses to implement it. (6) Judicial proceedings - The platform participates in the hearing: it presents evidence of moderation efforts, argues proportionality concerns, and requests less restrictive alternatives (an independent monitor, a court-supervised compliance program). The judicial authority evaluates: Are the Article 51(3) conditions met? Yes - prior non-compliance, a criminal offence with a threat to life, an urgent situation. Is an access restriction proportionate? Content-specific blocking is proportionate if the platform implements it; whole-platform blocking would be disproportionate given lawful communications. Can less restrictive measures succeed? 18 months of failures suggest not. (7) Judicial order - The court orders: the platform must within 48 hours implement content-specific blocking of the identified terrorist content (list provided by ARCOM based on its investigation), establish a real-time monitoring system with French law enforcement and ARCOM oversight, and provide hourly reports for the first week, then daily reports, on terrorist content detection and removal. The order is valid for 4 weeks with the possibility of 2 extensions (12 weeks maximum). If the platform fails to comply within 48 hours, the order converts to a whole-platform access restriction (ISPs ordered to block the platform). (8) Platform compliance - Facing imminent platform-wide blocking, the platform implements the required measures within 24 hours. ARCOM monitors compliance through the required reporting and oversight access.
The platform removes thousands of pieces of terrorist content in the first week. (9) Extension and review - After the initial 4 weeks, substantial content remains and new terrorist material continues appearing. ARCOM requests an extension; the judicial authority reviews and grants a 4-week extension contingent on the platform making substantial progress. After the second 4-week period, the platform has implemented a sustained moderation system successfully removing terrorist content. The judicial authority declines a third extension as the conditions are no longer met; the access restriction expires, but the platform must maintain its systems under a permanent cessation order.
Proportionality Assessment - Small Platform Example: The Lithuanian DSC investigates a small local forum (500 daily users, operated by an individual on minimal advertising revenue) for suspected Article 15 transparency reporting violations. Note that Article 15(2) exempts micro and small enterprises from the reporting obligation; assume for this example that the forum narrowly exceeds those thresholds so the obligation applies. The investigation confirms the forum failed to publish the required annual transparency report. The DSC must decide on proportionate enforcement: (1) Violation assessment - The violation is clear: no transparency report published. Nature: procedural (a missing report), not substantive harm. Gravity: low - affects transparency, not user safety. Duration: 6 months since the report was due. Recurrence: first offence. Provider capacity: very small enterprise, limited resources, likely unaware of the obligation. (2) Proportionate response options - Disproportionate: a €50,000 fine (would bankrupt such a small operation over a procedural violation, excessive given capacity), immediate periodic penalties (would force shutdown), or a formal cessation order with legal proceedings (excessive process for a minor violation by a small provider likely acting in good faith). Proportionate: a warning letter explaining the obligation with a 30-day deadline to comply, a template/guidance offer to facilitate compliance, a commitment to publish annual reports going forward, or a minimal administrative fine (€1,000) reflecting the violation while considering capacity. The DSC chooses the warning + guidance + commitment approach. The forum operator apologizes, publishes the report using the provided template, and commits to annual reporting. No fine is imposed given good-faith rapid compliance. (3) Contrast: repeat offender - If the same forum later failed to publish despite the warning, the proportionate response escalates: a €5,000 fine reflecting deliberate non-compliance, a formal cessation order with a specified reporting deadline, and periodic penalties for continued non-compliance (€100/day, reflecting small capacity while still incentivizing compliance).
A graduated, proportionate response balances enforcement effectiveness with provider circumstances.
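The graduated logic in this example can be summarized as a small decision table. The sketch below is purely illustrative of the escalation ladder described above, not anything the DSA itself prescribes:

```python
def choose_response(gravity: str, recurrence: int) -> str:
    """Illustrative only: encodes the escalation ladder from the example.

    gravity: "low" for procedural violations, "high" for safety-relevant ones.
    recurrence: number of prior warnings/findings for the same obligation.
    """
    if gravity == "low" and recurrence == 0:
        return "warning + guidance + compliance commitment"
    if gravity == "low" and recurrence == 1:
        return "modest fine + formal cessation order"
    # Continued or grave non-compliance escalates further.
    return "cessation order + periodic penalties scaled to provider capacity"

print(choose_response("low", 0))  # first procedural offence -> warning first
```

A real DSC weighs more factors (duration, provider capacity, good faith), but the structure, warn first, fine modestly on repeat, reserve coercive measures for continued non-compliance, is the proportionality principle in miniature.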
Procedural Safeguards - Contested Enforcement Example: Dutch ACM (DSC) investigates a major platform for suspected Article 30 marketplace trader verification violations. The platform contests the findings: (1) Investigation - ACM issues Article 51(1) information requests for trader verification data. The platform provides information but redacts some data citing commercial confidentiality. ACM determines the redactions are excessive and threatens periodic penalties for an incomplete response. The platform challenges the information request in Dutch court, arguing it is overbroad (requesting commercially sensitive information not necessary for the investigation), violates privacy rights (demanding personal data of millions of traders), and exceeds ACM's authority. The court applies the Article 51(6) safeguards: it reviews the request's proportionality and determines some requests are justified (trader verification procedures, sampling of verification records) while others are overbroad (a complete database of all traders is unnecessary). The court orders the platform to provide its verification procedures plus 1,000 randomly sampled trader verification records, protecting the remaining data. ACM proceeds with the narrowed information set.
(2) Preliminary findings - The ACM investigation finds systemic Article 30 violations: 30% of sampled traders were not properly verified, and verification procedures are inadequate. ACM sends preliminary findings to the platform. The platform exercises its right to be heard: it submits a 50-page response challenging the findings, provides evidence of verification efforts, argues ACM misinterprets the Article 30 requirements, and requests an oral hearing. ACM grants the oral hearing; the platform presents arguments with legal counsel. ACM considers the response and maintains its violation findings, but adjusts some details based on the platform's evidence. (3) Final decision - ACM issues a formal decision: it finds Article 30 violations, orders the platform to implement enhanced trader verification within 180 days (with specific requirements detailed), imposes an €8 million fine (1% of the platform's turnover, reflecting a serious compliance failure while crediting partial verification efforts), and sets periodic penalties for non-compliance (€100,000/day, capped at an additional €10 million). The decision provides detailed reasoning explaining the violation findings, the evidence relied upon, the proportionality assessment, and information about judicial review rights. (4) Judicial review - The platform appeals to the Dutch courts under the Article 51(6) effective-remedy right: it challenges the violation findings (arguing ACM misinterpreted Article 30), disputes the fine amount (arguing it is disproportionate), and requests suspension of the decision pending appeal. The court reviews the legality of the ACM decision, the sufficiency of the evidence, and the proportionality of the remedies. The court partially upholds ACM: it confirms the violations but reduces the fine to €5 million, finding €8 million excessive given the platform's verification efforts, even if inadequate. The cessation order and periodic penalties are upheld, and the platform must comply during the appeal.
Cross-Border Complexity - Articles 56-60 Interaction: A German platform serves users across the EU, and concerns arise in several Member States: (1) Establishment determination - Under Article 56(1), the Member State of the provider's main establishment has exclusive supervision and enforcement powers (subject to the Commission's powers over very large platforms). The platform's main establishment is Germany (main office, decision-making center), so BNetzA is the competent DSC. (2) Cross-border concerns - French ARCOM suspects inadequate French-language content moderation violating Article 16; the Irish DSC suspects Article 27 recommender system violations affecting Irish users; BNetzA itself is investigating Article 15 transparency reporting deficiencies. (3) Coordination obligations - The other DSCs cannot enforce directly against the provider; instead, under Article 58, ARCOM and the Irish DSC request BNetzA to assess the suspected infringements, and BNetzA must take utmost account of those requests, investigate, and communicate its assessment. Articles 57-60 further require mutual assistance and information exchange, avoiding duplicative information requests. (4) Enforcement actions - BNetzA, as the DSC of establishment, issues an Article 51 cessation order requiring French-language moderation capacity improvements, accepts Article 51(2)(a) commitments on recommender transparency, and imposes an Article 51(2)(c) fine for the transparency failures: one enforcing authority addressing violations raised from several Member States. (5) Joint investigations - Article 60 allows BNetzA, ARCOM, and the Irish DSC to conduct a joint investigation covering all suspected violations, pooling resources and evidence; if BNetzA failed to act on a cross-border request, the matter could be referred to the Commission under Article 59. The platform benefits from a coordinated approach; the DSCs benefit from shared resources and consistent enforcement.
For Platforms Facing Investigation/Enforcement: (1) Information request response - when receiving an Article 51(1)(a) information request: respond within the stated deadline (typically 14-30 days depending on complexity), provide complete and accurate information (false or misleading information may trigger separate penalties), protect legitimately confidential information while still providing substantive responses, consider voluntarily providing additional context helpful to the DSC's understanding, and maintain records of the information provided for potential later proceedings. (2) Cooperation vs. contesting - strategic considerations: Cooperation: work constructively with the DSC, offer commitments under Article 51(2)(a) resolving concerns without contested proceedings, and implement improvements demonstrating good faith; this may yield reduced penalties and faster resolution. Contesting: challenge DSC interpretations and evidence through the right to be heard and prepare for judicial review if the decision is unfavorable; this may achieve a better legal outcome but prolongs proceedings and may result in higher penalties if violations are confirmed. Hybrid: cooperate with the investigation while preserving legal challenges on contested issues. (3) Interim measures compliance - Article 51(2)(e) interim measures often require rapid implementation (days or weeks). Platforms should: take them seriously (non-compliance may lead to more severe enforcement, including access restriction), implement them even while believing them unjustified (they can be challenged while complying), document compliance efforts for DSC monitoring, and prepare a parallel track to challenge disproportionate interim measures.
For Users and Civil Society: (1) Triggering DSC investigations - civil society can bring suspected DSA violations to DSC attention: submit detailed evidence of systematic violations (not individual content decisions but patterns), provide quantitative analysis showing widespread non-compliance, coordinate submissions from multiple affected users/organizations, request DSC exercise Article 51 investigation powers. (2) Participating in enforcement proceedings - where DSC enforcement affects public interest: submit comments during consultation periods (e.g., commitment procedures), request participant status if demonstrating legitimate interest, provide independent expert analysis supporting or challenging DSC findings, monitor enforcement actions and publicly report on DSC performance. (3) Access restriction oversight - given fundamental rights implications, civil society should: closely monitor any Article 51(3) access restriction requests, participate in judicial proceedings as amicus or intervenor if permitted by national law, document restrictions and assess proportionality, publicly report on access restriction use identifying any overreach, challenge disproportionate restrictions through judicial review or advocacy.