Additional transparency reporting obligations for providers of online platforms
Chapter 3|Due Diligence Obligations - Online Platforms
1. In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:
(a) the number of disputes submitted to out-of-court dispute settlement bodies pursuant to Article 21, the outcomes of the dispute settlement, the median time needed for completing the dispute settlement procedures and the share of disputes where the out-of-court dispute settlement body ruled in favour of the recipient;
(b) the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, for the submission of manifestly unfounded notices and complaints and distinguishing between automated suspensions and suspensions that are decided upon by personnel;
(c) the use made of automated means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.
2. Providers of online platforms shall make publicly available, at least once every six months, in a publicly available section of their online interface and in a machine-readable format, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months.
3. Upon receiving a reasoned request from the Digital Services Coordinator of establishment or the Board, providers of online platforms shall, without undue delay, provide information on the average monthly active recipients of the service, for a specific period indicated in the request not exceeding the previous six months, including the main parameters and calculations used for determining those numbers.
4. The Digital Services Coordinator of establishment or, as applicable, the Board, shall, without undue delay, transmit the information referred to in paragraph 3 to the Commission when it believes that a provider of online platforms may meet the threshold laid down in Article 33(1) and shall without undue delay inform that provider accordingly.
5. Providers of online platforms shall send to the database referred to in Article 42 the decisions and the statements of reasons referred to in Article 17(1). That information shall not contain personal data.
6. The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of the reports pursuant to paragraph 1.
Understanding This Article
Article 24 builds on Article 15's general transparency reporting by adding platform-specific requirements. These additional obligations serve two critical functions: (1) enabling identification of Very Large Online Platforms (VLOPs) through user number reporting, and (2) providing deeper insight into platform content moderation practices including disputes, suspensions, and automation.
Paragraph 1(a) requires reporting on Article 21 out-of-court dispute settlement: how many users escalated disputes, what outcomes resulted, how long settlements took, and how often users won. This data reveals whether platforms' internal moderation is fair (high user win rates suggest platforms frequently err) and whether dispute settlement works efficiently (long resolution times indicate system problems).
Paragraph 1(b) mandates disclosure of Article 23 suspensions: how many accounts/reporting privileges were suspended for illegal content, frivolous complaints, and critically, whether suspensions were automated or human-decided. This transparency prevents invisible automated suspensions and enables assessment of whether platforms over-rely on algorithms for punitive actions.
Paragraph 1(c) addresses the core automation transparency issue: platforms must explain what automated content moderation they use, what purposes it serves, how accurate it is, its error rates, and what safeguards prevent mistakes. For example, if YouTube uses automated copyright detection, it must disclose accuracy rates, false-positive rates, and human-review safeguards. This transforms algorithmic moderation from a black box into an accountable system.
Paragraph 2 is the VLOP trigger mechanism. Platforms must publicly report their average monthly active EU recipients every six months. This self-reporting enables the Commission to identify platforms exceeding Article 33's 45 million user threshold, triggering VLOP designation and extensive additional obligations. The machine-readable format requirement enables automated monitoring: regulators and researchers can track which platforms are approaching VLOP status.
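The paragraph 2 calculation is a straightforward rolling average. This sketch shows the arithmetic under illustrative assumptions: the monthly figures are invented, and the DSA does not prescribe this exact methodology. Note that a platform can exceed 45 million users in recent months while its six-month average remains below the threshold.

```python
# Illustrative sketch: paragraph 2 requires the average of monthly active
# EU recipients over the past six months; Article 33(1) sets the VLOP
# threshold at 45 million. The monthly counts below are invented.
VLOP_THRESHOLD = 45_000_000

def average_monthly_active_recipients(monthly_counts):
    """Average the most recent six monthly active-recipient counts."""
    last_six = monthly_counts[-6:]
    return sum(last_six) / len(last_six)

monthly = [43_100_000, 43_900_000, 44_600_000,
           45_200_000, 45_900_000, 46_500_000]
avg = average_monthly_active_recipients(monthly)
print(f"Average over six months: {avg:,.0f}")
print("Meets VLOP threshold" if avg >= VLOP_THRESHOLD else "Below threshold")
```

Here the last three months each exceed 45 million, yet the six-month average (about 44.9 million) does not, so the threshold is not yet met.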
Paragraph 3 gives Digital Services Coordinators power to demand detailed user number explanations on request - platforms can't just report '44.9 million users' to avoid VLOP designation and then refuse to explain calculations. Coordinators can require disclosure of methodologies, active user definitions, and calculation details.
Paragraph 4 creates the VLOP designation pipeline: when Coordinators believe a platform meets the 45 million threshold based on paragraph 2-3 data, they notify the Commission, triggering Article 33's formal designation process. This prevents platforms from hiding growth to avoid VLOP obligations.
Paragraph 5 requires platforms to submit all Article 17 content moderation decisions (removal, suspension, and restriction decisions with their statements of reasons) to the Commission's database. This creates a centralized repository of platform moderation practices, enabling cross-platform analysis, pattern identification, and systemic oversight. Personal data must be excluded to protect privacy.
The combination of these requirements creates unprecedented platform transparency - regulators, researchers, and civil society can monitor content moderation practices, identify platforms requiring enhanced oversight, and hold platforms accountable for their decisions.
Key Points
Platforms must report out-of-court dispute statistics including outcomes and timelines
Must report suspension statistics under Article 23, distinguishing types and automation
Must disclose use of automated content moderation including accuracy rates and error rates
Must publish average monthly active EU users every six months - the critical input for VLOP designation
Coordinators can request detailed user number calculations on demand
Platforms approaching 45 million EU users trigger VLOP designation process
Must submit all content moderation decisions to Commission database
Enables regulatory monitoring and public accountability of platform operations
Practical Application
For User Number Reporting: TikTok publishes on its transparency page every six months: 'TikTok had an average of 134 million monthly active users in the EU over the past six months (calculated from January-June 2024).' The number is publicly accessible, machine-readable (structured data format), and immediately alerts regulators that TikTok far exceeds the 45 million VLOP threshold, maintaining its VLOP designation and extensive compliance obligations.
For Approaching VLOP Threshold: A growing social platform reports: 'We had 42 million average monthly active EU users for July-December 2024.' The Digital Services Coordinator monitors this and sees the platform rapidly approaching 45 million. They request detailed methodology: 'Explain your active user definition, calculation methods, and growth trends.' The platform provides details. In its next six-month report, the platform discloses 46 million users. The Coordinator immediately notifies the Commission, triggering the Article 33 designation process; once designated, the platform has four months to comply with VLOP obligations.
For Dispute Settlement Reporting: Facebook's annual transparency report includes: 'Out-of-court dispute settlement (Article 21): 15,234 disputes submitted to certified bodies. Outcomes: 45% ruled in favor of users, 38% ruled in favor of Facebook, 17% other outcomes. Median resolution time: 67 days. Analysis: The 45% user win rate suggests Facebook's internal appeals (Article 20) may be too restrictive, as nearly half of escalated cases overturn Facebook's decisions. Median time of 67 days is within the 90-day requirement.'
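The paragraph 1(a) metrics in an example like the one above can be computed mechanically from dispute records. This sketch uses a handful of invented records (not real Facebook data) to show how the outcome shares and median resolution time would be derived.

```python
import statistics

# Hypothetical dispute records: (outcome, days_to_resolve).
# Paragraph 1(a) asks for counts, the share of disputes decided in favour
# of the recipient, and the median time to complete the procedure.
disputes = [("user", 60), ("platform", 70), ("user", 55),
            ("other", 90), ("platform", 67)]

total = len(disputes)
user_wins = sum(1 for outcome, _ in disputes if outcome == "user")
median_days = statistics.median(days for _, days in disputes)

print(f"Disputes: {total}, user win share: {user_wins / total:.0%}, "
      f"median resolution: {median_days} days")
```

The same aggregation, run over a full year of records, yields exactly the headline figures a transparency report must publish.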
For Suspension Statistics: YouTube reports: 'Article 23 suspensions in 2024: 12,450 account suspensions for manifestly illegal content (8,200 automated based on hash-matching, 4,250 human-decided). 3,670 suspensions of reporting privileges for manifestly unfounded complaints (95 automated based on submission patterns, 3,575 human-decided). Automation rate: 66% for content suspensions, 2.6% for complaint suspensions.' This reveals YouTube heavily automates content suspensions but requires human review for most complaint abuse cases.
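The automation rates quoted in the YouTube example above follow directly from the paragraph 1(b) requirement to separate automated from human-decided suspensions. This sketch reproduces that arithmetic using the example's figures.

```python
# Figures from the illustrative YouTube example above.
content_suspensions = {"automated": 8_200, "human": 4_250}    # illegal content
complaint_suspensions = {"automated": 95, "human": 3_575}     # unfounded complaints

def automation_rate(counts):
    """Share of suspensions taken by automated means rather than personnel."""
    total = counts["automated"] + counts["human"]
    return counts["automated"] / total

print(f"Content suspensions: {automation_rate(content_suspensions):.0%} automated")
print(f"Complaint suspensions: {automation_rate(complaint_suspensions):.1%} automated")
```

The contrast (roughly 66% versus 2.6%) is precisely the kind of pattern this disclosure makes visible.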
For Automated Moderation Disclosure: Instagram reports: 'We use automated means for: (1) CSAM detection using PhotoDNA hash-matching - 99.8% accuracy, 0.2% false positive rate, all positives human-verified before removal; (2) Hate speech detection using machine learning - 89% accuracy, 11% error rate, borderline cases escalated to human review; (3) Spam detection using behavioral analysis - 94% accuracy, 6% error rate, users can appeal all automated removals per Article 20.' This transparency enables assessing Instagram's automation reliability and whether safeguards are adequate.
For Commission Database Submission: Every time Twitter (X) removes content or suspends accounts, the Article 17 reason statement gets submitted to the Commission database: 'Post removed for hate speech incitement. Content stated: [description]. Violates Article 20(1)(c) of EU hate speech law because it explicitly calls for violence against [group]. Automated detection flagged, human reviewer confirmed. User may appeal via Article 20.' The Commission can analyze millions of such decisions, identifying patterns: Is Twitter disproportionately removing certain viewpoints? Are appeals consistently rejected? Are automated systems reliable?
For Cross-Platform Analysis: Researchers access the Commission database and analyze: 'Comparing TikTok, Instagram, and YouTube hate speech removals: TikTok removes 2.3% of reported hate speech content, Instagram removes 4.1%, YouTube removes 5.8%. Either platforms have different user bases/content, or they apply different standards. Further investigation needed.' This database enables empirical assessment previously impossible.
For Accuracy Rate Skepticism: A platform reports '98% accuracy for automated nudity detection.' Civil society organizations challenge this: 'We've documented dozens of false positives where breastfeeding photos and medical content were wrongly removed. How did you calculate 98%? What's the denominator?' The Coordinator requests detailed methodology under paragraph 3. Platform must disclose: what counts as 'accurate,' whether appeals overturning decisions count as errors, sample sizes used for validation.
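The denominator question raised above is concrete: the same moderation record yields very different headline accuracy figures depending on whether decisions overturned on appeal count as errors. This sketch uses invented numbers purely to illustrate the sensitivity; it is not any platform's actual methodology.

```python
# Invented figures. A platform's "accuracy" claim depends heavily on what
# counts as an error: only errors caught in its own validation sample, or
# also decisions later overturned through Article 20 appeals.
automated_decisions = 100_000
validation_errors = 2_000        # errors the platform's own validation found
overturned_on_appeal = 3_500     # successful Article 20 appeals (assumed disjoint)

accuracy_validation_only = 1 - validation_errors / automated_decisions
accuracy_with_appeals = 1 - (validation_errors + overturned_on_appeal) / automated_decisions

print(f"Accuracy (validation sample only): {accuracy_validation_only:.1%}")
print(f"Accuracy (counting overturned appeals): {accuracy_with_appeals:.1%}")
```

A 98% claim quietly becomes 94.5% once overturned appeals are counted, which is exactly why a Coordinator's paragraph 3 request for methodology matters.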
For Small Platform Exemptions: A startup platform with 2 million EU users and 30 employees qualifies as small enterprise under Article 19. They're exempt from Article 24 except paragraph 3 - they still must report user numbers when requested by Coordinators, enabling monitoring for potential VLOP designation if they grow. But they skip reporting disputes, suspensions, and automation details until they exceed SME thresholds.
For Regulatory Oversight: The German Digital Services Coordinator analyzes aggregated Article 24 reports across platforms and identifies a pattern: platforms report extremely low automated suspension rates but high automated detection rates. This suggests platforms use automation to detect violations but require human review before suspending - good practice balancing efficiency with fairness. The Coordinator shares this as best-practice guidance.
For Public Accountability: Journalists access platforms' Article 24 reports and Commission database, publishing analysis: 'Major platforms suspended 250,000+ accounts for illegal content in 2024. 85% were automated suspensions with minimal human review. User win rates in dispute settlement average 42%, suggesting platforms over-moderate.' This public scrutiny pressures platforms to improve moderation accuracy.
For Machine-Readable Format: TikTok publishes user numbers as structured JSON data: {"platform":"TikTok", "period":"Jan-Jun 2024", "average_monthly_active_users_EU":134000000, "methodology":"unique logged-in users accessing service at least once per month"}. Automated monitoring systems can parse this, tracking all platforms' growth toward VLOP thresholds without manual data entry.
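A monitoring tool can consume such a disclosure with a few lines of code. This sketch parses the illustrative JSON shown above and checks it against the Article 33 threshold; the field names mirror that example, not an official schema.

```python
import json

# The illustrative machine-readable disclosure from the example above.
raw = ('{"platform":"TikTok", "period":"Jan-Jun 2024", '
       '"average_monthly_active_users_EU":134000000, '
       '"methodology":"unique logged-in users accessing service '
       'at least once per month"}')

VLOP_THRESHOLD = 45_000_000
record = json.loads(raw)
exceeds = record["average_monthly_active_users_EU"] >= VLOP_THRESHOLD

print(f'{record["platform"]} ({record["period"]}): '
      f'{record["average_monthly_active_users_EU"]:,} users; '
      f'VLOP threshold exceeded: {exceeds}')
```

Because the format is structured, the same parser works across every platform's disclosure, which is what makes automated threshold monitoring feasible.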
For Temporal Analysis: Regulators track platform growth over time: 'Platform X reported 38M users (H1 2023), 41M (H2 2023), 44M (H1 2024), 47M (H2 2024). They crossed VLOP threshold in H2 2024. Per Article 33, they have four months from designation to comply with VLOP obligations.' The six-month reporting enables timely designation before platforms become too established to regulate.
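The temporal tracking described above reduces to finding the first reporting period at or above the threshold. This sketch uses the half-year figures from the example; the scan itself is an assumption about how a regulator's tooling might work, not a prescribed procedure.

```python
# Half-year figures from the illustrative Platform X example above.
VLOP_THRESHOLD = 45_000_000
reports = [("H1 2023", 38_000_000), ("H2 2023", 41_000_000),
           ("H1 2024", 44_000_000), ("H2 2024", 47_000_000)]

def first_crossing(reports, threshold=VLOP_THRESHOLD):
    """Return the first reporting period at or above the threshold, else None."""
    for period, users in reports:
        if users >= threshold:
            return period
    return None

print(first_crossing(reports))  # the period in which designation is triggered
```

Run over all platforms' published figures, this identifies designation candidates each reporting cycle without manual review.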