Article 37

Audit report

1. Upon receipt of the audit report referred to in Article 36, the provider of very large online platforms or of very large online search engines shall communicate the audit report, and, where applicable, an accompanying implementation report setting out the actions that the provider has taken or intends to take, where applicable, to address any shortcomings identified in the audit report, to the Digital Services Coordinator of establishment and to the Commission, without undue delay.

2. The provider of very large online platforms or of very large online search engines shall make publicly available a clear and comprehensible summary of the audit report. The provider shall ensure that the summary does not contain confidential information, in particular in relation to business secrets.

Understanding This Article

Article 37 closes the audit accountability loop established by Article 36. Conducting audits is valuable, but their value multiplies when findings reach regulators (enabling enforcement) and the public (enabling democratic accountability). Article 37 requires platforms to submit full audit reports to regulators and to publish public summaries, converting a private audit process into a transparent compliance mechanism.

Paragraph 1 establishes regulatory reporting: platforms must 'communicate the audit report' to their Digital Services Coordinator (DSC, the national regulator of the Member State where the platform is established) and to the Commission 'without undue delay'. This timing requirement prevents platforms from sitting on unfavorable audit findings or delaying submission until problems are fixed. If Meta's audit completes in January, Meta must submit it to the Irish DSC and the Commission promptly (likely within days or weeks, not months). The phrase 'where applicable, an accompanying implementation report' requires platforms to explain remedial actions for identified shortcomings. If an audit finds Facebook's risk assessment inadequate or its mitigation ineffective, Meta must describe: what actions have been taken to address the findings, what actions are planned, the timeline for implementation, and how effectiveness will be measured. This transforms audit findings from criticism into action plans.

The submission to both DSC and Commission serves dual purposes: the DSC (national regulator) has primary supervisory responsibility for platforms established in its Member State, while the Commission has an oversight role for VLOPs given their EU-wide significance. The Irish DSC supervises Meta day-to-day, but the Commission monitors to ensure consistent VLOP enforcement across Member States. Sending audit reports to both regulators enables coordinated supervision, prevents national regulators from being too lenient with 'their' platforms, and empowers the Commission to intervene if needed.

Paragraph 2 mandates public transparency: platforms must 'make publicly available a clear and comprehensible summary of the audit report'. Not the full detailed audit report (which may contain proprietary information, algorithm details, or security vulnerabilities), but a summary accessible to the general public. 'Clear and comprehensible' means avoiding technical jargon and obfuscation: the summary should enable informed readers to understand the audit's scope, methodology, findings, and conclusions. If YouTube's audit examined its recommendation algorithm, the summary should explain what the auditor examined, what they found, whether YouTube complies with its obligations, and what deficiencies were identified. The requirement to exclude 'confidential information, in particular in relation to business secrets' balances transparency with legitimate commercial confidentiality. Platforms need not disclose algorithm source code, trade secrets, or competitively sensitive information, but must disclose enough for public accountability.

Article 37's power lies in democratizing compliance oversight. Pre-DSA, platforms self-reported compliance selectively, regulators lacked detailed technical information, and the public had minimal visibility into platform operations. Under Article 37, independent experts examine platforms thoroughly, regulators receive detailed findings, and the public can read summaries. If TikTok's audit finds its data localization claims unverified, that becomes public knowledge. If Twitter's audit identifies crisis response deficiencies, civil society can demand improvements. If LinkedIn's audit reveals algorithmic bias, affected users can advocate for fixes. Transparency converts compliance from a backroom process into a public accountability mechanism.

Article 37 also creates enforcement triggers. Regulators reviewing audit reports can identify compliance failures requiring investigation, penalties, or mandatory improvements. If an audit finds that a platform ignored its obligations or made false representations, regulators have documented evidence from independent experts. Audit reports become the foundation for enforcement actions, not just advisory documents. The combination of Article 36 (conduct audits), Article 37 (report findings), and Chapter IV (regulatory enforcement) creates comprehensive oversight: audit → report → supervise → enforce.

Key Points

  • Platforms must submit audit reports to their Digital Services Coordinator (DSC) and the Commission 'without undue delay'
  • If the audit identifies shortcomings, the platform must submit an implementation report describing remedial actions taken or planned
  • The platform must publish a 'clear and comprehensible summary' of the audit report
  • The summary must not contain confidential information, in particular business secrets
  • Creates a transparency loop: auditors examine → platforms report to regulators → the public learns the findings
  • Regulators receive detailed audit reports, enabling informed supervision
  • The public summary enables civil society, researchers, and media to monitor platform compliance
  • The implementation report creates accountability for addressing audit findings

Practical Application

For Meta (Facebook/Instagram - Regulatory Submission): After Meta's annual independent audit completes (e.g., March 2024), Meta receives a comprehensive audit report from the auditing organization. The report documents: the auditor's examination of Article 34 risk assessments, testing of Article 35 mitigation effectiveness, verification of Section 5 compliance, findings about deficiencies or non-compliance, and recommendations for improvement. Meta must: (1) Submit the full audit report to Coimisiún na Meán (Ireland's Digital Services Coordinator, as Meta is established in Ireland) and the European Commission without undue delay - within days or weeks, not months. The submission includes the complete audit documentation: auditor methodology, data examined, interviews conducted, testing results, findings, conclusions, and recommendations. (2) If the audit identifies shortcomings (e.g., 'Meta's risk assessment underestimated teen mental health impacts from the Instagram Reels algorithm'), prepare an implementation report detailing: acknowledgment of the finding, root cause analysis, corrective actions already taken, planned improvements with a timeline, metrics for measuring effectiveness, and the responsible executives. Submit the implementation report with the audit report. (3) Regulators review: the Irish DSC and the Commission analyze the audit findings, assess Meta's implementation commitments, and determine whether additional regulatory action is needed (investigation, enforcement, mandatory improvements). (4) Meta uses the audit findings internally: updating risk assessments, strengthening mitigation, improving processes, and allocating resources to address deficiencies. Next year's audit verifies whether the improvements were implemented effectively.

For Meta (Public Summary Publication): Separately, Meta must publish a public summary of the audit report in a prominent location (e.g., the Meta Transparency Center website). The summary must be 'clear and comprehensible' to informed general readers, not just technical experts. Example summary structure: (1) Introduction: 'Meta commissioned [Auditor Name], an independent auditing organization, to conduct the annual DSA Article 36 audit covering the period January-December 2023.' (2) Audit Scope: 'The auditor examined: systemic risk assessments for Facebook and Instagram EU operations, risk mitigation measure effectiveness, content moderation systems, advertising transparency, recommender system compliance, crisis response protocols, and compliance function adequacy.' (3) Methodology: 'The auditor reviewed documentation, interviewed personnel, tested systems, and analyzed data representing X million EU users and Y billion content decisions.' (4) Key Findings: 'The auditor found Meta substantially compliant with DSA Section 5 obligations, with the following areas for improvement: (a) the risk assessment should expand its analysis of the Reels algorithm's impact on teen mental health; (b) advertising transparency tools require usability improvements for non-technical users; (c) crisis response protocols should include more frequent simulation exercises.' (5) Meta Response: 'Meta has implemented or is implementing the following actions: enhanced teen mental health risk analysis completed Q1 2024, advertising transparency UI redesign launched Q2 2024, quarterly crisis simulation exercises initiated.' (6) Conclusion: 'The auditor concluded that Meta's DSA compliance framework is generally robust, with identified improvements underway.' The summary excludes confidential details (algorithm parameters, specific vulnerability information, trade secrets) but provides enough substance for public accountability. Civil society organizations, researchers, journalists, and policymakers can understand Meta's compliance status and hold the company accountable.
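
A compliance team might run a mechanical completeness check over a draft summary before legal review. The sketch below is hypothetical: the section names follow the six-part example structure above, and the keyword screen is deliberately naive - a real confidentiality review under paragraph 2 would be done by counsel, not by string matching.

```python
# Hypothetical pre-publication checklist for an Article 37(2) public summary.
# Section names mirror the illustrative six-part structure; markers are examples.

REQUIRED_SECTIONS = [
    "introduction", "audit_scope", "methodology",
    "key_findings", "platform_response", "conclusion",
]

# Crude red-flag phrases; only a first-pass screen, not a legal determination.
CONFIDENTIAL_MARKERS = ["trade secret", "source code", "vulnerability detail"]

def check_public_summary(sections: dict[str, str]) -> list[str]:
    """Return a list of detected problems; an empty list means the draft passes."""
    problems = [f"missing section: {name}"
                for name in REQUIRED_SECTIONS
                if not sections.get(name, "").strip()]
    for name, text in sections.items():
        for marker in CONFIDENTIAL_MARKERS:
            if marker in text.lower():
                problems.append(f"possible confidential content in '{name}': {marker}")
    return problems
```

Such a check only enforces structure; the substantive judgment about what is 'clear and comprehensible', and what counts as a business secret, remains with the platform and its regulators.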

For TikTok (High-Stakes Public Summary): TikTok's audit report summary carries heightened significance given ongoing scrutiny of its Chinese ownership and data practices. The public summary becomes a critical trust signal. If the audit confirms TikTok's claims (EU data localized, no Chinese government access, independent content moderation), the summary builds credibility: 'The independent auditor verified that EU user data is stored exclusively in European data centers with access controls preventing transfer to China. The auditor examined content moderation decisions and found no evidence of Chinese government influence. The auditor tested For You algorithm operations, confirming that measures protecting minors function as described.' This transparent validation counters skepticism. Conversely, if the audit identifies issues, the summary must honestly disclose them: 'The auditor identified instances where EU user data was accessed from non-EU locations contrary to company policy. TikTok's implementation report commits to technical controls preventing unauthorized access, with a completion date of Q3 2024.' An attempted cover-up would backfire spectacularly if leaked. TikTok's strategy: commission a rigorous audit, address findings promptly, and publish a transparent summary demonstrating good-faith compliance even where imperfect. The public summary becomes both an accountability mechanism and a trust-building opportunity.

For Twitter/X (Documenting Post-Ownership Changes): Twitter/X's audit report becomes a historical record of the post-Musk ownership's impact on DSA compliance. The public summary documents: 'The auditor examined the impact of 2022-2023 organizational changes on DSA obligations. Findings: (1) staff reductions decreased content moderation capacity from X reviewers to Y reviewers, increasing average review time from A hours to B hours; (2) policy changes regarding account reinstatement created Z new risks of coordinated inauthentic behavior; (3) verification system modifications reduced identity verification rigor; (4) crisis response protocols remain functional, but capacity constraints create response time delays.' The summary provides objective documentation that cuts through politicized debates. If the audit finds compliance degraded, the summary creates public pressure for corrective action. If the audit finds compliance maintained despite the changes, the summary validates Twitter's approach. The implementation report must describe commitments: 'Twitter commits to: reinstating a dedicated election integrity team by Q2 2024, implementing enhanced verification for high-profile reinstated accounts, and conducting a crisis response capacity assessment with mitigation plans by Q3 2024.' The public can monitor whether these commitments are fulfilled in the next audit cycle.

For LinkedIn (Professional Platform Audit Transparency): LinkedIn's audit report summary focuses on professional platform compliance: 'The independent audit examined LinkedIn's job recommendation algorithm fairness, recruiter tool usage patterns, credential verification processes, and employment scam detection effectiveness. Key findings: (1) algorithm bias testing demonstrated generally fair outcomes, with minor age-related disparities in certain job categories requiring adjustment; (2) recruiter tool usage analysis identified a small number of accounts using potentially discriminatory filters - these accounts were suspended pending investigation; (3) credential verification sampling found 98% accuracy, with improvements recommended for international degree validation; (4) employment scam detection prevented X fraudulent listings, with Y false positives requiring filter refinement.' The summary demonstrates transparency about both successes and areas for improvement, building trust with a professional user base concerned about fairness and integrity. The implementation report details algorithm bias corrections, recruiter tool policy updates, credential verification enhancements, and scam detection recalibration - demonstrating responsive compliance rather than defensive denial.

For Regulators (Using Audit Reports for Supervision): Digital Services Coordinators and the Commission use audit reports as the foundation for informed supervision. The Irish DSC, on receiving Meta's audit report, analyzes: Are the identified deficiencies significant compliance failures? Do they indicate systemic problems? Is the implementation report adequate? Should the DSC conduct its own investigation? Does the situation warrant enforcement action, or is the platform's committed remediation sufficient? The DSC may: (1) accept the audit findings and monitor implementation, (2) request additional information or clarification, (3) conduct its own examination of the identified issues, (4) initiate a formal investigation if the audit reveals serious violations, or (5) issue binding decisions requiring specific actions beyond the platform's implementation commitments. The Commission reviews audit reports from all VLOPs, identifying cross-platform compliance patterns, inadequate national supervision, or the need for EU-wide guidance. If multiple platform audits identify similar deficiencies (e.g., consistently inadequate crisis protocols), the Commission can issue guidance improving future compliance. Audit reports are thus transformed from platform documents into regulatory intelligence enabling sophisticated oversight.