Article 36

Independent audit

1. Providers of very large online platforms and of very large online search engines shall carry out audits to assess compliance with the following:

(a) the obligations set out in this Section;

(b) any commitments undertaken under the codes of conduct referred to in Article 45;

(c) any commitments taken in the crisis protocols referred to in Article 48.

2. The audits shall be carried out by organisations which are independent of the providers of very large online platforms and of very large online search engines and which possess the appropriate competence in the fields of risk management, technical expertise and the drawing up of audit reports. Those organisations shall be subject to approval in accordance with paragraph 4.

3. For the purposes of paragraph 1, the providers of very large online platforms and of very large online search engines shall:

(a) conduct the audits at their own expense and at least once a year;

(b) ensure that auditors have access to all relevant data and premises and the full cooperation of the provider of the very large online platform or of the very large online search engine, as applicable;

(c) consider the results of the audits and take any necessary remedial action in a timely manner.

4. The Commission, assisted by the Board, may adopt implementing acts with a view to laying down standard terms of reference for audits. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 88(2).

Understanding This Article

Article 36 creates the independent verification mechanism for VLOP obligations - especially Articles 34-35's risk assessment and mitigation duties. Without auditing, risk management could become performative: platforms document risks and claim mitigation, but face no external accountability. Article 36 mandates independent expert review of whether platforms genuinely comply with their obligations, measure risks accurately, implement effective mitigation, and follow through on commitments.

Paragraph 1 defines audit scope: (a) All Section 5 obligations (Articles 33-43, including risk assessment, mitigation, crisis protocols, data access, compliance function); (b) Code of conduct commitments under Article 45 (industry self-regulatory agreements); (c) Crisis protocol commitments under Article 48 (rapid response mechanisms). This comprehensive scope means auditors examine not just whether risk assessments exist, but whether they're rigorous, whether mitigation measures are effective, whether crisis responses function, whether commitments are fulfilled.

Paragraph 2 establishes auditor requirements: independence, competence, and approval. 'Independent of the providers' prevents conflicts of interest - auditors cannot be subsidiary entities, financially dependent consultants, or otherwise beholden to platforms. Consulting firms that also provide paid services to platforms face independence questions. Competence in 'risk management, technical expertise and the drawing up of audit reports' means auditors need multidisciplinary skills: understanding of algorithmic systems, content moderation operations, data analytics, regulatory compliance, and audit methodology. 'Subject to approval' means auditing organisations must be approved before conducting these audits, creating quality control for the audit industry.

Paragraph 3 establishes audit logistics. (a) Platforms pay the audit costs and must conduct audits 'at least once a year'. The annual cycle aligns with Article 34's annual risk assessments - each year's audit reviews the previous year's assessment and mitigation. Platforms bear the costs, preventing 'you get what you pay for' dynamics where cheap audits produce weak findings. (b) Platforms must give auditors 'access to all relevant data and premises' and full cooperation. Auditors can request algorithm source code, content moderation training materials, internal communications, risk assessment documentation, mitigation effectiveness metrics, and user data (subject to privacy protections). Platforms cannot hide embarrassing information or obstruct audits. (c) Platforms must 'consider the results of the audits' and 'take any necessary remedial action in a timely manner'. If an audit finds the risk assessment inadequate or mitigation ineffective, the platform must address the findings, not ignore them. 'Timely' prevents indefinite delay on fixes.

Paragraph 4 authorizes the Commission, assisted by the Board (the European Board for Digital Services, a regulatory coordination body), to adopt standard audit terms of reference. Standard terms could specify audit methodology, documentation requirements, sampling approaches, testing procedures, reporting format, and timelines. This creates consistency across platform audits while leaving room for auditor professional judgment on platform-specific risks.

Article 36's power lies in creating triangular accountability: platforms self-assess (Article 34), regulators supervise (Chapter 4), and independent auditors verify (Article 36). Auditors bring technical expertise regulators may lack, independence platforms lack, and external credibility self-assessments lack. If YouTube claims recommendation algorithm changes reduced radicalization risk, auditors can examine the algorithm code, test recommendation patterns, analyze user exposure data, interview machine learning engineers, and verify whether the mitigation actually worked. If Facebook claims advertising restrictions address discrimination, auditors can audit the ad targeting systems, test for discriminatory outcomes, review advertiser compliance, and verify enforcement actions.
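The discriminatory-outcome testing mentioned above can be sketched statistically. Below is a minimal, purely illustrative Python example using a two-proportion z-test on hypothetical per-group ad-delivery counts - an assumption for illustration, not any platform's or auditor's actual methodology:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: is the ad-delivery rate for group A
    significantly different from group B? Returns the z statistic."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical sampled counts for a housing ad: group A saw it 540 times
# and group B 390 times, out of 10,000 eligible impressions each.
z = two_proportion_z(540, 10_000, 390, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a disparity worth investigating
```

A real audit would control for legitimate targeting factors before drawing conclusions; the point here is only that 'test for discriminatory outcomes' reduces to concrete, repeatable statistics.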

Article 36 also creates a new industry: DSA auditing. Major consulting and accounting firms, specialized digital platform auditors, academic research institutions, and civil society watchdog organizations may seek approval as auditors. This creates a market for audit services, competitive dynamics on audit quality, and potential auditor specialization (e.g., algorithm auditors, content moderation auditors, child safety auditors).

Key Points

  • VLOPs/VLOSEs must undergo annual independent audits at their own expense
  • Audits verify compliance with: Articles 34-35 (risk assessment/mitigation), all Section 5 obligations, code of conduct commitments, crisis protocol commitments
  • Auditors must be independent of platform, possess appropriate competence in risk management and technical expertise, subject to approval
  • Platforms must provide auditors full access to data, premises, and cooperation
  • Platforms must consider audit results and take timely remedial action for identified issues
  • Commission may adopt standard audit terms of reference via implementing acts
  • Creates external verification mechanism for self-assessment (Articles 34-35)
  • Audits complement regulatory supervision with independent expert review

Practical Application

For Meta (Facebook/Instagram VLOPs): Meta must procure an annual independent audit from an approved auditing organization. Process:

1. Select an independent auditor (e.g., a major consulting firm with platform expertise, or a specialized digital audit organization) from the Commission-approved list. Ensure auditor independence - no significant consulting contracts, no financial ties, no shared personnel.
2. Provide the auditor comprehensive access: risk assessment reports, mitigation implementation documentation, algorithm code and documentation, content moderation statistics and training materials, advertising system architecture, crisis response procedures, and code of conduct compliance evidence.
3. The auditor conducts the examination: reviews the Article 34 risk assessment for rigor (did Meta identify all significant risks? use appropriate methodology?), tests Article 35 mitigation effectiveness (did News Feed algorithm changes reduce polarization? are advertising restrictions enforced?), verifies compliance with all Section 5 obligations, examines code of conduct commitments, and tests crisis protocol readiness.
4. The auditor produces an audit report documenting findings, identifying deficiencies, and recommending remediation.
5. Meta reviews the findings and implements remedial actions: if the audit finds the risk assessment underestimated teen mental health impacts, conduct a supplemental assessment; if mitigation measures proved ineffective, strengthen interventions; if code compliance failed, address the gaps.
6. Document remedial actions for next year's audit and regulatory review.

Audit cost: potentially millions of euros for a comprehensive examination of global platforms with billions of users. The cycle repeats annually, and audit results or summaries must be published (a transparency requirement elsewhere in the DSA).
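One concrete verification step in such an examination - re-reviewing a random sample of content moderation decisions - can be sketched with a simple confidence interval. A minimal Python illustration with hypothetical numbers (using the Wilson score interval; actual audit methodology would be set by the terms of reference, not by this sketch):

```python
import math

def wilson_interval(errors, n, z=1.96):
    """Wilson score ~95% confidence interval for the error rate in a
    random sample of moderation decisions re-reviewed by the auditor."""
    p = errors / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical audit sample: 38 wrong decisions found in 1,000 re-reviewed cases.
lo, hi = wilson_interval(38, 1_000)
print(f"estimated error rate: 3.8%, 95% CI [{lo:.1%}, {hi:.1%}]")
```

The interval tells the auditor how precisely the sample pins down the platform-wide error rate, and whether it clears or breaches any agreed threshold.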

For TikTok (Chinese-Owned VLOP): TikTok faces heightened audit scrutiny given geopolitical concerns. Must ensure auditor independence is unquestionable - select EU-based auditing organization without Chinese government ties. Audit scope includes standard Article 36 requirements plus likely enhanced scrutiny of: data localization claims (verify EU data stays in EU), Chinese government influence (examine content moderation decisions for political interference evidence), algorithm operations (verify For You algorithm for EU users operates as claimed), minor protection measures (test age verification, content filtering for teens). Given external skepticism, TikTok may need more frequent audits (e.g., semi-annual) or specialized supplemental audits (e.g., dedicated data security audit, algorithm transparency audit). Auditor must verify TikTok's claims about Chinese non-interference through: technical architecture review, access control examination, data flow monitoring, content moderation decision sampling, internal communications review. Audit report becomes critical trust signal - if independent auditor confirms TikTok's claims, builds credibility; if auditor identifies issues, intensifies regulatory scrutiny.
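The data flow monitoring mentioned above could, at its simplest, mean scanning transfer logs for destinations outside the declared EU perimeter. A toy Python sketch - the log format, field names, and region identifiers are hypothetical assumptions for illustration, not TikTok's actual infrastructure:

```python
# Hypothetical set of regions inside the declared EU data perimeter.
EU_REGIONS = {"eu-central-1", "eu-west-1", "eu-north-1"}

def non_eu_transfers(transfer_log):
    """Flag log entries whose destination region falls outside the declared
    EU perimeter (a simplified stand-in for real data-flow monitoring)."""
    return [e for e in transfer_log if e["dest_region"] not in EU_REGIONS]

log = [
    {"record_id": 1, "dest_region": "eu-central-1"},
    {"record_id": 2, "dest_region": "ap-east-1"},   # would be flagged
    {"record_id": 3, "dest_region": "eu-west-1"},
]
print(non_eu_transfers(log))
```

A real verification would also have to establish that the logs themselves are complete and tamper-evident - which is why the article pairs data-flow monitoring with architecture review and access control examination.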

For Twitter/X (Post-Ownership VLOP): Twitter/X's rapid post-2022 changes create unique audit challenges. Audit must examine whether organizational changes (staff reductions, policy shifts, system modifications) maintained or degraded DSA compliance. Auditor reviews: Article 34 risk assessment comprehensiveness given reduced staff (did Twitter accurately assess risks from changes?), Article 35 mitigation adequacy with smaller teams (are remaining measures effective?), crisis protocol functionality (can Twitter respond to elections/emergencies with reduced capacity?), content moderation consistency (did policy changes increase harmful content?). If audit finds compliance degraded, Twitter must address: rehire moderation staff, reverse risky policy changes, restore suspended systems, strengthen remaining safeguards. Audit provides objective assessment cutting through debates about 'free speech absolutism' vs regulatory compliance - either Twitter meets obligations or it doesn't, regardless of owner philosophy. Annual audit cycle creates regular checkpoints preventing compliance drift.

For LinkedIn (Professional Platform VLOP): LinkedIn audit focuses on professional platform risks. Auditor examines: job algorithm bias testing (verify LinkedIn conducts rigorous fairness audits), recruiter tool usage (test for discriminatory filtering), credential verification (sample professional claims for accuracy), employment scam detection (assess fraud prevention effectiveness), professional harassment prevention (examine reporting/response systems). Auditor interviews: data scientists about algorithm fairness, trust & safety teams about harassment handling, product managers about recruiter tools, users about professional integrity concerns. Auditor tests: job recommendations for bias patterns, credential verification for fake qualifications, scam detection for common frauds. Audit report addresses LinkedIn-specific risks (labor market fairness, professional integrity) not just generic platform concerns (viral misinformation). LinkedIn's Article 36 audit demonstrates that different platform types require specialized audit approaches - LinkedIn auditor needs employment law and labor economics expertise, not just social media knowledge.
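The job-recommendation bias testing described above is often operationalized as a disparate-impact ratio. A minimal Python sketch applying the US EEOC-style four-fifths rule to hypothetical counts (illustrative only; not LinkedIn's actual fairness methodology, and EU auditors might apply different fairness criteria):

```python
def selection_rates(recommended, shown):
    """Per-group rate at which a job posting was recommended to users."""
    return {g: recommended[g] / shown[g] for g in shown}

def four_fifths_check(rates):
    """Four-fifths rule: the lowest group's rate should be at least 80%
    of the highest group's; below that is evidence of adverse impact."""
    ratio = min(rates.values()) / max(rates.values())
    return ratio, ratio >= 0.8

# Hypothetical counts: users shown vs recommended a senior-role posting.
shown = {"group_a": 5_000, "group_b": 5_000}
recommended = {"group_a": 600, "group_b": 420}
ratio, ok = four_fifths_check(selection_rates(recommended, shown))
print(f"disparate-impact ratio = {ratio:.2f}, passes four-fifths rule: {ok}")
```

Here group B's 8.4% rate is only 70% of group A's 12%, so the check fails and the auditor would dig into whether legitimate, job-related factors explain the gap.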

For Auditing Organizations: Organizations seeking Commission approval as DSA auditors must demonstrate:

1. Independence: no significant financial ties to platforms, no consulting contracts creating conflicts, a transparent ownership structure, and ethical guidelines preventing compromised audits.
2. Competence: staff with algorithm expertise, content moderation knowledge, data analytics skills, audit methodology training, platform operations understanding, and regulatory compliance experience - multidisciplinary teams combining technical experts, policy specialists, and audit professionals.
3. Methodology: rigorous audit procedures, sampling approaches, testing frameworks, documentation standards, quality control, and peer review.
4. Resources: sufficient capacity to audit major platforms (reviewing billions of content decisions, examining complex algorithms, analyzing massive datasets).
5. Reputation: a track record of rigorous, independent analysis; resistance to client pressure; professional standing.

Major candidates: Big Four accounting firms (if they can maintain independence), specialized platform audit firms, academic consortiums, and reputable civil society organizations with technical capacity. Article 36 creates a market for platform auditing services - organizations invest in capabilities, platforms select auditors, regulators approve qualified organizations. A quality auditing industry emerges as a DSA enforcement pillar.
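The sampling approaches an auditing organization must demonstrate typically start with a sample-size calculation: how many content decisions must be drawn at random to estimate a non-compliance rate within a given margin? A standard Python sketch (illustrative; real terms of reference may prescribe different confidence levels and margins):

```python
import math

def sample_size(z=1.96, margin=0.01, p=0.5):
    """Minimum random-sample size to estimate a proportion (e.g. the share
    of non-compliant moderation decisions) within +/- margin at ~95%
    confidence. p=0.5 is the conservative worst case for variance."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

print(sample_size())             # within +/- 1 percentage point
print(sample_size(margin=0.03))  # within +/- 3 percentage points
```

The striking implication: even for a platform making billions of decisions, a random sample of roughly ten thousand suffices for a one-point margin - which is why rigorous sampling, rather than exhaustive review, is the backbone of platform audit methodology.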