Article 72

Monitoring actions

1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by providers of the very large online platform and of the very large online search engines.

The Commission may order them to provide access to, and explanations relating to, its databases and algorithms.

Such actions may include imposing an obligation on the provider of the very large online platform or of the very large online search engine to retain all documents deemed to be necessary to assess the implementation of and compliance with the obligations under this Regulation.

2. The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors, as well as experts and auditors from competent national authorities with the agreement of the authority concerned, to assist the Commission in monitoring the effective implementation and compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the Commission.

Understanding This Article

Article 72 establishes the Commission's ongoing monitoring framework, enabling continuous oversight of very large online platforms (VLOPs) and very large online search engines (VLOSEs) to verify effective DSA implementation and compliance. The provision creates a proactive supervision mechanism complementing the reactive investigation powers of Article 66 - the Commission need not await a complaint or evidence of a violation before monitoring platform practices. Monitoring actions include database and algorithm access, document retention requirements, and expert/auditor appointments, providing the specialized technical capacity necessary for overseeing complex algorithmic systems.

Scope and Purpose: Article 72 monitoring authority applies exclusively to 'providers of the very large online platform and of the very large online search engines' - platforms designated under Article 33 on the basis of the 45 million monthly EU active users threshold. Smaller intermediary services, supervised by Member State Digital Services Coordinators under Article 51, are not subject to Commission Article 72 monitoring. A purpose limitation constrains monitoring to 'carrying out tasks assigned under this Section' - Section 4 of Chapter IV addresses 'Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines.' Tasks include: Article 66 initiation of proceedings and investigations, Article 67 requests for information, Article 68 interviews and statements, Article 69 inspections, Article 70 interim measures, Article 71 commitments, Article 73 non-compliance decisions, Article 74 fines, Article 75 enhanced supervision, and Article 76 periodic penalty payments. Article 72 monitoring serves these enforcement functions by enabling the Commission to gather evidence, verify compliance claims, assess systemic risk mitigation effectiveness, detect violations, and develop regulatory expertise about platform practices and algorithmic systems.

Database Access Powers: Commission may 'order them to provide access to, and explanations relating to, its databases' - enables examination of internal platform data repositories documenting content moderation decisions, user complaints, automated detection systems, advertising records, recommender system outputs, and other compliance-relevant information. Database access complements Article 67 information requests by providing direct Commission examination rather than relying solely on provider-curated disclosures. Practical scope includes: (1) Content moderation databases - records of content removal/restriction decisions, appeal outcomes, automated detection flagging, human reviewer assessments, enabling Commission to verify Article 16 notice-and-action, Article 17 statement of reasons, Article 20 internal complaint-handling compliance. (2) User complaint databases - documentation of user reports, platform responses, resolution timelines, enabling assessment of Article 11 points of contact, Article 20 internal complaint system effectiveness. (3) Advertising databases - advertiser identities, targeting parameters, ad delivery records, enabling verification of Article 26 advertising transparency, Article 30 ad repository requirements. (4) Recommender system databases - content ranking scores, personalization signals, algorithmic outputs, enabling assessment of Article 27 transparency and Article 38 recommender system parameters. (5) Risk assessment databases - systemic risk identification records, mitigation measure implementation, crisis protocol activation, enabling verification of Article 34 risk assessment and Article 35 mitigation obligations. Database access orders typically specify: information categories sought, time periods covered, data formats required, delivery timelines, confidentiality protections for commercially sensitive or personal data.
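The elements a database access order typically specifies can be pictured as a simple structured record. The following Python sketch is purely illustrative - the DSA prescribes no machine-readable format for such orders, and every field name here is hypothetical:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical schema for the elements a database access order typically
# specifies; the DSA does not define any such data structure.
@dataclass
class DatabaseAccessOrder:
    provider: str                 # designated VLOP/VLOSE
    categories: list[str]         # information categories sought
    period_start: date            # start of the covered time period
    period_end: date              # end of the covered time period
    data_format: str              # required delivery format
    delivery_deadline: date       # when the data must reach the Commission
    confidentiality: str = "commercially sensitive; professional secrecy applies"

    def covers(self, record_date: date) -> bool:
        """True if a record falls inside the ordered time period."""
        return self.period_start <= record_date <= self.period_end

order = DatabaseAccessOrder(
    provider="ExamplePlatform",
    categories=["content moderation decisions", "appeal outcomes"],
    period_start=date(2024, 1, 1),
    period_end=date(2024, 6, 30),
    data_format="JSON Lines, UTF-8",
    delivery_deadline=date(2024, 7, 31),
)
print(order.covers(date(2024, 3, 15)))   # True: March 2024 is in scope
print(order.covers(date(2023, 12, 31)))  # False: predates the covered period
```

The point of the sketch is that scope is defined up front - which categories, which time window, which format - so that both the provider and the Commission can mechanically test whether any given record falls under the order.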

Algorithm Access Powers: The Commission may order providers to provide 'explanations relating to, its algorithms' - a transparency obligation regarding the automated systems driving platform functionality. Algorithms subject to Commission access include: recommender systems determining content visibility and ranking (Articles 27 and 38), automated content moderation systems detecting illegal content and terms of service violations, targeted advertising algorithms selecting users for ad delivery (Article 26), search ranking algorithms determining result ordering (VLOSE-specific), fraud detection algorithms, spam filters, and age verification systems. 'Explanations' require providers to describe: algorithmic objectives and design principles, input signals and data sources processed, weighting and scoring methodologies, output generation and ranking logic, testing and validation procedures, performance metrics and accuracy rates, bias detection and mitigation measures, and human oversight mechanisms. Algorithm access does not necessarily require source code disclosure - the Commission seeks a functional understanding enabling compliance assessment rather than complete technical specifications. However, where explanations prove insufficient to evaluate compliance (e.g., providers give vague descriptions concealing discriminatory targeting or manipulative design), the Commission may demand more granular technical documentation under its Article 67 information request powers or Article 69 inspection authority.

Document Retention Obligations: Commission may impose an 'obligation on the provider to retain all documents deemed to be necessary to assess the implementation of and compliance with the obligations under this Regulation' - prevents evidence destruction or degradation during investigations. Document retention orders specify: document categories covered (internal communications, policy documents, technical specifications, training materials, meeting minutes, audit reports), retention duration (typically aligned with investigation timeline plus potential judicial review period), storage requirements (format preservation, integrity protection, accessibility provisions), and production procedures (how Commission accesses retained documents when needed). Retention obligations particularly important given: (1) Platform systems generate massive data volumes potentially overwritten or deleted absent preservation requirements, (2) Corporate document retention policies may routinely delete communications relevant to compliance assessments, (3) Evidence spoliation risk increases once providers aware of Commission scrutiny, (4) Reconstructing historical platform configurations, algorithmic parameters, policy implementations difficult without contemporaneous documentation. Document retention serves both evidentiary and deterrent functions - creates audit trail supporting enforcement while incentivizing compliance by ensuring violations discoverable ex post.
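In practice, a retention order operates as a hold that overrides a provider's routine deletion schedule. A minimal Python sketch, assuming a hypothetical 90-day routine purge window (retention policies vary by provider, and the function name is invented):

```python
from datetime import date, timedelta

# Assumed routine purge window for illustration; real policies vary.
ROUTINE_RETENTION = timedelta(days=90)

def may_delete(created: date, today: date, under_retention_order: bool) -> bool:
    """A document may be purged only when the routine window has lapsed
    AND no Commission retention order covers it."""
    if under_retention_order:
        return False  # the order overrides any routine deletion schedule
    return today - created > ROUTINE_RETENTION

today = date(2024, 6, 1)
old_document = date(2024, 1, 15)  # well past the 90-day window
print(may_delete(old_document, today, under_retention_order=False))  # True
print(may_delete(old_document, today, under_retention_order=True))   # False
```

The design choice the sketch illustrates: the hold check comes first, so no age-based rule can ever delete a covered document - exactly the spoliation-prevention function the paragraph describes.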

Independent External Experts - Paragraph 2: Commission may appoint 'independent external experts' to 'assist the Commission in monitoring the effective implementation and compliance' and 'provide specific expertise or knowledge.' Expert appointments address Commission's capacity constraints and knowledge gaps - DSA enforcement requires specialized technical understanding of: algorithmic systems, machine learning, natural language processing, content moderation technologies, recommender system design, online advertising ecosystems, platform architecture, data science, cybersecurity. Commission staff possess legal and regulatory expertise but may lack granular technical knowledge necessary for evaluating algorithmic compliance. External experts provide: (1) Algorithmic auditing capabilities - assess whether recommender systems, content moderation algorithms, ad targeting systems comply with DSA requirements, (2) Technical documentation review - evaluate platform explanations of algorithmic systems for accuracy and completeness, (3) Risk assessment evaluation - assess whether Article 34 risk assessments adequately identify systemic risks and Article 35 mitigation measures effectively address identified risks, (4) Best practice benchmarking - compare platform practices against industry standards and technical state-of-the-art, (5) Novel technology assessment - evaluate emerging technologies (generative AI, virtual reality, brain-computer interfaces) for compliance implications. Expert independence critical to credibility - appointed experts must lack conflicts of interest, financial relationships with assessed platforms, or advocacy positions biasing analysis.

Independent External Auditors - Paragraph 2: Commission may appoint 'independent external auditors' distinct from Article 37 annual audits commissioned by platforms themselves. Dual audit framework creates checks and balances: providers commission Article 37 audits assessing self-reported compliance, while Commission may appoint Article 72 auditors providing independent verification unfiltered by provider selection and payment. Commission-appointed auditors perform: (1) Verification audits confirming Article 37 audit findings accuracy, (2) Focused audits examining specific compliance areas Commission identifies as high-risk, (3) Follow-up audits verifying providers implemented corrective measures after violations discovered, (4) Ongoing monitoring audits throughout Article 66 investigation periods. Auditor independence ensured through: Commission selection and payment (providers cannot influence appointment or compensation), conflict-of-interest screening (auditors with recent provider relationships excluded), scope determination by Commission not audited provider, audit report delivery to Commission with publication at Commission's discretion. Article 72 auditors access platform systems, databases, algorithms, personnel, facilities as necessary to conduct audits - providers must cooperate, providing requested information and access, on pain of Article 74(2) fines for supplying incomplete or incorrect information.

National Authority Experts - Paragraph 2: Commission may appoint 'experts and auditors from competent national authorities with the agreement of the authority concerned' - enables EU-national cooperation leveraging Member State expertise and capacity. National authority contributions include: (1) Domain expertise in specific regulatory areas (e.g., French ARCOM expertise in audiovisual content, German BfDI expertise in data protection), (2) Language and cultural knowledge enabling assessment of local content moderation and recommendation practices, (3) Existing relationships with national platforms or subsidiaries providing contextual understanding, (4) Technical infrastructure and tools developed for national enforcement adaptable to Commission monitoring. 'Agreement of the authority concerned' requirement prevents Commission from unilaterally commandeering national staff - Member States retain sovereignty over their personnel deployment. However, mutual assistance obligations and EU cooperation principles create strong expectations that national authorities will agree to expert secondments absent resource constraints or conflicting national priorities. Expert appointments typically structured as temporary secondments with costs reimbursed by Commission, though some arrangements involve in-kind contributions where national authorities absorb costs as capacity-building investment.

European Centre for Algorithmic Transparency (ECAT): Commission established ECAT in April 2023 as primary institutional mechanism implementing Article 72 monitoring powers. Located within Joint Research Centre (JRC) and collaborating closely with DG CONNECT, ECAT provides: (1) Algorithmic expertise - staff with computer science, data science, machine learning, AI backgrounds capable of analyzing platform algorithms, (2) Technical monitoring infrastructure - tools, systems, methodologies for accessing and analyzing platform databases and algorithms, (3) Research coordination - liaison with academic researchers conducting platform studies under Article 40 vetted researcher access, (4) Best practice development - guidance on algorithmic risk assessment, mitigation, transparency, auditing methodologies, (5) Training and capacity building - educating Commission enforcement staff, national Digital Services Coordinators, civil society stakeholders on algorithmic systems. ECAT's Article 72 monitoring activities include: algorithm access requests to designated VLOPs/VLOSEs, database analysis examining content moderation patterns, recommender system testing, collaboration with Article 37 independent auditors to verify findings, expert consultations supporting Article 66 investigations. ECAT represents institutionalization of Article 72 monitoring - transforms ad hoc expert appointments into systematic technical enforcement capacity.

Relationship with Article 37 Annual Audits: Article 72 Commission-appointed monitoring complements but differs from Article 37 provider-commissioned annual audits. Key distinctions: (1) Selection - Article 37 auditors selected and paid by audited platforms (creating potential independence concerns despite conflict-of-interest rules), Article 72 experts/auditors selected and paid by Commission. (2) Frequency - Article 37 audits required annually, Article 72 monitoring conducted at Commission's discretion potentially continuously. (3) Scope - Article 37 audits assess compliance with all VLOP/VLOSE obligations comprehensively, Article 72 monitoring may focus on specific areas Commission prioritizes. (4) Reporting - Article 37 audit reports submitted to Commission and Digital Services Coordinator then publicly disclosed, Article 72 monitoring findings inform internal Commission enforcement decisions with public disclosure at Commission's discretion. (5) Purpose - Article 37 audits primarily serve provider self-assessment and public accountability, Article 72 monitoring serves Commission investigation and enforcement. (6) Legal consequences - Article 37 audit failures trigger provider remediation obligations and potential Article 66 proceedings, Article 72 monitoring findings directly inform Commission enforcement decisions under Articles 70-75. Interaction creates oversight redundancy benefiting compliance: providers cannot satisfy Article 37 through superficial audits knowing Commission conducts independent Article 72 verification, while Article 72 monitoring benefits from Article 37 audit documentation identifying compliance gaps warranting deeper Commission investigation.

Procedural Safeguards and Limitations: Despite broad monitoring powers, Article 72 is subject to procedural constraints: (1) Purpose limitation - monitoring must serve 'tasks assigned under this Section', not general platform surveillance or economic intelligence gathering unrelated to DSA compliance. (2) Proportionality - Article 52 of the Charter of Fundamental Rights requires monitoring actions proportionate to legitimate regulatory objectives - the Commission cannot impose excessive document retention periods, overly intrusive database access, or burdensome expert accommodation where less restrictive alternatives are available. (3) Confidentiality - the Commission must protect commercially sensitive information, trade secrets, and personal data obtained through monitoring per the Article 84 professional secrecy obligations. (4) Due process - providers may challenge Article 72 monitoring orders before the EU Courts under TFEU Article 263 if orders exceed the Commission's competence, violate procedural requirements, or constitute misuse of powers. (5) Data protection - database access involving personal data must comply with GDPR safeguards even when conducted for regulatory enforcement. (6) Cost allocation - Article 72 monitoring costs are borne by the Commission from its enforcement budget - providers do not pay for Commission-appointed experts, unlike the Article 37 audits they finance.

Temporal Dynamics: Article 72 monitoring operates continuously throughout the VLOP/VLOSE designation period - it is not limited to discrete investigation windows. The Commission may: establish ongoing database access enabling real-time content moderation monitoring, appoint long-term expert secondments providing sustained algorithmic oversight, impose permanent document retention obligations ensuring compliance evidence is always preserved, and conduct regular monitoring cycles (quarterly, semi-annual) verifying sustained implementation. Continuous monitoring serves multiple functions: early violation detection enabling preventive intervention before harms escalate, implementation verification ensuring Article 71 commitments or Article 73 remedies are actually deployed rather than merely promised, compliance trend analysis identifying systemic weaknesses requiring regulatory attention, and regulatory learning improving the Commission's technical understanding of platform operations to inform future policy development. Providers face sustained oversight pressure unlike episodic investigations - this incentivizes a compliance culture of proactive adherence rather than reactive violation remediation.

Information Sharing and Coordination: Article 72 monitoring findings may be shared with: national Digital Services Coordinators to inform parallel national enforcement under Article 51, the European Board for Digital Services (Article 61) to support consistent cross-border enforcement and best practice development, other EU regulatory authorities (DG Competition for DMA enforcement, data protection authorities for GDPR enforcement) where monitoring reveals cross-regulatory violations, and Article 40 vetted researchers to enrich academic understanding of platform systems, subject to appropriate confidentiality protections. However, Article 84 professional secrecy limits sharing commercially sensitive or confidential information - the Commission must balance transparency and collaboration benefits against provider privacy and competitive interests. Information sharing typically involves: aggregated findings rather than granular platform-specific data, legal/regulatory analysis rather than technical trade secrets, and de-identified examples illustrating compliance patterns without revealing specific platforms.

Key Points

  • Commission authorized to take 'necessary actions' to monitor VLOPs/VLOSEs effective implementation and compliance with DSA obligations - broad discretionary power
  • Database access power: Commission can order VLOPs/VLOSEs to provide access to databases and explanations - enables examination of content moderation records, user data, advertising databases
  • Algorithm access power: Commission can order explanations relating to algorithms - includes recommender systems, content ranking, ad targeting, automated moderation algorithms
  • Document retention obligations: Commission can require providers to preserve all documents necessary to assess compliance - prevents evidence destruction during investigations
  • Independent external experts: Commission may appoint experts to assist monitoring and provide specialized knowledge - creates external technical advisory capacity
  • Independent external auditors: Commission may appoint auditors to verify compliance - distinct from Article 37 annual audits commissioned by platforms themselves
  • National authority experts: Commission may appoint experts from Member State competent authorities 'with agreement of authority concerned' - enables EU-national cooperation
  • Purpose-limited: Monitoring actions must be 'for the purposes of carrying out tasks assigned under this Section' - restricted to VLOP/VLOSE supervision functions
  • Differs from Article 37 (platform-commissioned annual audits) - Article 72 enables Commission-appointed monitoring distinct from provider-initiated compliance audits
  • Enables proactive compliance monitoring rather than reactive investigation only - Commission can continuously oversee implementation not merely investigate violations
  • European Centre for Algorithmic Transparency (ECAT) serves as primary implementation mechanism providing technical expertise for algorithmic monitoring
  • Database/algorithm access supports Commission's Article 66 investigation powers, Article 67 information requests, Article 69 inspections - creates integrated enforcement toolkit
  • No statutory limitation on monitoring frequency or duration - Commission may conduct ongoing continuous monitoring throughout VLOP/VLOSE designation period
  • Expert appointment enables access to specialized domains (algorithmic auditing, data science, content moderation, cybersecurity, child safety, election integrity)
  • Document retention prevents spoliation - ensures evidence preservation before Commission initiates formal Article 66 proceedings or issues Article 67 information requests
  • Monitoring costs borne by Commission not providers - contrasts with Article 37 where platforms pay for their own audits

Practical Application

ECAT Algorithmic Access and Monitoring - Recommender Systems: European Centre for Algorithmic Transparency exercises Article 72 algorithm access powers to monitor VLOP compliance with Article 38 (VLOP-specific recommender system obligations). ECAT issues access requests to TikTok, Meta (Instagram, Facebook), YouTube, X (Twitter), Pinterest requiring explanations of recommender algorithms determining content visibility in user feeds. Platforms must provide: (1) Main ranking parameters - factors determining content prioritization (engagement signals, recency, user connections, content type, personalization features), (2) Weighting methodologies - relative importance assigned to different ranking signals and how weights calibrated, (3) Personalization mechanisms - how user demographics, interests, past behavior influence recommendations, (4) Non-personalized alternatives - options enabling users to view content based on chronological or non-algorithmic criteria per Article 38 requirement, (5) Testing and validation - how platforms assess whether recommender systems produce desired outcomes and avoid unintended consequences. ECAT technical staff analyze explanations evaluating: whether platforms genuinely provide meaningful non-personalized alternatives or merely cosmetic options, whether ranking parameters disclosed to users under Article 27 match actual algorithmic operation, whether recommender systems amplify systemic risks (disinformation, harmful content to minors, election manipulation) inadequately mitigated under Article 35. Analysis reveals several platforms provided incomplete or misleading explanations: TikTok claimed 'For You' feed offers chronological 'Following' alternative but ECAT testing shows 'Following' feed still algorithmically curated prioritizing high-engagement content. 
Instagram described main ranking parameter as 'how likely you are to interact with content' but explanation omitted commercial signals (sponsored content boosting, partnership prioritization). YouTube explanations described weighting methodology in general terms ('signals combined to predict watch time') without specificity enabling users to understand why specific videos recommended. ECAT documents findings in technical reports submitted to DG CONNECT enforcement team, which issues Article 67 information requests demanding corrected explanations and Article 66 investigation proceedings for potential Article 27/38 violations. ECAT ongoing monitoring detects when platforms modify algorithms without updating user-facing disclosures - creates continuous compliance pressure beyond annual Article 37 audits.
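The kind of feed test described above - checking whether a feed billed as chronological is in fact re-ranked - can be approximated by measuring how often adjacent items appear out of timestamp order. A hedged sketch with invented post data and an assumed 10% tolerance threshold:

```python
# Sketch of one way an auditor might test whether a feed presented as
# chronological is actually re-ranked. Timestamps and the 0.1 tolerance
# threshold are invented for illustration.

def inversion_rate(timestamps: list[int]) -> float:
    """Fraction of adjacent feed positions shown out of reverse-chronological order."""
    if len(timestamps) < 2:
        return 0.0
    inversions = sum(
        1 for a, b in zip(timestamps, timestamps[1:]) if a < b  # newer item shown later
    )
    return inversions / (len(timestamps) - 1)

# Unix timestamps of posts, in the order the feed displayed them.
chronological_feed = [1700000500, 1700000400, 1700000300, 1700000200]
curated_feed       = [1700000300, 1700000500, 1700000200, 1700000400]

for name, feed in [("chronological", chronological_feed), ("curated", curated_feed)]:
    rate = inversion_rate(feed)
    verdict = "consistent with chronological order" if rate <= 0.1 else "likely re-ranked"
    print(f"{name}: inversion rate {rate:.2f} -> {verdict}")
```

A real audit would of course control for confounders (pinned posts, deleted items, pagination), but the core signal - display order diverging systematically from timestamp order - reduces to this simple statistic.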

Database Access - Content Moderation Verification: Commission exercises Article 72 database access powers investigating Meta's compliance with Article 16 notice-and-action and Article 17 statement of reasons requirements. Following civil society reports alleging Facebook/Instagram systematically under-moderate hate speech targeting religious minorities while over-moderating political dissent content, Commission issues database access order requiring Meta to provide: complete content moderation decision records for 6-month period including content flagged (text, images, videos), moderation decision (removed, restricted, maintained), decision basis (automated detection, user report, proactive review), reviewer identity (human or automated), decision timestamp, appeal status and outcome, Article 17 statement of reasons provided to affected users. Database comprises 127 million moderation decisions across EU Member States. Commission appoints independent external experts (paragraph 2) with expertise in content moderation, hate speech detection, algorithmic bias, and human rights to analyze database. Expert analysis reveals: (1) Article 17 statement deficiencies - 34% of content removal decisions provided generic boilerplate statement of reasons ('violated Community Standards regarding hate speech') without specific explanation enabling users to understand which statement component triggered violation - fails Article 17(3) requirement for 'clear and specific statement of reasons.' (2) Inconsistent enforcement - statistically significant variation in removal rates for equivalent hate speech content depending on target group, with religious minority-targeted hate speech removed 41% less frequently than majority-targeted content despite identical Community Standards application. (3) Automated system errors - 23% of automated hate speech detections overturned on human appeal review, suggesting automated systems operating below acceptable accuracy thresholds. 
(4) Language bias - hate speech detection accuracy varied dramatically by language, with minority languages (Maltese, Irish, Estonian) exhibiting 58% lower accuracy than major languages (English, German, French) - indicates Article 35 systemic risk mitigation inadequately addresses linguistic diversity. Expert reports provide evidentiary foundation for Article 66 formal investigation proceedings. Commission issues Article 70 interim measures requiring Meta to: suspend automated-only content moderation for hate speech pending accuracy improvements, implement mandatory human review for all minority-targeted content pending bias correction, provide detailed Article 17 statements specifying precise policy violation rather than generic categories. Investigation ultimately results in Article 73 non-compliance decision with Article 74 fines and binding remediation requirements including: algorithmic retraining on balanced hate speech datasets, language-specific accuracy thresholds before automation deployment, quarterly bias audits with public reporting.
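The disparity metrics the experts report - removal rates by target group and the appeal overturn rate - reduce to simple proportions over the decision records. An illustrative Python sketch with invented records (real analyses would also test statistical significance):

```python
# Invented moderation-decision extract: (target_group, removed, overturned_on_appeal).
records = [
    ("minority", True,  False), ("minority", False, False),
    ("minority", False, False), ("minority", True,  True),
    ("majority", True,  False), ("majority", True,  False),
    ("majority", True,  True),  ("majority", False, False),
]

def removal_rate(group: str) -> float:
    """Share of decisions for the group that resulted in removal."""
    group_records = [r for r in records if r[0] == group]
    return sum(r[1] for r in group_records) / len(group_records)

# Overturn rate: share of removals reversed on appeal.
removals = [r for r in records if r[1]]
overturn_rate = sum(r[2] for r in removals) / len(removals)

print(f"minority removal rate: {removal_rate('minority'):.2f}")
print(f"majority removal rate: {removal_rate('majority'):.2f}")
print(f"appeal overturn rate:  {overturn_rate:.2f}")
```

A gap between the two removal rates for comparable content is the kind of signal that, at scale and with proper controls, would support the inconsistent-enforcement finding; a high overturn rate flags automated systems operating below acceptable accuracy.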

Independent External Auditor Appointment - Ad Transparency Compliance: Commission appoints independent external auditors under Article 72(2) to verify VLOP compliance with Article 26 advertising transparency and Article 30 ad repository requirements. Unlike Article 37 annual audits selected by platforms, Commission selects auditors with specialized advertising technology expertise, funds audits from enforcement budget, and defines audit scope prioritizing Commission enforcement concerns. Audit scope includes: (1) Verification that platforms' Article 30 ad repositories contain complete advertising records or whether platforms systematically exclude categories (political ads, microtargeted ads, rejected ads), (2) Assessment whether Article 26 'Sponsored' disclosures sufficiently prominent and unambiguous or designed to minimize user notice, (3) Evaluation whether advertising targeting parameter disclosures provide meaningful information enabling users to understand why they received ad or merely list generic categories concealing actual targeting logic, (4) Testing whether platforms provide required functionality enabling users to access advertiser information, targeting criteria, and ad delivery statistics. Auditors conduct: automated testing creating accounts with diverse demographics and behaviors then analyzing ad delivery patterns and transparency disclosures; manual review of ad repository completeness comparing against internal platform advertising databases accessed under Article 72 database powers; user comprehension testing surveying whether real users can understand targeting explanations; comparison against Article 37 audit reports identifying discrepancies. Audit findings reveal: Major e-commerce platform's Article 30 ad repository excluded 63% of microtargeted advertisements delivered to fewer than 1,000 users - violated 'all advertisements' requirement and enabled invisible targeting of vulnerable groups. 
Social media platform's Article 26 targeting disclosures listed 'interests' as targeting category but omitted behavioral signals (pages liked, groups joined, past purchase activity) actually determining ad delivery - provided misleading incomplete information. Search engine VLOSE Article 26 'Sponsored' labels displayed in small gray text barely distinguishable from organic results - violated transparency spirit through intentional deceptive design. Commission uses audit findings to initiate Article 66 proceedings and impose Article 70 interim measures requiring immediate correction while investigations proceed toward Article 73 decisions. Case illustrates Article 72 auditor appointments providing independent verification unconstrained by platform-auditor relationship dynamics potentially compromising Article 37 audits.
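The repository-completeness check described above reduces to a set comparison between the internal delivery log (obtained via Article 72 database access) and the public Article 30 repository. A minimal sketch with invented ad identifiers:

```python
# Invented ad IDs: what the platform actually delivered vs. what it published.
internal_delivery_log = {"ad-001", "ad-002", "ad-003", "ad-004", "ad-005"}
public_repository     = {"ad-001", "ad-004"}

# Ads delivered but absent from the public Article 30 repository.
missing = internal_delivery_log - public_repository
exclusion_rate = len(missing) / len(internal_delivery_log)

print(f"missing from repository: {sorted(missing)}")
print(f"exclusion rate: {exclusion_rate:.0%}")
```

The method only works because Article 72 database access gives auditors the internal ground truth to compare against; the public repository alone cannot reveal its own omissions.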

National Authority Expert Secondment - Child Safety Monitoring: Commission appoints experts from Irish Data Protection Commission (DPC) and French ARCOM under Article 72(2) with 'agreement of authority concerned' to assist monitoring Instagram and TikTok compliance with Article 28 minor protection requirements. National authority expert contributions include: Irish DPC expertise in age verification systems, data minimization for minor accounts, and default privacy settings implementation from GDPR enforcement experience; French ARCOM expertise in audiovisual content regulation, harmful content classification, and cultural context assessment from national media regulatory experience. Expert secondment structured as 6-month assignment with Commission reimbursing national authority personnel costs. Experts conduct: (1) Age assurance assessment - evaluate whether the measures platforms adopt under Article 28(1) adequately prevent minors from accessing adult content and services; testing reveals platforms rely primarily on user self-declaration with minimal verification - minors easily circumvent by entering false birthdates. (2) Default settings evaluation - assess whether default privacy settings for minor accounts are appropriately restrictive, as part of the Article 28(1) privacy, safety and security obligation; analysis finds several platforms default minors to public profiles enabling stranger contact despite well-documented child safety risks. (3) Recommender system review - examine whether recommender algorithms adequately prevent minors' exposure to harmful content (eating disorder content, self-harm tutorials, sexual material); testing shows algorithms frequently recommend harmful content to minor accounts exhibiting vulnerability signals. (4) Commercial exploitation assessment - evaluate whether platforms comply with the Article 28(2) prohibition on profiling-based advertising to minors and prevent exploitative monetization of minor-generated content.
National experts' cultural and linguistic knowledge proves critical - Irish experts identify Gaelic-language communities where platform moderation virtually nonexistent creating child safety gaps, French experts detect French-specific 'thinspiration' eating disorder content evading English-trained moderation systems. Expert findings document systemic Article 28 failures submitted to Commission enforcement team. Commission issues Article 67 information requests demanding remediation plans, conducts Article 69 inspections of age verification and content moderation systems, ultimately adopts Article 71 binding commitments (both platforms prefer commitments over Article 73 formal violations) requiring: improved age verification with document verification or biometric estimation, mandatory default privacy restrictions for minor accounts with parental override required for public settings, recommender algorithm modifications preventing harmful content exposure with independent testing verification, prohibition on targeted advertising for minors with contextual advertising only. Case demonstrates national authority expert value providing specialized domain knowledge and cultural context Commission staff lack.

Document Retention Order - Preventing Evidence Spoliation: Commission initiates Article 66 investigation into major social media platform for potential Article 25 dark pattern violations - interface designs allegedly manipulating users into excessive data sharing and content engagement undermining informed consent. Platform operates continuous A/B testing program experimentally deploying different interface designs to user subsets measuring behavioral impacts. These experiments potentially document platform's knowledge of manipulative effects. However, platform's standard data retention policy automatically deletes A/B test data after 90 days to minimize storage costs and data protection liability. Commission concerned platform may delete critical evidence demonstrating dark pattern intentionality before Article 67 information requests can preserve records. Commission issues Article 72 document retention order requiring platform to preserve: (1) All A/B test designs, methodologies, results, analyses related to user interface features affecting consent, data sharing, content engagement, subscription/cancellation, terms acceptance from past 2 years, (2) Internal communications (emails, chat messages, meeting minutes) discussing interface design objectives, behavioral psychology applications, user manipulation tactics, (3) Policy documents and design guidance addressing consent flows, choice architecture, default settings, (4) Training materials provided to UX designers and product managers regarding user interface optimization, (5) Technical specifications documenting how tested features are implemented in production systems. 
Retention order specifies: documents must be preserved in native electronic format maintaining metadata (creation date, authorship, modification history), storage must employ integrity protection preventing alteration or deletion, documents must remain accessible for Commission review upon Article 67 request or Article 69 inspection, retention obligation continues until Commission issues final decision or explicitly releases requirement. Platform challenges retention order before EU Courts arguing it imposes disproportionate storage costs and creates excessive data protection liability by preserving personal data beyond necessary retention periods. Court upholds retention order applying proportionality balancing: regulatory enforcement interest in preserving evidence of serious potential violations (dark patterns undermining the consent foundations of the GDPR and DSA) outweighs platform's cost and liability concerns, which can be mitigated through appropriate security and access controls limiting data exposure. Retention order proves crucial - when Commission eventually issues Article 67 information request, preserved documents reveal internal communications explicitly discussing 'maximizing user engagement even where users indicate desire to disengage' and A/B tests measuring 'resistance reduction to data sharing requests' - direct evidence of Article 25 dark pattern violations. Without Article 72 retention order, this evidence would have been automatically deleted per standard retention policy, substantially weakening Commission's enforcement case.
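The integrity-protection requirement in a retention order is straightforward to implement technically. The sketch below is purely illustrative (it is not any platform's actual preservation tooling, and the manifest fields are assumptions): it records a SHA-256 digest and filesystem metadata for every preserved file, so that a later re-walk detects any alteration or deletion.

```python
import hashlib
import os


def _sha256_of(path: str) -> str:
    """Stream a file through SHA-256 in 64 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def build_retention_manifest(root_dir: str) -> list[dict]:
    """Walk a preservation directory and record, per file, a content digest
    plus basic filesystem metadata (illustrating 'native format with
    metadata' plus integrity protection)."""
    manifest = []
    for dirpath, _, filenames in os.walk(root_dir):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            manifest.append({
                "path": os.path.relpath(path, root_dir),
                "sha256": _sha256_of(path),
                "size_bytes": stat.st_size,
                "mtime": stat.st_mtime,
            })
    return manifest


def verify_manifest(root_dir: str, manifest: list[dict]) -> list[str]:
    """Return paths whose content changed or which disappeared since the
    manifest was built; an empty list means integrity holds."""
    problems = []
    for entry in manifest:
        path = os.path.join(root_dir, entry["path"])
        if not os.path.exists(path) or _sha256_of(path) != entry["sha256"]:
            problems.append(entry["path"])
    return problems
```

In practice a preservation system would also write the manifest itself to append-only or write-once storage, since a manifest the custodian can rewrite protects nothing.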

Ongoing Monitoring - Crisis Protocol Compliance: Article 48 provides for crisis protocols governing platform responses to extraordinary circumstances affecting public security or public health. COVID-19 pandemic and Ukraine invasion triggered Article 48 activation. Commission establishes Article 72 ongoing monitoring to verify platforms maintain effective crisis protocols and activate them appropriately. Commission appoints ECAT experts to conduct continuous monitoring including: (1) Real-time database monitoring - platforms required to provide Commission with API access to content moderation databases enabling real-time observation of crisis-related content handling; ECAT monitors removal rates, labeling practices, amplification patterns for health misinformation, war propaganda, coordinated inauthentic behavior. (2) Algorithm transparency - platforms must explain crisis protocol algorithmic adjustments (e.g., downranking unverified sources, amplifying authoritative health/government information, detecting coordinated manipulation); ECAT staff verify claimed adjustments actually implemented by testing ranking behavior and analyzing algorithmic outputs. (3) Reporting obligations - platforms submit weekly crisis protocol reports documenting: crisis indicators monitored, protocol activation thresholds and decisions, mitigation measures deployed, effectiveness metrics, coordination with Member State authorities; Commission reviews reports identifying compliance gaps. (4) Stakeholder coordination - Commission convenes regular meetings with platforms, Member State Digital Services Coordinators, health authorities, civil society to assess crisis protocol adequacy and coordinate response improvements. Monitoring identifies several Article 48 implementation failures: Platform A deactivated certain crisis protocols after initial emergency period without assessing whether crisis conditions persisted - premature deactivation enabled resurgence of health misinformation. 
Platform B activated geographic restrictions in crisis protocols based on IP addresses easily evaded by VPN users - ineffective targeting allowed manipulation campaigns to continue. Platform C's crisis protocols focused exclusively on English content with minimal non-English crisis content detection - left vulnerable linguistic communities unprotected. Commission issues Article 67 information requests requiring corrective action plans, conducts Article 69 inspections verifying protocol improvements implemented, and ultimately secures Article 71 commitments from multiple platforms to: maintain crisis protocols until Commission agrees to deactivation based on objective indicators, implement robust evasion-resistant targeting using multiple signals beyond IP addresses, develop multilingual crisis content detection with minimum performance thresholds for all EU languages, provide Commission with crisis protocol activation notification within 24 hours enabling coordinated response. Ongoing Article 72 monitoring proves essential for Article 48 compliance - crisis protocols require sustained oversight not merely annual Article 37 audit snapshot given rapidly evolving threat landscape.
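The weekly reporting obligation described in this scenario lends itself to automated first-pass review. A minimal sketch, assuming hypothetical section names and a single numeric crisis indicator per report (none of these field names come from the Regulation), that flags incomplete reports and the Platform A failure mode of deactivating a protocol while crisis indicators still exceed the activation threshold:

```python
# Hypothetical required sections of a weekly crisis-protocol report,
# mirroring the items listed in the monitoring scenario above.
REQUIRED_FIELDS = {
    "crisis_indicators", "activation_decisions", "mitigation_measures",
    "effectiveness_metrics", "member_state_coordination",
}


def report_gaps(report: dict) -> list[str]:
    """Return the report sections that are missing or empty, sorted."""
    return [f for f in sorted(REQUIRED_FIELDS) if not report.get(f)]


def premature_deactivation(reports: list[dict]) -> bool:
    """Heuristic check on the latest weekly report: the protocol is marked
    deactivated while its own crisis indicator still meets or exceeds the
    agreed activation threshold."""
    last = reports[-1]
    return (not last.get("protocol_active", False)
            and last.get("indicator_level", 0)
            >= last.get("activation_threshold", float("inf")))
```

A real pipeline would validate against an agreed schema and richer indicator sets; the point is only that missing sections and deactivation-while-indicators-persist are mechanically detectable, which is what turns weekly reports into a workable monitoring signal rather than shelfware.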

Cross-Border Coordination with National Authorities: Commission exercises Article 72 monitoring powers in coordination with Member State Digital Services Coordinators to address cross-border compliance issues. German DSC investigates local advertising platform operating in multiple Member States for potential Article 26 transparency violations. Investigation reveals complex multi-jurisdiction compliance challenges - platform's advertising delivery systems operate from Ireland, ad targeting algorithms developed in Poland, ad sales conducted through Netherlands subsidiary, affected users across all Member States. German DSC's national enforcement jurisdiction insufficient to address comprehensive compliance assessment requiring access to systems and personnel across multiple Member States. German DSC requests Commission to exercise Article 72 monitoring authority given cross-border nature. Commission agrees, appointing experts from German, Irish, Polish, Dutch national authorities under Article 72(2) with respective authorities' agreement. Multi-national expert team conducts: coordinated database access examining advertising records across all EU Member States' users identifying systematic targeting transparency failures affecting EU-wide audience; algorithm access evaluating targeting systems deployed from Ireland and Poland; interviews with personnel across jurisdictions under Article 68; inspections of facilities in multiple Member States under Article 69. Team findings reveal: platform implemented Article 26 transparency features for users in Member States with active DSC enforcement (Germany, France, Ireland) but deployed older non-compliant interfaces in Member States with less active oversight (Malta, Cyprus, Bulgaria) - deliberate compliance arbitrage exploiting uneven enforcement. 
Platform stored comprehensive targeting data in Polish servers but provided incomplete disclosures to users claiming 'technical limitations' prevented full transparency - misrepresentation contradicted by database evidence. Platform's Netherlands ad sales team trained to maximize targeting granularity without informing users of full extent - systematic evasion of Article 26. Commission's coordinated Article 72 monitoring enables comprehensive enforcement impossible through fragmented national actions. Commission issues Article 73 non-compliance decision applying EU-wide with Article 74 fines calculated on global turnover and remediation requirements applicable to all Member State users - closes compliance arbitrage opportunity. Case illustrates Article 72's cross-border monitoring capacity essential for internal market integrity preventing platforms from exploiting national enforcement boundaries.

Technical Infrastructure Development - ECAT Monitoring Tools: To implement Article 72 monitoring systematically rather than ad hoc, ECAT develops technical infrastructure enabling scaled algorithmic oversight: (1) Standardized database access protocols - technical specifications defining how VLOPs/VLOSEs provide Commission with structured access to content moderation databases, advertising records, recommender system outputs, user complaint data; protocols based on open standards (RESTful APIs, standardized data schemas) enabling automated monitoring rather than manual data requests. (2) Algorithmic testing environments - sandboxed systems where ECAT can test platform algorithms using controlled inputs to verify behavior matches provider explanations; testing includes: recommender system experiments measuring how ranking responds to different content and user attributes, content moderation algorithm testing assessing detection accuracy and bias, ad targeting algorithm testing identifying discriminatory or manipulative targeting patterns. (3) Monitoring dashboards - visualization tools enabling Commission staff to observe platform compliance metrics in real-time: content moderation volumes and removal rates by category, user complaint volumes and resolution timespans, advertising transparency feature usage, recommender system diversity metrics, crisis protocol activation status. (4) Automated compliance detection - algorithms analyzing platform data to identify potential violations warranting detailed investigation: anomalous moderation patterns suggesting bias or selective enforcement, unexplained algorithmic changes potentially evading transparency requirements, platform behavior inconsistent with Article 37 audit representations. 
(5) Research collaboration infrastructure - systems enabling Article 40 vetted researchers to access ECAT-collected platform data for independent analysis while protecting confidentiality - creates academic research capacity complementing Commission monitoring. ECAT infrastructure transforms Article 72 from reactive ad hoc monitoring to proactive systematic oversight. Platforms cannot predict when the Commission is reviewing their systems, creating a continuous compliance incentive. Infrastructure also enables benchmarking - Commission compares practices across VLOPs identifying outliers with particularly strong or weak compliance for targeted interventions. Technical capacity development represents multi-year investment building Article 72 into institutionalized regulatory capability rather than episodic enforcement tool. However, infrastructure development faces challenges: platforms resist providing standardized access arguing competitive sensitivity and security risks; technical complexity of algorithmic systems exceeds ECAT's current expertise in some domains (cutting-edge AI systems, encrypted communications); resource constraints limit monitoring coverage requiring prioritization across designated platforms and compliance obligations.
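The "automated compliance detection" component sketched above reduces, at its simplest, to outlier detection over compliance metrics streamed from platforms. The following is a deliberately crude z-score stand-in for whatever models ECAT actually deploys (which are not public): it flags days whose content-removal rate deviates sharply from the series mean, queueing them for analyst review.

```python
import statistics


def flag_anomalous_days(removal_rates: list[float],
                        z_threshold: float = 3.0) -> list[int]:
    """Flag indices of days whose removal rate deviates from the series
    mean by more than z_threshold standard deviations. Sudden spikes or
    collapses in moderation activity are candidate signals of selective
    enforcement or unannounced algorithmic changes."""
    if len(removal_rates) < 2:
        return []  # no meaningful dispersion to measure
    mean = statistics.fmean(removal_rates)
    stdev = statistics.stdev(removal_rates)
    if stdev == 0:
        return []  # perfectly flat series: nothing to flag
    return [i for i, rate in enumerate(removal_rates)
            if abs(rate - mean) / stdev > z_threshold]
```

Production monitoring would control for seasonality, content category, and known events before flagging, and would treat a flag only as a trigger for human investigation; the sketch shows why standardized data access matters, since anomaly detection is only as good as the metric feed behind it.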