Article 33

Very large online platforms and very large online search engines

1. This Section shall apply to providers of online platforms and providers of online search engines that have 45 million or more average monthly active recipients of the service in the Union, calculated in accordance with the methodology set out in paragraphs 2 and 3 ('very large online platforms' or 'very large online search engines').

2. Providers of online platforms or providers of online search engines that have 45 million or more average monthly active recipients of the service in the Union shall publish the average monthly active recipients of the service in the Union in a prominent place on their online interface, in an easily accessible, clear and comprehensible manner. The information shall be updated at least every six months and shall indicate the number for the preceding six-month period, the methodology applied for its calculation and the date of publication.

3. For the purposes of calculating the number of average monthly active recipients of the service in the Union, the methodology in Article 24(2) shall apply.

4. The Commission may, on its own initiative or upon a substantiated application by the provider concerned, request the Digital Services Coordinator of the Member State of establishment of the provider of online platforms or of online search engines referred to in paragraph 2 to provide its views on whether that provider meets the threshold referred to in paragraph 1. Where the Commission considers, following the views provided by the Digital Services Coordinator concerned, that the provider meets that threshold, the Commission shall adopt a decision designating that provider as a very large online platform or a very large online search engine, as applicable, in accordance with the examination procedure referred to in Article 88(2).

5. The obligations laid down in Articles 34 to 43 shall apply from four months after the designation referred to in paragraph 4.

6. A very large online platform or very large online search engine which considers that it no longer meets the threshold referred to in paragraph 1 may submit a substantiated application to the Commission requesting the Commission to revise its designation. The Commission shall take a decision on the substantiated application within five months of receiving it. Where appropriate, the Commission shall adopt a decision revoking the designation, in accordance with the examination procedure referred to in Article 88(2).

Understanding This Article

Article 33 establishes the 'Very Large Online Platform' (VLOP) and 'Very Large Online Search Engine' (VLOSE) categories - the DSA's most heavily regulated tier. The 45 million monthly active EU user threshold identifies platforms with such scale and societal influence that they warrant heightened regulatory scrutiny. To contextualize: the EU population is roughly 450 million, so 45 million is about 10% of the population actively using the platform each month. These aren't just big platforms - they have reached a critical mass where their content moderation failures, algorithm choices, and crisis responses have society-wide implications.

Paragraph 1 defines the threshold: 45 million 'average monthly active recipients of the service in the Union'. This uses the Article 24(2) methodology referenced in paragraph 3: count the unique users who engaged with the service at least once in a month, averaged over a six-month period. 'Recipients' means actual service users, not just account holders - someone who creates an Instagram account but never uses it doesn't count. The EU territorial scope is critical: global platforms count only their EU users. TikTok has over 1 billion users worldwide but roughly 150 million in the EU - the EU number alone determines VLOP status.
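The averaging described above can be sketched in a few lines. This is a simplified illustration: the precise counting methodology is elaborated in Commission guidance rather than in code, and the monthly figures below are hypothetical, not real platform data.

```python
from statistics import mean

VLOP_THRESHOLD = 45_000_000  # Article 33(1) threshold for the Union

def average_monthly_active_recipients(monthly_uniques: list[int]) -> float:
    """Average the unique monthly active EU recipients over one 6-month period."""
    if len(monthly_uniques) != 6:
        raise ValueError("expected six monthly figures (one 6-month period)")
    return mean(monthly_uniques)

def meets_vlop_threshold(monthly_uniques: list[int]) -> bool:
    """True if the 6-month average meets or exceeds 45 million."""
    return average_monthly_active_recipients(monthly_uniques) >= VLOP_THRESHOLD

# Hypothetical platform hovering around the threshold: individual months
# dip below 45 million, but the 6-month average (45.5 million) is what counts.
figures = [44_100_000, 44_800_000, 45_200_000, 45_900_000, 46_300_000, 46_700_000]
print(meets_vlop_threshold(figures))  # True
```

Note the design point: because the test is an average, a platform cannot escape the threshold by dipping below 45 million in a single month.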

Paragraph 2 establishes a transparency obligation: platforms meeting the threshold must publish their EU user numbers in a 'prominent place' in an 'easily accessible, clear and comprehensible manner', updated at least every six months. This public reporting lets competitors, regulators, and the public verify whether a platform qualifies, preventing it from obscuring its scale to evade designation. The publication must include three elements: the user number, the calculation methodology, and the date of publication. If YouTube publishes '48 million average monthly active EU users (January-June 2024), calculated per the Article 24(2) methodology, published 15 July 2024', anyone can check its VLOP status against that disclosure.
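A disclosure record capturing the three required elements might look like the following sketch. `UserNumberDisclosure` is a hypothetical helper, the figures reuse the illustrative YouTube example above, and the 182-day update window is an approximation of 'at least every six months'.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class UserNumberDisclosure:
    """One six-monthly publication with the three elements paragraph 2 requires."""
    average_monthly_recipients: int
    period: str          # the preceding six-month period the figure covers
    methodology: str     # how the figure was calculated
    published_on: date   # date of publication

    def statement(self) -> str:
        """Render the disclosure as a single human-readable sentence."""
        return (f"{self.average_monthly_recipients:,} average monthly active EU "
                f"recipients ({self.period}), calculated per {self.methodology}, "
                f"published {self.published_on.isoformat()}")

    def next_update_due(self) -> date:
        # 'updated at least every six months' -- approximated here as 182 days
        return self.published_on + timedelta(days=182)

# The hypothetical YouTube-style disclosure from the commentary above:
d = UserNumberDisclosure(48_000_000, "January-June 2024",
                         "Article 24(2) methodology", date(2024, 7, 15))
print(d.statement())
```

Keeping the number, methodology, and publication date together in one record mirrors the legal requirement that all three appear in the published information.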

Paragraph 4 establishes the designation process: the Commission adopts a formal decision designating a platform as a VLOP/VLOSE. The Commission can act 'on its own initiative' (if a platform obviously exceeds the threshold but hasn't reported) or 'upon a substantiated application by the provider' (the platform self-reports exceeding the threshold). Before designating, the Commission requests the views of the Digital Services Coordinator of the platform's Member State of establishment; this consultation ensures the Member State regulator most familiar with the platform confirms the threshold is met. Designation is an official act with legal effect - it triggers the Articles 34-43 obligations.

Paragraph 5's four-month grace period recognizes that platforms need time to implement the heightened obligations: systemic risk assessments (Article 34), risk mitigation measures (Article 35), crisis response mechanisms (Article 36), independent audits (Article 37), additional recommender system requirements (Article 38), and more. Four months from the designation date, the full VLOP obligations apply. If Facebook were designated on January 1, full obligations would apply from May 1.
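The January-to-May example is plain calendar-month arithmetic. Python's standard library has no month-addition function, so the sketch below uses a hypothetical `add_months` helper that clamps the day of month for short months (e.g. an October 31 designation lands on the last day of February).

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months to a date, clamping the day for shorter months."""
    total = d.month - 1 + months
    year, month = d.year + total // 12, total % 12 + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    last_day = [31, 29 if leap else 28, 31, 30, 31, 30,
                31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, last_day))

# A hypothetical January 1 designation: Articles 34-43 apply four months later.
designation = date(2024, 1, 1)
print(add_months(designation, 4))  # 2024-05-01
```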

Paragraph 6 allows de-designation: if a VLOP's EU users drop below 45 million and stay there for a sustained period, the platform can apply for revocation. The Commission has five months to decide. This prevents permanent designation when a platform genuinely shrinks (a MySpace-style decline, for example). However, the Commission likely requires a substantial, sustained decrease, not a temporary fluctuation.
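The 'sustained, not temporary' point can be made concrete with a small heuristic. This is a sketch under a stated assumption: the article text quoted above does not fix how many consecutive six-month periods must fall below 45 million, so the `sustained_periods` parameter below is hypothetical.

```python
def eligible_to_request_revocation(period_averages: list[float],
                                   threshold: float = 45_000_000,
                                   sustained_periods: int = 2) -> bool:
    """Heuristic: all of the most recent six-month averages are below threshold.

    'sustained_periods' is an assumption for illustration -- the quoted text
    does not define how long a platform must stay below 45 million.
    """
    recent = period_averages[-sustained_periods:]
    return len(recent) == sustained_periods and all(a < threshold for a in recent)

# One dip below the threshold is not 'sustained'; two consecutive periods are.
print(eligible_to_request_revocation([47e6, 44e6]))         # False
print(eligible_to_request_revocation([46e6, 44e6, 43e6]))   # True
```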

Article 33's significance is systemic: it creates two-tier platform regulation. Articles 1-32 apply to all intermediaries/platforms. Articles 33-43 add heightened obligations for VLOPs - platforms whose scale makes them 'systemically important'. This mirrors financial regulation (systemically important banks face stricter rules) and telecommunications (dominant operators face special obligations). Facebook, YouTube, TikTok, Twitter/X aren't just large - they're platforms where content moderation failures can influence elections, amplify health misinformation, or facilitate mass violence. Article 33 empowers EU to impose commensurate obligations.

Key Points

  • VLOP/VLOSE = platform/search engine with 45+ million average monthly active EU users
  • 45 million ≈ 10% of EU population (450 million people)
  • Platforms must publish user numbers every 6 months if approaching threshold
  • Commission designates platforms as VLOP/VLOSE via formal decision
  • Heightened obligations (Articles 34-43) apply 4 months after designation
  • Platforms can apply for de-designation if user numbers drop below threshold
  • Triggers: systemic risk assessments, independent audits, crisis protocols, recommender system transparency
  • Current VLOPs include: Facebook, Instagram, YouTube, TikTok, Twitter/X, LinkedIn, Amazon, etc.

Practical Application

For Meta/Facebook (Clear VLOP): Facebook has roughly 250 million EU users across 27 Member States - vastly exceeding the 45 million threshold. Meta must: (1) Publish EU user numbers for Facebook and Instagram separately every 6 months on each platform's interface (WhatsApp, an interpersonal communications service, falls outside the DSA's 'online platform' definition); (2) Self-report to the Commission confirming VLOP status; (3) Prepare for the Commission designation triggering Articles 34-43; (4) Within 4 months of designation, implement: annual systemic risk assessments identifying risks from content algorithms, disinformation, and coordinated manipulation; risk mitigation measures addressing identified risks; independent audit processes; crisis response protocols for election interference and public health emergencies; enhanced recommender system transparency; data access for researchers; and a compliance function with dedicated resources. For Meta, VLOP designation represents a fundamental shift from self-regulation to a mandatory risk management framework with independent oversight.

For YouTube (Google-Owned VLOP): YouTube has over 400 million EU users - clearly exceeding the threshold. YouTube must publish its EU user numbers prominently (e.g., in the YouTube homepage footer: '405 million average monthly active EU users (Jan-Jun 2024)'). Following Commission designation, YouTube faces heightened content moderation obligations: (1) A systemic risk assessment analyzing how recommendation algorithms may amplify extremism, election interference, and child safety risks; (2) Risk mitigation: adjust algorithms to reduce harmful recommendations, improve detection of coordinated inauthentic behavior, enhance child protection measures; (3) Independent audits: third-party auditors review YouTube's risk management and compliance systems; (4) Crisis protocols: if an EU election or public health emergency occurs, activate rapid response mechanisms coordinating with authorities; (5) Recommender transparency: users must have the option to see videos in a chronological/non-personalized feed, not just algorithmic recommendations; (6) Researcher access: provide vetted researchers API access to study the platform's societal impacts. VLOP obligations force YouTube to treat content curation as a public responsibility, not just engagement optimization.

For TikTok (Clear VLOP): TikTok has roughly 150 million EU users - clearly exceeding 45 million. As a Chinese-owned platform, TikTok faces additional scrutiny. It must publish its EU user numbers transparently. After Commission designation, it must implement the full VLOP obligations within 4 months: (1) A systemic risk assessment particularly focused on: algorithmic amplification of harmful content, data flows to China, potential foreign influence on EU discourse, and child safety (the platform is popular with minors); (2) Risk mitigation measures: algorithm adjustments, enhanced data localization, foreign influence detection, stronger protections for minors; (3) Independent audits: EU-based auditors assess compliance (addressing concerns about Chinese government influence on auditing); (4) Crisis protocols: coordinate with EU authorities during elections and public health events; (5) Transparency: explain the recommendation algorithm's logic and enable a non-algorithmic feed option. For TikTok, VLOP designation intensifies geopolitical scrutiny - the EU asserting regulatory sovereignty over a platform with unclear ties to the Chinese government.

For Twitter/X (Post-Takeover VLOP): Twitter/X has over 100 million EU users - exceeding the threshold. Under Elon Musk's ownership (2022 onward), the platform faces particular compliance challenges given ownership changes, staff reductions, and policy shifts. It must maintain its EU user number publication. Following designation, it must implement Articles 34-43: (1) A systemic risk assessment analyzing risks from reduced moderation staff, the 'Twitter Blue' paid-verification changes, relaxed content policies, and mass account reinstatements; (2) Risk mitigation addressing disinformation spread, coordinated manipulation, and hate speech amplification; (3) Independent audits examining whether post-takeover changes increased systemic risks; (4) Crisis response: demonstrate effective protocols despite staff cuts; (5) Algorithm transparency: explain how the algorithmic timeline prioritizes content and provide a chronological option. Twitter/X's VLOP status under new ownership tests whether the DSA can compel responsible platform governance despite an owner's preference for minimal moderation.

For LinkedIn (Professional Network VLOP): LinkedIn exceeds the 45 million threshold and was among the first platforms the Commission designated, in April 2023. As a professional/business-focused platform, LinkedIn's systemic risks differ from those of consumer social media, and its VLOP obligations are tailored to that context: (1) Systemic risk assessment: analyze risks of professional misinformation (fake credentials, job scams, business fraud), employment discrimination from algorithms, and data misuse for corporate espionage; (2) Risk mitigation: enhanced verification of professional credentials, bias testing in job recommendation algorithms, data security for sensitive business information; (3) Audits focused on employment fairness and professional integrity; (4) Crisis protocols: coordinate with authorities if an economic crisis or mass layoffs create vulnerability to scams; (5) Recommender transparency: explain how the platform recommends job postings and professional connections. LinkedIn's VLOP status recognizes that professional networks have systemic impacts on labor markets and business trust, distinct from entertainment platforms.

For Smaller Platforms Near the Threshold (40-44 Million Users): Platforms approaching the 45 million threshold face a decision: stay below it through geographic restrictions, or embrace growth and the VLOP obligations. A platform with 43 million EU users considering expansion into two more Member States knows the growth could trigger designation. Its options: (1) Stay below the threshold: limit service availability, implement user caps, focus on non-EU growth; (2) Embrace VLOP status: accept designation, invest in compliance infrastructure, compete with the major platforms. For venture-backed startups, VLOP compliance costs may be prohibitive, creating an effective regulatory barrier to scaling. For mature platforms, VLOP designation may be a manageable cost of EU market access. The Article 33 threshold creates a strategic inflection point for platform growth strategies.

For Google Search (Clear VLOSE): Google Search clearly exceeds 45 million EU users and is designated as a VLOSE (Very Large Online Search Engine). VLOSE obligations parallel VLOP obligations but are tailored to the search context: (1) A systemic risk assessment analyzing how search results may amplify disinformation, enable illegal content discovery, or disadvantage certain content providers; (2) Risk mitigation: algorithm adjustments to reduce the visibility of harmful/illegal content, transparency in ranking criteria; (3) Independent audits of search quality and compliance; (4) Crisis protocols for election misinformation and public health searches; (5) Ranking transparency explaining search result ordering. Google Search's VLOSE status acknowledges that search engines are information gatekeepers - their choices about what to show first shape public knowledge and require accountability.