Article 3

Definitions

For the purposes of this Regulation, the following definitions apply:

(a) 'information society service' means a service as defined in Article 1(1)(b) of Directive (EU) 2015/1535;

(b) 'recipient of the service' means any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making information accessible;

(c) 'consumer' means any natural person acting for purposes outside their trade, business, craft, or profession;

(d) 'intermediary service' means one of the following information society services: (i) a 'mere conduit' service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network; (ii) a 'caching' service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request; (iii) a 'hosting' service that consists of the storage of information provided by, and at the request of, a recipient of the service;

(e) 'illegal content' means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;

(f) 'online platform' means a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;

(g) 'online search engine' means an intermediary service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found;

(h) 'distance contract' means distance contract as defined in Article 2(7) of Directive 2011/83/EU;

(j) 'to offer services in the Union' means enabling natural or legal persons in one or more Member States to use the services of the provider of the intermediary service which has a substantial connection to the Union as set out in Article 2(1);

(k) 'trader' means any natural or legal person who is acting, including through any person acting in their name or on their behalf, for purposes relating to their trade, business, craft or profession;

(l) 'terms and conditions' means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the service;

(m) 'active recipient of an online platform' means a recipient of the service that has engaged with an online platform by either requesting the online platform to host information or being exposed to information hosted by the online platform and disseminated through its online interface;

(n) 'content moderation' means the activities, whether automated or not, undertaken by providers of intermediary services, that are aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient's account;

(p) 'advertisement' means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and presented by an online platform on its online interface against remuneration specifically for promoting that information;

(q) 'recommender system' means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed;

(r) 'digital services coordinator' means the authority designated by a Member State pursuant to Article 49(1);

(s) 'coordinator of establishment' means the Digital Services Coordinator of the Member State where the main establishment of a provider of an intermediary service is located or its legal representative resides or is established;

(t) 'very large online platform' means an online platform designated as such pursuant to Article 33(4);

(u) 'very large online search engine' means an online search engine designated as such pursuant to Article 33(4);

(v) 'crisis response mechanism' means the procedures and measures put in place to respond to a crisis that is threatening public security or public health as set out in Article 36;

(x) 'systemic risk' means systemic risk as referred to in Article 34;

(y) 'traceability data' means data allowing the identification of traders, as referred to in Article 30;

(z) 'online interface' means any software, including a website or a part thereof, and applications, including mobile applications.

Understanding This Article

Service Classification Hierarchy and Legal Implications: Article 3's definitions create a hierarchical taxonomy of intermediary services with profound legal consequences. At the base, 'intermediary services' encompass three technical functions: mere conduit (Article 4 liability exemption), caching (Article 5 exemption), and hosting (Article 6 exemption). Each successive tier triggers additional obligations. 'Hosting' services storing user-provided information face notice-and-action requirements (Article 16), must provide statement of reasons for restrictions (Article 17), and implement transparency reporting (Article 15). 'Online platforms' - hosting services that publicly disseminate information - face all hosting obligations plus internal complaint systems (Article 20), out-of-court dispute resolution (Article 21), trusted flagger programs (Article 22), and enhanced transparency (Article 24). 'Very Large Online Platforms' and 'Very Large Online Search Engines' exceeding 45 million average monthly active EU users face the full weight of Chapter IV: systemic risk assessments (Article 34), mitigation measures (Article 35), crisis protocols (Article 36), independent audits (Article 37), recommender transparency (Article 38), advertising repositories (Article 39), researcher data access (Article 40), and compliance functions (Article 41). This graduated approach recognizes that different services present different risks and have different capacities - mere conduit providers like ISPs shouldn't bear content liability, while platforms mediating public discourse require accountability mechanisms, and dominant platforms with hundreds of millions of users warrant intensive regulation given systemic impact. Proper classification is therefore the foundational compliance question - misclassification could mean missing critical obligations or unnecessarily implementing inapplicable requirements.
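The graduated tiers described above can be sketched as a classification routine. This is an illustrative sketch only, not a compliance tool: the Service attributes, the helper names, and the decision logic are hypothetical simplifications of what is ultimately a legal assessment.

```python
from dataclasses import dataclass

VLOP_THRESHOLD = 45_000_000  # Article 33: average monthly active EU recipients

@dataclass
class Service:
    # Hypothetical attributes standing in for a real legal analysis
    transmits_or_provides_access: bool = False   # 'mere conduit' (Art. 4)
    caches_for_efficiency: bool = False          # 'caching' (Art. 5)
    stores_at_user_request: bool = False         # 'hosting' (Art. 6)
    disseminates_to_public: bool = False         # platform criterion (point (f))
    dissemination_is_minor_ancillary: bool = False
    avg_monthly_active_eu_recipients: int = 0

def classify(s: Service) -> list[str]:
    """Return the cumulative DSA tiers a service falls into (a sketch)."""
    tiers = []
    if s.transmits_or_provides_access:
        tiers.append("mere conduit")
    if s.caches_for_efficiency:
        tiers.append("caching")
    if s.stores_at_user_request:
        tiers.append("hosting")
        # Platform status requires public dissemination that is not merely ancillary
        if s.disseminates_to_public and not s.dissemination_is_minor_ancillary:
            tiers.append("online platform")
            if s.avg_monthly_active_eu_recipients >= VLOP_THRESHOLD:
                tiers.append("VLOP")
    return tiers

# A social network storing and publicly disseminating user posts
social = Service(stores_at_user_request=True, disseminates_to_public=True,
                 avg_monthly_active_eu_recipients=250_000_000)
print(classify(social))  # ['hosting', 'online platform', 'VLOP']
```

Each tier inherits the obligations of the tiers beneath it, which is why the sketch accumulates labels rather than returning a single category.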

Platform Definition and Ancillary Feature Exclusion: The 'online platform' definition merits careful analysis: hosting services that 'store and disseminate information to the public' qualify unless the activity is 'minor and purely ancillary' to another service. This distinction prevents excessive regulatory burden on services where public communication is incidental. A news website with comment sections stores and disseminates user comments, but if comments are purely supplementary to the news content (the primary service), the news site might not be a platform for regulatory purposes. Similarly, an e-commerce site allowing product reviews stores and disseminates user-generated reviews, but if reviews are ancillary to the primary shopping service, platform obligations might not apply to that feature. However, Recital 13 emphasizes that the ancillary exception is narrow: the feature must be objectively minor (technically incapable of independent use) and its integration must not be a means to circumvent regulation. Social media services can't claim to be primarily something else with communication as ancillary - public dissemination IS their primary function. Recital 14 clarifies that 'dissemination to the public' means making information available to a potentially unlimited number of persons, not just closed groups. An enterprise collaboration platform (like Slack for internal company use) doesn't disseminate publicly even if it hosts information. Cloud storage for personal files (Dropbox, Google Drive) isn't public dissemination unless users actively share publicly. This public/private distinction determines platform status: public dissemination triggers platform obligations; private storage typically does not.

VLOP and VLOSE Designation Mechanics: 'Very large online platform' and 'very large online search engine' are defined by reference to Article 33(4) designation when providers reach or exceed 45 million average monthly active recipients in the EU. This threshold represents approximately 10% of the EU population (circa 450 million). 'Average monthly active recipients' includes anyone who 'engaged with' the platform by requesting information hosting OR being exposed to hosted information disseminated through the platform's interface. This definition is deliberately broad to prevent circumvention - platforms can't exclude passive viewers or logged-out users from counts. A social media platform must count users who post content (requesting hosting) AND users who browse content (exposed to hosted information). YouTube counts uploaders and viewers. Amazon's marketplace counts sellers and shoppers. The 'average' is calculated over the preceding six months, preventing temporary spikes or drops from affecting designation. Article 24(2) requires all platforms and search engines, regardless of size, to publish user numbers at least once every six months (the first deadline was 17 February 2023). The Commission reviews published numbers and issues formal designation decisions identifying specific services as VLOPs/VLOSEs. Current designations (2024) include: Facebook, Instagram, TikTok, X (Twitter), YouTube, LinkedIn, Pinterest, Snapchat as VLOPs for social/content sharing; Amazon Store, Booking.com, Zalando as VLOPs for marketplaces; Google Maps, Google Play, Google Shopping, Wikipedia, AliExpress as VLOPs for other services; and Google Search and Bing as VLOSEs. Adult content platforms Pornhub (Aylo), Stripchat (Technius), and XVideos (WebGroup Czech) are also designated VLOPs. Designation triggers a four-month deadline to achieve full Chapter IV compliance (Article 33(6)). Designation can be revoked if platforms drop below the threshold for a full year, but temporary dips don't end designation immediately.

Illegal Content Definition and Jurisdictional Complexity: 'Illegal content' is defined with deliberate breadth: 'any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law.' This covers criminal content (child sexual abuse material, terrorist propaganda, incitement to violence), civil law violations (defamation, privacy invasion, intellectual property infringement), consumer protection breaches (misleading advertising, unsafe products), and regulatory violations (unauthorized pharmaceutical sales, unlicensed gambling). Importantly, illegality is determined by applicable law - EU-wide illegal content includes CSAM (illegal throughout EU), terrorist content (Terrorist Content Online Regulation), certain hate speech (Framework Decision 2008/913/JHA), and intellectual property infringement (Copyright Directive, Trademark Regulation). However, some content is illegal only in specific Member States - Holocaust denial is criminal in Germany and France but may be legal elsewhere; blasphemy laws vary; defamation standards differ. This creates complexity: is content illegal if prohibited in one Member State but legal in others? Recital 47 clarifies that notices and orders must specify applicable law. A German court order identifying content as illegal under German law obligates removal for German users, but does not necessarily require removal in other Member States where the content is lawful. However, content illegal under EU-wide harmonized law (CSAM, terrorist content) must be removed throughout the Union. Providers must therefore implement geo-aware content moderation, maintain knowledge of EU and national laws, and assess illegality based on relevant jurisdiction - typically the Member State whose law applies (user location, content origin, or harm location depending on conflict of laws rules).
The definition also covers 'information in relation to an activity' - not just the information itself but associated commercial activities. Selling counterfeit products isn't illegal content per se, but product listings facilitating counterfeit sales constitute illegal content under this definition.
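The geo-aware moderation logic described above can be sketched as a scoping function. This is an illustrative simplification under stated assumptions: the country-code set is abbreviated, the function names are hypothetical, and a real system would take its scope from the specific order or legal assessment, not a boolean flag.

```python
# Abbreviated set for illustration; a real system would use the full EU-27 list
EU_MEMBER_STATES = {"AT", "DE", "FR", "IE", "NL", "PL"}

def removal_scope(illegal_under_eu_law: bool, illegal_in: set[str]) -> set[str]:
    """Return the Member States where access should be restricted (a sketch).

    Content illegal under EU-wide harmonised law (e.g. CSAM, terrorist
    content) is restricted throughout the Union; content illegal only under
    national law is restricted in the Member States whose law applies.
    """
    if illegal_under_eu_law:
        return set(EU_MEMBER_STATES)
    # Restrict only where the relevant national law renders it illegal
    return illegal_in & EU_MEMBER_STATES

# Content illegal only under German law is geo-restricted for Germany:
print(removal_scope(illegal_under_eu_law=False, illegal_in={"DE"}))  # {'DE'}
```

The intersection on the last line reflects the commentary's point that a Member-State-specific prohibition does not, by itself, justify Union-wide removal.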

Content Moderation Definition and Operational Scope: The definition of 'content moderation' is remarkably comprehensive, covering 'activities, whether automated or not, undertaken by providers of intermediary services, that are aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions.' This includes both technical measures and editorial decisions, both illegal content and terms violations, and both proactive detection and reactive responses. Specific examples: removal (deleting content entirely); demotion (reducing visibility in feeds or search results); demonetisation (removing advertising revenue or payment features); disabling access (making content unavailable, potentially temporarily or in specific locations); account actions (suspending or terminating user accounts); and restrictions on posting ability (rate limits, shadowbanning, feature restrictions). This broad scope ensures the DSA's content moderation requirements - statement of reasons (Article 17), internal complaints (Article 20), transparency reporting (Article 24) - apply to the full range of platform interventions. Platforms can't evade requirements by using 'soft' moderation (demotion, demonetisation) instead of removal. Automated systems like filters, hash-matching, and AI classifiers constitute content moderation triggering transparency obligations. Human review teams making moderation decisions are content moderation. Hybrid systems combining AI and human review are content moderation. The definition's breadth reflects modern reality: platforms employ numerous techniques beyond simple removal to manage content, and all should be subject to appropriate transparency, accountability, and user rights.
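The breadth of the point (n) definition can be made concrete with a small taxonomy. The grouping below into information-affecting and recipient-affecting measures mirrors the definition's two limbs, but the enum names and grouping are illustrative, not taken from the Regulation.

```python
from enum import Enum

class ModerationAction(Enum):
    REMOVAL = "removal"
    DISABLE_ACCESS = "disabling of access"
    DEMOTION = "demotion"
    DEMONETISATION = "demonetisation"
    POSTING_RESTRICTION = "restriction on ability to provide information"
    ACCOUNT_SUSPENSION = "account suspension"
    ACCOUNT_TERMINATION = "account termination"

# Measures affecting the availability, visibility or accessibility of content
AFFECTS_INFORMATION = {ModerationAction.REMOVAL, ModerationAction.DISABLE_ACCESS,
                       ModerationAction.DEMOTION, ModerationAction.DEMONETISATION}
# Measures affecting the recipient's ability to provide information
AFFECTS_RECIPIENT = {ModerationAction.POSTING_RESTRICTION,
                     ModerationAction.ACCOUNT_SUSPENSION,
                     ModerationAction.ACCOUNT_TERMINATION}

def is_content_moderation(action: ModerationAction) -> bool:
    """Every listed action falls within the point (n) definition, so 'soft'
    measures like demotion cannot be used to sidestep DSA obligations."""
    return action in AFFECTS_INFORMATION or action in AFFECTS_RECIPIENT
```

The practical consequence is that any action in either set triggers the statement-of-reasons and transparency obligations the paragraph above describes.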

Recommender System Definition and Algorithmic Regulation: 'Recommender system' is defined as 'fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed.' This technology-neutral definition captures diverse algorithmic systems: social media feeds (Facebook News Feed, Instagram Feed, TikTok For You Page); video recommendations (YouTube recommendations); search result rankings (Amazon product search, Google Shopping); content discovery features (Spotify recommendations, Netflix suggestions); and notification priorities. The definition includes systems responding to user-initiated searches (e.g., product search ranking algorithms) and systems autonomously pushing content (e.g., TikTok's algorithmic feed). Both 'suggesting' (actively recommending specific content) and 'prioritizing' (determining order of display) qualify. The implications are significant: Article 27 requires platforms to explain recommender system parameters in accessible terms, Article 38 requires VLOPs to provide recommender transparency and offer options not based on profiling (e.g., chronological feeds), and Article 34(1)(b) requires VLOPs to assess systemic risks from recommender systems' design and operation. These obligations recognize that algorithmic curation shapes information ecosystems, affects user behavior, and can amplify harmful content or create filter bubbles. By defining recommender systems broadly, the DSA subjects algorithmic power to transparency and accountability regardless of specific technical implementation.

Digital Services Coordinators and Enforcement Architecture: Article 3 defines 'Digital Services Coordinator' as the authority designated by a Member State under Article 49(1) to supervise DSA compliance. Each Member State must designate one DSC as the competent authority, though other authorities may assist. The 'coordinator of establishment' is the DSC in the Member State where a provider's main establishment is located (or where its legal representative resides for non-EU providers). Ireland's DSC is coordinator of establishment for Meta and Google's European operations; the Dutch DSC for services established in the Netherlands; the German DSC for German-established platforms. The coordinator of establishment has primary supervision responsibility for the provider but coordinates with DSCs in other Member States where the provider operates. This structure aims to provide a single primary authority (preventing fragmentation) while ensuring all affected Member States can participate in supervision. For VLOPs and VLOSEs, the Commission has exclusive competence to supervise enhanced Chapter IV obligations (Article 56), though coordinators retain competence for baseline obligations. Understanding this architecture is critical for compliance: providers must identify their coordinator of establishment, maintain relationships with all relevant DSCs, respond to coordination requests, and recognize Commission authority for VLOP/VLOSE obligations. The European Board for Digital Services, comprising all DSCs and chaired by the Commission (Article 61), facilitates consistent enforcement through guidelines, recommendations, and coordination.

Key Points

  • Establishes tiered intermediary service categories: mere conduit (transmission), caching (temporary storage for efficiency), hosting (storage at user request), online platforms (hosting with public dissemination), and VLOPs/VLOSEs (platforms/search engines exceeding 45 million monthly active EU users)
  • Online platforms defined as hosting services storing AND disseminating information publicly, excluding minor/ancillary features of other services - distinguishing platforms from private storage
  • VLOPs and VLOSEs designated under Article 33 when reaching 45 million average monthly active EU recipients (10% of EU population), triggering most extensive regulatory obligations
  • Illegal content broadly defined as any information non-compliant with EU law or Member State law compliant with EU law - includes criminal content, civil law violations, and regulatory breaches
  • Content moderation comprehensively defined to include automated and manual activities detecting, identifying, and addressing illegal content or terms violations through removal, demotion, demonetisation, access restrictions, or account actions
  • Recommender systems include any automated systems suggesting, prioritizing, or ordering information presentation - covering algorithms, search rankings, feeds, and content organization
  • Active recipients of platforms are users who requested information hosting OR were exposed to hosted information - capturing both content creators and passive consumers for user counting
  • Definitions employ technology-neutral language enabling regulation to adapt to evolving services while providing legal certainty through precise terminology

Practical Application

Service Classification in Practice - Case Examples: Proper classification requires analyzing specific service features. Gmail: Provides hosting (stores emails at user request) but typically not a platform because emails aren't disseminated publicly - it's private communication. However, if Google operates public email lists or forums, those could be platforms. Instagram: Clearly a platform - stores user photos/videos (hosting) and disseminates publicly. With over 250 million EU users, it's a designated VLOP subject to full obligations. Dropbox: Provides hosting (stores files) but typically not a platform because files are stored privately. However, Dropbox's public sharing features might create platform status for publicly-shared content. Cloudflare: Primarily caching (CDN services) and mere conduit (DNS, security services). Falls under Article 5 for caching, with limited DSA obligations. Reddit: Platform (hosts and publicly disseminates user posts/comments). Approaching VLOP threshold if EU users near 45 million. Slack: Hosts information but disseminates to closed enterprise groups, not public - likely not a platform due to private nature. GitHub: Platform for code repositories - stores and publicly disseminates code, issues, discussions. WordPress.com: Provides hosting for blogs; blogs publicly disseminate content, making WordPress a platform. Individual self-hosted WordPress sites are separate. Amazon AWS: Infrastructure hosting, typically not a platform because customers control what they host and whether it's public. Amazon's retail marketplace IS a platform. TikTok: Designated VLOP, stores and disseminates short videos publicly, exceeds 100 million EU users. These classifications determine compliance obligations - hosting providers implement Articles 16-17; platforms add Articles 20-24; VLOPs implement full Chapter IV.

VLOP User Counting Methodology: Platforms approaching 45 million EU users must implement accurate counting methodologies. The DSA's recitals and Commission guidance indicate counting methods: include all users who engaged with the platform in the counting period, whether logged-in users, logged-out users, or guest users; count unique users (same person using multiple accounts counts once if identifiable, but platforms may count conservatively treating separate accounts as separate users); include users who posted content (requested hosting) or viewed content (exposed to hosted information); exclude bots and fraudulent accounts identified and removed; calculate average over six months to smooth fluctuations; count only EU users (requiring geo-location based on registration data, IP addresses, or other indicators); and publish numbers at least once every six months (Article 24(2)). Practical implementation: Data Collection: Analytics systems must track monthly active users by geography. Bot Detection: Implement systems identifying non-human accounts to exclude from counts. Privacy Compliance: Ensure user counting complies with GDPR - aggregated statistics generally permissible, but detailed user tracking requires legal basis. Geographic Attribution: Determine user location through registration information (addresses, phone numbers), connection data (IP geolocation), language preferences, or payment information. Edge Cases: Travelers create complexity - an EU resident traveling abroad might appear as a non-EU user based on IP address. VPN users may appear in wrong locations. Platforms should document methodologies and assumptions. Publication: At least every six months, publish user numbers through points of contact, potentially in transparency reports, or directly to the Commission. Verification: Commission may request supporting data or methodology details to verify accuracy. Under-reporting risks penalties; over-reporting may trigger unnecessary designation.
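The counting steps above (de-duplication, bot exclusion, six-month averaging) can be sketched as a small pipeline. All names are hypothetical, and the sketch assumes the per-month user sets have already been attributed to the EU and de-duplicated across accounts, both of which are the hard parts in practice.

```python
def average_monthly_active_eu(monthly_user_sets: dict[str, set[str]],
                              bot_ids: set[str]) -> float:
    """Average monthly active EU recipients over six months (a sketch).

    monthly_user_sets maps month labels to sets of de-duplicated user IDs
    already attributed to the EU; bot_ids are accounts identified as
    non-human and therefore excluded from the count.
    """
    if len(monthly_user_sets) != 6:
        raise ValueError("expected exactly six months of data")
    # Exclude bots, then count unique users per month
    totals = [len(users - bot_ids) for users in monthly_user_sets.values()]
    return sum(totals) / 6

# Six months of synthetic data: month m has users u0 .. u(99+m)
months = {f"2024-{m:02d}": {f"u{i}" for i in range(100 + m)} for m in range(1, 7)}
print(average_monthly_active_eu(months, bot_ids={"u0", "u1"}))  # 101.5
```

Set subtraction makes the bot exclusion explicit and auditable, which matters because the methodology and its assumptions should be documented for Commission verification.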

Illegal Content Determination and Moderation Decisions: The broad illegal content definition requires platforms to assess legality across multiple jurisdictions and legal domains. Practical approach: EU-Wide Clearly Illegal Content: Child sexual abuse material (CSAM) - illegal throughout EU, requires immediate removal and reporting to law enforcement under applicable Union and national laws. Terrorist content - Terrorist Content Online Regulation requires removal within one hour of a removal order. Certain hate speech - Framework Decision 2008/913/JHA criminalizes incitement to violence/hatred based on race, religion, ethnicity across EU. Copyright infringement - Copyright Directive provides EU-wide protection requiring takedown of infringing content. Member State Specific Illegality: Holocaust denial - criminal in Germany, France, Austria; may be legal expression elsewhere. Platforms facing removal orders from German authorities should remove for German users but may maintain availability elsewhere unless ordered otherwise. Blasphemy - varies by Member State; platform must apply relevant jurisdiction's law. Defamation - standards differ; must assess under applicable national law. Civil vs. Criminal: Illegal content includes both criminal content (requiring law enforcement cooperation) and civil violations (addressed through notice-and-action and judicial proceedings). Legal Assessment Process: When receiving notices under Article 16, platforms should: verify notice completeness (Article 16 specifies required elements); assess apparent illegality based on provided information and platform's knowledge; consider applicable jurisdiction (which Member State's law applies); for complex cases, seek legal advice or await judicial determination; and document decision-making process for transparency reporting and potential appeals. Platforms aren't courts and can't make definitive illegality determinations, but must make good-faith assessments based on available information.
Obvious illegality (CSAM, terrorist propaganda with clear indicators, blatant copyright infringement with rights holder notice) warrants expeditious action. Borderline cases (satire vs. defamation, fair use vs. infringement, political speech vs. hate speech) may warrant human review, consideration of fundamental rights, and conservative approaches favoring expression when illegality isn't clear.

Content Moderation Documentation and Transparency: The comprehensive content moderation definition requires extensive documentation for transparency reporting (Article 15) and statement of reasons (Article 17). Platforms should implement: Moderation Logging: Database recording all content moderation actions including content identifier, action taken (removal, demotion, demonetization, account restriction), action date/time, decision-maker (automated system name, human moderator ID), legal basis (Article 16 notice, Article 9 order, terms violation, proactive detection), illegality category if applicable (hate speech, CSAM, copyright infringement, etc.), and user notification status. Automated System Documentation: For AI/automated moderation, document system purpose, operation methodology, accuracy rates, error handling, and human review triggers. Article 15(1)(d) requires transparency about automated means used for content moderation. Terms Violation Categories: Categorize terms violations for reporting - violence, adult content, harassment, spam, impersonation, etc. Action Types: Track different moderation actions separately - complete removal differs from demotion which differs from demonetization; transparency reports should distinguish. Appeal Handling: Log internal complaints under Article 20, decisions on complaints, reversal rates, and out-of-court dispute resolution outcomes. Reporting Compilation: Aggregate data for annual transparency reports showing volumes, categories, outcomes, and processing times. Statement of Reasons: Generate clear explanations for each moderation action as required by Article 17 - must explain decision, specify applicable rule, reference complaint mechanisms, and inform users of redress options. Exemptions: Very small providers may have simplified requirements, but larger platforms and VLOPs must maintain comprehensive systems. 
Audit Trail: Documentation supports Article 37 audits for VLOPs and enables compliance verification by supervisory authorities.
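The logging fields listed above can be gathered into a single record type that also drives the Article 17 statement of reasons. This is a minimal sketch: the field names, categories, and the wording of the generated statement are illustrative, not mandated by the Regulation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModerationLogEntry:
    """One content moderation action, capturing the fields that a statement
    of reasons (Art. 17) and transparency reports (Arts. 15/24) draw on."""
    content_id: str
    action: str                  # e.g. "removal", "demotion", "demonetisation"
    legal_basis: str             # e.g. "Article 16 notice", "terms violation"
    category: str                # e.g. "hate speech", "copyright", "spam"
    decided_by: str              # automated system name or human moderator ID
    automated: bool
    user_notified: bool = False
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def statement_of_reasons(e: ModerationLogEntry) -> str:
    """Generate a minimal Article 17-style explanation (a sketch)."""
    means = "by automated means" if e.automated else "by a human reviewer"
    return (f"Your content {e.content_id} was subject to: {e.action}. "
            f"Basis: {e.legal_basis} ({e.category}). Decision made {means}. "
            "You may appeal through the internal complaint system (Art. 20) "
            "or an out-of-court dispute settlement body (Art. 21).")

entry = ModerationLogEntry("post-123", "demotion", "terms violation",
                           "spam", "spam-classifier-v2", automated=True)
print(statement_of_reasons(entry))
```

Keeping the log record and the statement generator in one place means every 'soft' action (demotion, demonetisation) produces the same audit trail as a removal, which is exactly what the broad content moderation definition requires.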

Recommender System Transparency and User Choice: Articles 27 and 38 impose recommender system requirements on platforms: Explanation Requirements (Article 27): Terms and conditions must explain main parameters determining information suggested/prioritized - for YouTube, explain how watch history, search history, channel subscriptions, and engagement signals influence recommendations; for Instagram, explain how follows, likes, comments, shares, and watch time affect feed ordering; for TikTok, explain For You Page algorithm factors. Explanations must be 'in plain and intelligible language' accessible to average users, not technical documentation. VLOP Additional Requirements (Article 38): VLOPs must provide at least one option for each recommender system not based on profiling - chronological feeds, random ordering, or user-selected criteria. Allow users to modify or influence parameters recommender systems use. Provide easily accessible functionality to enable/disable profiling-based recommendations. Practical Implementations: Instagram offers 'Following' feed showing posts chronologically from followed accounts alongside algorithmic 'Home' feed. Twitter/X provides 'For You' (algorithmic) and 'Following' (chronological) tabs. TikTok could offer chronological or topic-based alternatives to For You Page. YouTube might offer subscription-based chronological view alongside recommendations. User Controls: Settings allowing users to indicate topic interests, block certain content types, reset recommendation training, or view/delete data used for recommendations. Documentation: Maintain internal documentation of recommender system design, parameters, and tuning for Article 37 audits and Article 34 risk assessments examining algorithmic amplification effects. Testing: Evaluate whether recommender alternatives are genuinely meaningful (not deliberately degraded to push users toward preferred option) and actually available (not hidden in obscure settings).
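The Article 38 opt-out described above can be sketched as a ranking function with a non-profiling fallback. The item fields and the personalised score are hypothetical placeholders; a real recommender is far more complex, but the structural point is the same: the non-profiling path must be a genuine alternative ordering.

```python
def rank_feed(items: list[dict], profiling_enabled: bool) -> list[dict]:
    """Order feed items either by a (hypothetical) personalised score or,
    when the user opts out of profiling (Art. 38), by recency alone."""
    if profiling_enabled:
        return sorted(items, key=lambda i: i["personal_score"], reverse=True)
    # Non-profiling alternative: reverse-chronological ordering
    return sorted(items, key=lambda i: i["posted_at"], reverse=True)

items = [
    {"id": "a", "personal_score": 0.9, "posted_at": "2024-01-01T10:00"},
    {"id": "b", "personal_score": 0.2, "posted_at": "2024-01-02T09:00"},
]
print([i["id"] for i in rank_feed(items, profiling_enabled=False)])  # ['b', 'a']
print([i["id"] for i in rank_feed(items, profiling_enabled=True)])   # ['a', 'b']
```

Because the two paths produce visibly different orderings from the same inputs, the toggle is testable - which supports the commentary's point that alternatives must be genuinely meaningful rather than cosmetically different.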

Coordinator Relationship Management and Enforcement Coordination: Providers must establish relationships with relevant Digital Services Coordinators. Identify Coordinator of Establishment: Provider's main establishment location determines primary DSC - Meta's Irish headquarters means Irish DSC is coordinator of establishment. Non-EU providers' legal representative location determines coordinator - if TikTok designates Irish representative, Irish DSC coordinates. Designate Points of Contact: Article 11 requires electronic point of contact for authorities - ensure coordinator of establishment and all other DSCs can reach designated contact. Respond to Information Requests: DSCs may request information about services, users, operations, or compliance measures - maintain processes to respond promptly and completely. Coordinate Cross-Border Issues: When French users report illegal content on Irish-established platform, French DSC may contact Irish DSC (coordinator of establishment) requesting action - platform receives requests through coordinator of establishment but must consider all affected Member States. Commission Relationship for VLOPs: VLOPs face Commission supervision for Chapter IV obligations - designate separate contact channel for Commission, respond to Commission information requests within specified deadlines (often tight), cooperate with Commission investigations and audits, and implement Commission decisions on compliance measures or fines. Board Participation: While providers don't directly participate in European Board for Digital Services, Board issues guidelines affecting compliance - monitor Board recommendations, opinions, and guidance documents. 
Multi-Authority Coordination: Large providers may simultaneously interact with multiple DSCs (for different Member State operations), the Commission (for VLOP obligations), data protection authorities (for GDPR), consumer protection authorities, and competition authorities - implement internal coordination ensuring consistent responses and flagging conflicts between different authority requirements.