Article 1

Subject matter

1. The aim of this Regulation is to contribute to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter, including the principle of consumer protection, are effectively protected.

2. This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:

(a) a framework for the conditional exemption from liability of providers of intermediary services;

(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;

(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.

Understanding This Article

Foundational Purpose and Legal Context: Article 1 serves as the cornerstone provision of Regulation (EU) 2022/2065, known as the Digital Services Act (DSA), establishing its fundamental objectives and regulatory scope. Adopted on 19 October 2022 and fully applicable from 17 February 2024, the DSA represents the most comprehensive reform of digital services regulation in over two decades, replacing and updating the liability framework established by the e-Commerce Directive 2000/31/EC. The regulation recognizes that the digital landscape has fundamentally transformed since 2000: social media platforms, video-sharing services, app stores, online marketplaces, cloud services, and search engines have become central to economic activity, democratic discourse, and social interaction. However, these services also created unprecedented challenges around illegal content distribution, systemic risks to democracy and public discourse, algorithmic amplification of harmful content, disinformation campaigns, and the concentration of platform power affecting millions of users.

Three-Pillar Regulatory Framework: The DSA establishes a comprehensive three-pillar approach to intermediary service regulation. First, it maintains and codifies the conditional liability exemptions for 'mere conduit' (Article 4), 'caching' (Article 5), and 'hosting' (Article 6) services, ensuring intermediaries aren't automatically liable for user-generated content. These exemptions protect the internet's foundational infrastructure - ISPs transmitting data, CDNs improving performance, and platforms hosting user content - from crushing liability that would make their business models impossible. However, these exemptions are conditional: providers must act expeditiously upon gaining actual knowledge of illegal content, and the exemptions don't prevent court-ordered injunctions. Second, the DSA introduces graduated due diligence obligations that scale proportionately with provider size and societal impact. All intermediary services face baseline transparency and contact requirements (Articles 11-15). Hosting services must implement notice-and-action mechanisms and transparency reporting (Articles 16-17). Online platforms face additional obligations including internal complaint systems, out-of-court dispute resolution, and trusted flagger programs (Articles 20-22). Very Large Online Platforms and Very Large Online Search Engines exceeding 45 million average monthly active EU users face the most stringent requirements: annual systemic risk assessments, risk mitigation measures, independent audits, compliance functions, crisis response mechanisms, and researcher data access (Articles 34-42). 
Third, the regulation establishes a sophisticated enforcement architecture involving Digital Services Coordinators in each Member State coordinating national enforcement, the European Board for Digital Services ensuring consistent application, and exclusive Commission competence to supervise VLOPs/VLOSEs for their enhanced obligations, with penalties reaching up to 6% of global annual turnover.

Full Harmonisation and Internal Market Integration: Article 1's reference to 'harmonised rules' signals that the DSA employs maximum harmonisation, not minimum harmonisation. Member States cannot adopt additional national requirements in areas covered by the DSA beyond what the regulation explicitly permits. This creates uniform rules across the EU internal market, preventing fragmentation where platforms face 27 different regulatory regimes. Full harmonisation provides legal certainty for businesses operating cross-border, enables economies of scale for compliance investments, and ensures users receive consistent protection regardless of location. This approach contrasts with the e-Commerce Directive's minimum harmonisation, which allowed Member States to impose stricter rules, creating regulatory fragmentation. However, the DSA explicitly preserves national competence in areas outside its scope, such as criminal law provisions addressing specific types of illegal content, competition law, data protection (GDPR continues to apply fully), consumer protection law (though the DSA coordinates with it), and intellectual property enforcement (the Copyright Directive remains applicable).

Fundamental Rights Integration: The explicit reference to protecting 'fundamental rights enshrined in the Charter' reflects the DSA's constitutional dimension. The regulation must be interpreted and applied consistently with the Charter of Fundamental Rights of the European Union, particularly Articles 7 (privacy), 8 (data protection), 11 (freedom of expression and information), 16 (freedom to conduct business), 21 (non-discrimination), and 38 (consumer protection). This creates a complex balancing exercise: content moderation protecting users from illegal and harmful content must not disproportionately restrict lawful speech; transparency requirements must respect business confidentiality and data protection; enforcement actions must observe due process and proportionality. Recital 9 emphasizes that the DSA aims to ensure that the digital environment remains a safe space for the exercise of fundamental freedoms, particularly freedom of expression and information, freedom and pluralism of the media, and other rights and freedoms guaranteed by the Charter, including the right to respect for private and family life, the right to the protection of personal data, the right to non-discrimination and the rights of the child. This fundamental rights framework distinguishes European digital regulation from approaches in other jurisdictions that may prioritize different values.

Relationship to Other EU Legislation: While Article 1 establishes the DSA's scope, the regulation operates within a broader ecosystem of EU digital and consumer law. The General Data Protection Regulation (GDPR) governs personal data processing - the DSA complements GDPR by addressing platform governance, content moderation, and systemic risks, while GDPR continues governing data protection. The ePrivacy Directive addresses electronic communications privacy. The Digital Markets Act (DMA) targets anti-competitive practices by 'gatekeepers' with significant market power - many VLOPs are also DMA gatekeepers, facing both regimes. The Copyright Directive, particularly Article 17, addresses copyright liability for user-uploaded content on sharing platforms - it coordinates with but is not replaced by the DSA. Sector-specific instruments such as the Audiovisual Media Services Directive (AVMSD), the Platform-to-Business Regulation, and the Payment Services Directive continue to apply in their domains. The DSA also references and coordinates with the Terrorist Content Online Regulation, which mandates one-hour removal of terrorist content. Understanding these interactions is crucial for comprehensive compliance.

Innovation and Proportionality: Article 1's dual commitment to safety and innovation reflects a deliberate policy choice: regulation should address harms without stifling legitimate business models, technological development, or market entry. The graduated approach ensures small startups aren't crushed by compliance costs designed for tech giants. Article 19 exempts micro and small enterprises (fewer than 50 employees and annual turnover not exceeding €10 million) from the online platform obligations, and Article 15(2) and other provisions exempt smaller providers from certain reporting requirements. VLOPs/VLOSEs face extensive obligations precisely because their scale creates systemic risks justifying intensive regulation, while smaller platforms face proportionate requirements. This approach contrasts with one-size-fits-all regulation that might advantage incumbents by creating barriers to entry. The innovation principle also requires interpreting obligations in technologically neutral ways, not mandating specific solutions but allowing providers to achieve compliance through various means suited to their services.

Key Points

  • Establishes comprehensive, fully harmonised EU-wide rules for intermediary services to ensure legal certainty and consistent application across all Member States
  • Creates a graduated regulatory framework with obligations scaled to service type, size, and systemic importance - from basic requirements for all intermediaries to extensive obligations for Very Large Online Platforms
  • Maintains and modernizes the conditional liability exemption framework from the e-Commerce Directive (Articles 12-14) through Articles 4-6 DSA, protecting intermediaries from automatic liability for user-generated content
  • Introduces tailored due diligence obligations addressing illegal content, systemic risks, transparency, and accountability - particularly stringent requirements for platforms exceeding 45 million EU users
  • Establishes comprehensive enforcement architecture involving Digital Services Coordinators at national level, the European Board for Digital Services for coordination, and exclusive Commission supervision of VLOPs/VLOSEs
  • Protects fundamental rights enshrined in the Charter including freedom of expression, privacy, data protection, consumer protection, and non-discrimination while enabling necessary content moderation
  • Facilitates innovation and cross-border digital business growth through regulatory clarity, proportionate obligations exempting micro/small enterprises from certain requirements, and single market harmonisation
  • Addresses contemporary digital challenges including illegal content dissemination, disinformation, algorithmic amplification, systemic risks, and platform power that emerged since the 2000 e-Commerce Directive

Practical Application

Service Classification and Obligation Determination: The first practical step for any intermediary service provider is determining which DSA category applies, as this determines all applicable obligations. A company providing ISP services transmitting user data qualifies as a 'mere conduit' (Article 4), facing minimal DSA obligations - primarily points of contact (Articles 11-12) and transparency reporting (Article 15). A Content Delivery Network like Cloudflare or Akamai performing caching qualifies under Article 5, adding requirements to comply with update rules and expeditiously remove cached content when source content is removed. A cloud storage service like Dropbox storing user files qualifies as 'hosting' under Article 6, requiring notice-and-action mechanisms (Article 16) and statements of reasons for content restrictions (Article 17). A social media platform like Instagram publicly disseminating user content qualifies as an 'online platform,' triggering Section 3 obligations including internal complaint systems (Article 20), out-of-court dispute resolution (Article 21), and additional transparency reporting (Article 24). If that platform exceeds 45 million average monthly active EU users, it becomes a VLOP, triggering Chapter IV's extensive requirements including risk assessments (Article 34), mitigation measures (Article 35), independent audits (Article 37), a compliance function (Article 41), and crisis protocols (Article 36). Many providers offer multiple service types - Amazon provides hosting (cloud storage through AWS), operates an online platform (its marketplace), and likely provides caching services - and each service type triggers corresponding obligations, requiring careful analysis of which obligations apply to which services.
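The decision order described above can be sketched as a simple procedure. This is purely illustrative - the function and flag names are this sketch's own invention, and real classification requires legal analysis of each service - but it captures the idea that the most specific matching category controls:

```python
# Illustrative only: encodes the decision order described in the text.
# DSA classification in practice requires legal analysis per service.

VLOP_THRESHOLD = 45_000_000  # average monthly active EU recipients

def classify_service(transmits: bool, caches: bool, stores: bool,
                     publicly_disseminates: bool,
                     monthly_active_eu_users: int = 0) -> str:
    """Return the most specific DSA category that matches.

    Categories run from general to specific: mere conduit (Art. 4),
    caching (Art. 5), hosting (Art. 6), online platform (Section 3),
    VLOP (Chapter IV enhanced obligations).
    """
    if stores and publicly_disseminates:
        if monthly_active_eu_users >= VLOP_THRESHOLD:
            return "VLOP"             # Arts. 34-42 apply on top
        return "online platform"      # Arts. 20-24 apply on top
    if stores:
        return "hosting"              # Arts. 16-17 apply on top
    if caches:
        return "caching"              # Art. 5 exemption
    if transmits:
        return "mere conduit"         # Art. 4 exemption
    return "out of scope"
```

For a multi-service provider like the Amazon example, such a check would be applied per service offered, not once per company.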

VLOP/VLOSE Designation and Compliance Timeline: Platforms and search engines reaching or approaching 45 million average monthly active EU users must prepare for VLOP/VLOSE designation. The threshold represents approximately 10% of the EU population (about 450 million). 'Average monthly active recipients' includes anyone engaging with the service at least once per month, including logged-in users, guest users, and app users. Providers must publish user numbers at least once every six months (Article 24(2)). The Commission reviews published numbers and issues formal designation decisions. Designated VLOPs include: Facebook, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, X (formerly Twitter), YouTube, Amazon Store, Booking.com, Google Maps, Google Play, Google Shopping, Zalando, Wikipedia, AliExpress, and adult content platforms Pornhub, Stripchat, and XVideos. Designated VLOSEs include Google Search and Bing. Upon designation, providers have four months to achieve full compliance with the enhanced obligations (Article 33(6)). This tight timeline requires advance preparation: establishing risk assessment methodologies, implementing mitigation tools, recruiting compliance officers, developing audit procedures, creating researcher data access protocols, and documenting all processes. Failure to comply exposes providers to Commission enforcement proceedings and fines of up to 6% of global annual turnover - potentially billions for the largest platforms.
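The two numerical rules in this paragraph - the 45 million-user threshold and the four-month compliance window - lend themselves to a mechanical check. A minimal sketch, assuming 'four months' means four calendar months (the helper names are invented for illustration):

```python
from calendar import monthrange
from datetime import date

EU_POPULATION = 450_000_000
VLOP_THRESHOLD = EU_POPULATION // 10   # 45,000,000 recipients

def over_designation_threshold(avg_monthly_active_recipients: int) -> bool:
    """Check the 45M average-monthly-active-recipients threshold."""
    return avg_monthly_active_recipients >= VLOP_THRESHOLD

def compliance_deadline(designation_notified: date) -> date:
    """Four calendar months after notification of the designation
    decision, clamped to the last valid day of the target month."""
    months = designation_notified.month - 1 + 4
    year = designation_notified.year + months // 12
    month = months % 12 + 1
    day = min(designation_notified.day, monthrange(year, month)[1])
    return date(year, month, day)
```

For example, a designation notified on 25 April 2023 would, under this reading, require full compliance by 25 August 2023.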

Compliance Program Architecture for Different Provider Sizes: Micro and small enterprises are exempt from the online platform obligations (Article 19) and from transparency reporting (Article 15(2)), but must still meet the baseline requirements: designating points of contact, publishing clear terms of service, and maintaining minimal records. Medium-sized hosting providers (e.g., a blog platform with 100 employees serving multiple Member States) must implement: electronic points of contact for authorities (Article 11) and users (Article 12), legal representatives if non-EU (Article 13), clear and accessible terms explaining content policies and moderation procedures (Article 14), annual transparency reports on content moderation activities (Article 15), notice-and-action mechanisms for illegal content reports (Article 16), and statements of reasons when restricting content (Article 17). Large platforms not yet VLOPs (e.g., a social network with 30 million EU users) add: internal complaint systems allowing users to challenge moderation decisions (Article 20), engagement with certified out-of-court dispute resolution bodies (Article 21), trusted flagger programs prioritizing notices from specially vetted entities (Article 22), procedures to address misuse of reporting systems (Article 23), and enhanced transparency reporting including decision volumes, response times, and outcomes (Article 24). 
VLOPs implement comprehensive governance: annual systemic risk assessments identifying and analyzing risks related to illegal content dissemination, fundamental rights impacts, electoral process integrity, minors' wellbeing, mental health impacts, gender-based violence, and public security (Article 34); reasonable, proportionate, and effective mitigation measures addressing identified risks (Article 35); crisis response mechanisms activated during public emergencies (Article 36); independent audits conducted at least annually (Article 37); recommender system transparency and user choice (Articles 27, 38); online interface design ensuring users can make free and informed decisions (Article 25); advertising transparency including publicly available repositories of all ads (Article 39); researcher data access enabling vetted researchers to study systemic risks (Article 40); a compliance function led by an independent compliance officer reporting directly to management and supervisory bodies (Article 41); and additional reporting covering implementation of all risk mitigation measures.
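The cumulative structure running through these tiers - each layer adding to, never replacing, the one below - can be made explicit. A hypothetical sketch (the tier labels and abbreviated article tags are this sketch's own shorthand, not an exhaustive statement of the law):

```python
# Each due diligence tier in Chapter III inherits every obligation of
# the tiers below it; article tags here are abbreviated labels only.
TIERS: list[tuple[str, list[str]]] = [
    ("all intermediaries", ["Arts. 11-13 contacts/representative",
                            "Art. 14 terms", "Art. 15 transparency"]),
    ("hosting",            ["Art. 16 notice-and-action",
                            "Art. 17 statement of reasons"]),
    ("online platform",    ["Art. 20 complaints", "Art. 21 ODR",
                            "Art. 22 trusted flaggers",
                            "Art. 24 enhanced transparency"]),
    ("VLOP",               ["Art. 34 risk assessment",
                            "Art. 35 mitigation",
                            "Art. 36 crisis response", "Art. 37 audits",
                            "Art. 40 researcher access",
                            "Art. 41 compliance function"]),
]

def obligations_for(tier: str) -> list[str]:
    """Accumulate obligations from the base tier up to `tier`."""
    acc: list[str] = []
    for name, articles in TIERS:
        acc.extend(articles)
        if name == tier:
            return acc
    raise ValueError(f"unknown tier: {tier!r}")
```

On this model, obligations_for("online platform") still contains the hosting-tier notice-and-action duty, which is why the 30-million-user social network in the example above continues to operate Article 16 mechanisms alongside its Section 3 obligations.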

Cross-Border Operations and Legal Representative Requirements: Non-EU providers offering services to EU users must designate a legal representative within the EU (Article 13). A single representative suffices, located in one of the Member States where the provider offers its services, but that representative must be accessible to the authorities of all Member States. The legal representative acts as the provider's agent for DSA matters, authorized to receive and respond to orders, decisions, notices, and communications from authorities, Digital Services Coordinators, the Commission, the Board, and users. The representative doesn't shield the provider from liability - Article 13 makes clear that providers remain fully responsible for compliance, and the representative itself can be held liable for non-compliance. Practical implementation involves selecting an appropriate legal entity or law firm in the EU with sufficient capacity and expertise, executing a formal written mandate empowering them to act on DSA matters, ensuring they can be contacted electronically and by post, publishing their contact details publicly, establishing communication channels between the representative and the provider's global compliance teams, and providing representatives with resources and authority to respond effectively. US companies like Meta, Google, Microsoft, and Amazon typically designate their Irish or Dutch subsidiaries as legal representatives given existing corporate presence, but could also engage law firms. Chinese companies like TikTok and Shein must establish EU legal representation specifically for DSA compliance.

Fundamental Rights Compliance in Content Moderation: Article 1's commitment to protecting fundamental rights creates practical obligations affecting content moderation policies and enforcement. When drafting terms of service under Article 14, providers must ensure content restrictions respect freedom of expression - policies can't be so vague that lawful speech is chilled. When implementing notice-and-action under Articles 16-17, providers must balance removal of illegal content against wrongful takedowns suppressing legitimate expression - this requires careful review procedures, human oversight for complex cases, and clear appeals processes. When conducting Article 34 risk assessments, VLOPs must analyze fundamental rights impacts including whether content moderation disproportionately affects particular groups (raising non-discrimination concerns), whether recommender systems amplify harmful content affecting minors or mental health, whether advertising practices violate privacy or exploit vulnerabilities, and whether platform design manipulates user choices. Mitigation measures under Article 35 must address these risks proportionately - if assessment reveals recommender algorithms amplify election misinformation, mitigation might include reducing viral amplification during electoral periods, providing chronological feed options, or adding context to disputed claims. Throughout, providers must document fundamental rights considerations, conduct impact assessments for new features or policy changes, and be prepared to justify decisions to users, oversight bodies, and courts. The European Data Protection Board (EDPB) has published guidelines on DSA-GDPR interplay, emphasizing that personal data processing for content moderation must comply with GDPR's lawfulness, fairness, transparency, and data minimization principles while pursuing DSA's content moderation objectives.

Enforcement Interaction and Authority Cooperation: The Article 1(2)(c) enforcement framework creates practical implications for providers. Digital Services Coordinators (DSCs) designated by each Member State have primary supervision authority over intermediaries established in their territory: Ireland's DSC supervises Meta's and Google's European headquarters, the Dutch DSC supervises services established in the Netherlands, and the German DSC supervises services established in Germany. However, DSCs have limited competence over services established elsewhere - they can require action affecting their territory but must coordinate with the establishment-state DSC for broader enforcement. The European Commission has exclusive competence to supervise VLOPs and VLOSEs for the enhanced obligations under Chapter IV. This means Meta faces Commission supervision for risk assessments, audits, and VLOP-specific requirements, but Irish DSC supervision for baseline obligations like notice-and-action. Providers must therefore maintain relationships with multiple authorities: their establishment-state DSC as primary supervisor, DSCs in other Member States where they operate, the Commission if designated as a VLOP/VLOSE, and potentially other competent authorities (e.g., data protection authorities for GDPR, consumer protection authorities, competition authorities). Practical coordination includes: establishing clear authority contact channels, designating internal teams responsible for each authority relationship, implementing systems to track and respond to orders from multiple authorities, ensuring legal representatives can coordinate responses, and participating in DSC coordination mechanisms to ensure consistent compliance across the EU.