Chapter 3

Due Diligence Obligations for a Transparent and Safe Online Environment

Sets out obligations for all intermediary services, with additional requirements for hosting services, online platforms, and very large online platforms and search engines, to ensure transparency, accountability, and user protection.

Overview

Chapter III: Due Diligence Obligations

This is the core operational chapter of the DSA, establishing a tiered system of obligations based on the type and size of service.

Section 1: Provisions Applicable to All Intermediary Services (Articles 11-15):

  • Points of contact for authorities and for recipients of the service
  • Legal representatives for non-EU providers
  • Clear, accessible terms and conditions
  • Annual transparency reports on content moderation

Section 2: Additional Provisions for Hosting Services (Articles 16-18):

  • Notice and action mechanisms for reporting illegal content
  • Statement of reasons when removing content or suspending accounts
  • Notification to law enforcement of suspected serious crimes

Section 3: Additional Provisions for Online Platforms (Articles 19-28):

  • Internal complaint-handling systems
  • Out-of-court dispute settlement
  • Trusted flagger status and priority treatment
  • Measures against misuse and repeat offenders
  • Additional transparency reporting
  • Advertising transparency requirements
  • Recommender system transparency and user choice
  • Online protection of minors, including a ban on profiling-based advertising to minors
  • Prohibition of deceptive interface design ("dark patterns")

Section 4: Online Platforms Allowing Distance Contracts with Traders (Articles 29-32):

  • Trader traceability (KYBC - Know Your Business Customer)
  • Compliance by design for trader obligations
  • Right to information for consumers who have purchased an illegal product or service

Section 5: Very Large Online Platforms and Search Engines (Articles 33-43):

  • Systemic risk assessments (illegal content, fundamental rights impacts, civic discourse and electoral processes, public security, gender-based violence, public health, minors' protection, physical and mental well-being)
  • Risk mitigation measures
  • Crisis response mechanism
  • Independent compliance auditing
  • Additional transparency requirements
  • Access to data for researchers and authorities
  • Compliance function
  • Supervisory fees

Section 6: Standards, Codes of Conduct and Crisis Protocols (Articles 44-48):

  • Voluntary European and international standards for technical implementation
  • Voluntary codes of conduct to apply DSA provisions, including specific codes for online advertising and accessibility
  • Crisis protocols for public security and public health emergencies

Articles in This Chapter

  • Article 11: Points of contact for Member State authorities, the Commission and the Board

    Article 11 of DSA Chapter III Section 1 - requires providers to designate points of contact for authorities

  • Article 12: Points of contact for recipients of the service

    Article 12 of DSA Chapter III Section 1 - requires providers to enable users to contact them directly

  • Article 13: Legal representatives

    Article 13 of DSA Chapter III Section 1 - requires non-EU providers to designate legal representatives in the Union

  • Article 14: Terms and conditions

    Article 14 of DSA Chapter III Section 1 - requires providers to have clear, accessible terms and conditions including content moderation policies

  • Article 15: Transparency reporting obligations for providers of intermediary services

    Article 15 of DSA Chapter III Section 1 - requires all intermediary services to publish annual transparency reports

  • Article 16: Notice and action mechanisms

    Article 16 of DSA Chapter III Section 2 - THE CRITICAL provision establishing standardized mechanisms for users to report illegal content to hosting providers

  • Article 17: Statement of reasons

    Article 17 of DSA Chapter III Section 2 - requires hosting providers to explain content moderation decisions to affected users

  • Article 18: Notification of suspicions of criminal offences

    Article 18 of DSA Chapter III Section 2 - requires hosting providers to notify law enforcement when they suspect serious criminal offences threatening life or safety

  • Article 19: Exclusion for micro and small enterprises

    Article 19 of DSA Chapter III Section 3 - exempts micro and small platforms from most platform-specific obligations to reduce compliance burden

  • Article 20: Internal complaint-handling system

    Article 20 of DSA Chapter III Section 3 - THE CRITICAL provision requiring platforms to provide appeals mechanisms for content moderation decisions

  • Article 21: Out-of-court dispute settlement

    Article 21 of DSA Chapter III Section 3 - establishes independent, certified bodies to resolve disputes between users and platforms when internal appeals fail

  • Article 22: Trusted flaggers

    Article 22 of DSA Chapter III Section 3 - creates status for expert entities whose illegal content reports receive priority processing by platforms

  • Article 23: Measures against misuse

    Article 23 of DSA Chapter III Section 3 - authorizes platforms to suspend abusive users who repeatedly post illegal content or file frivolous complaints

  • Article 24: Transparency reporting obligations for providers of online platforms

    Article 24 of DSA Chapter III Section 3 - requires platforms to report user numbers, dispute statistics, and content moderation decisions to enable VLOP designation and public accountability

  • Article 25: Online interface design and organisation

    Article 25 of DSA Chapter III Section 3 - THE anti-dark patterns provision prohibiting deceptive or manipulative interface design that impairs users' free and informed decisions

  • Article 26: Advertising on online platforms

    Article 26 of DSA Chapter III Section 3 - requires platforms to clearly label ads, identify advertisers and payers, explain targeting parameters, and prohibits profiling using sensitive personal data

  • Article 27: Recommender system transparency

    Article 27 of DSA Chapter III Section 3 - requires platforms to disclose recommendation algorithm parameters, explain their significance, and provide users with alternative non-personalized options

  • Article 28: Online protection of minors

    Article 28 of DSA Chapter III Section 3 - requires platforms accessible to minors to implement appropriate privacy, safety, and security measures, and prohibits targeted advertising to minors

  • Article 29: Exclusion for micro and small enterprises

    Article 29 of DSA Chapter III Section 4 - exempts micro and small marketplace platforms from trader verification obligations unless designated as VLOPs

  • Article 30: Traceability of traders

    Article 30 of DSA Chapter III Section 4 - requires marketplaces to verify trader identities and information before allowing sales, the 'Know Your Business Customer' provision

  • Article 31: Compliance by design

    Article 31 of DSA Chapter III Section 4 - requires marketplaces to design systems helping traders identify illegal products and comply with EU product safety and consumer protection laws

  • Article 32: Right to information

    Article 32 of DSA Chapter III Section 4 - requires marketplaces that become aware of an illegal product or service to inform consumers who purchased it of the illegality, the trader's identity, and relevant means of redress

  • Article 33: Very large online platforms and very large online search engines

    Article 33 of DSA Chapter III Section 5 - defines VLOPs/VLOSEs as services with an average of 45 million or more monthly active recipients in the EU, triggering the heightened obligations in Articles 34-43

  • Article 34: Risk assessment

    Article 34 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to conduct annual systemic risk assessments analyzing impacts on illegal content, fundamental rights, elections, public health, and minors

  • Article 35: Mitigation of risks

    Article 35 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to implement reasonable, proportionate, effective mitigation measures addressing systemic risks identified in Article 34 assessments

  • Article 36: Crisis response mechanism

    Article 36 of DSA Chapter III Section 5 - empowers the Commission, on a recommendation from the Board, to require VLOPs/VLOSEs to take specific actions during crises posing a serious threat to public security or public health

  • Article 37: Independent audit

    Article 37 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to undergo annual independent audits of compliance with their Chapter III obligations and with commitments under codes of conduct and crisis protocols, with audit reports transmitted to the Digital Services Coordinator of establishment and the Commission

  • Article 38: Recommender systems

    Article 38 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs that use recommender systems to provide at least one option for each such system that is not based on profiling

  • Article 39: Additional online advertising transparency

    Article 39 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to maintain publicly accessible, searchable repositories of the advertisements presented on their interfaces, with metadata retained for one year after the last display

  • Article 40: Data access and scrutiny

    Article 40 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to provide data access to the Digital Services Coordinator of establishment, the Commission, and vetted researchers studying systemic risks in the Union

  • Article 41: Compliance function

    Article 41 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to establish an independent compliance function, with one or more compliance officers reporting directly to the provider's management body

  • Article 42: Transparency reporting obligations

    Article 42 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to publish their transparency reports every six months and to make public reports on their risk assessments, mitigation measures, and audits

  • Article 43: Supervisory fee

    Article 43 of DSA Chapter III Section 5 - establishes an annual supervisory fee, charged by the Commission to VLOPs/VLOSEs to cover EU-level supervision costs, calculated in proportion to platform size (average monthly active recipients) and capped at 0.05% of annual worldwide net income; the calculation methodology has faced legal challenges
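
    The fee mechanics of Article 43 reduce to simple arithmetic: each designated service bears a share of the Commission's estimated supervision costs proportional to its average monthly active recipients, with the result capped at 0.05% of the provider's worldwide annual net income. The sketch below illustrates only that proportionality-plus-cap rule; it is not the Commission's actual methodology (set out in a delegated act), and all figures are hypothetical.

    ```python
    def supervisory_fee(total_costs: float,
                        platform_recipients: float,
                        all_recipients: float,
                        worldwide_net_income: float) -> float:
        """Illustrative Article 43 fee: a share of estimated supervision
        costs proportional to average monthly active recipients, capped
        at 0.05% of worldwide annual net income. Hypothetical sketch,
        not the Commission's delegated-act methodology."""
        proportional_share = total_costs * (platform_recipients / all_recipients)
        cap = 0.0005 * worldwide_net_income  # 0.05% cap
        return min(proportional_share, cap)

    # Hypothetical figures: EUR 45m total costs; a platform with 100m of
    # 500m total designated recipients; EUR 20bn worldwide net income.
    fee = supervisory_fee(45e6, 100e6, 500e6, 20e9)
    # proportional share 9.0m < cap 10.0m, so the fee is the share
    ```

    Note how the cap, not the proportional share, binds for a very large platform with modest income: the same recipient share with EUR 2bn net income would yield a EUR 1m fee.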

  • Article 44: Standards

    Article 44 of DSA Chapter III Section 6 - requires the Commission to support the development and implementation of voluntary European and international standards covering areas such as electronic notice submission, user communications, APIs, auditing, and advertisement repository interoperability

  • Article 45: Codes of conduct

    Article 45 of DSA Chapter III Section 6 - encourages the Commission and the Board to facilitate voluntary codes of conduct as flexible co-regulatory mechanisms; participation is formally voluntary but can become de facto binding for VLOPs through their risk mitigation obligations

  • Article 46: Codes of conduct for online advertising

    Article 46 of DSA Chapter III Section 6 - requires the Commission to encourage and facilitate voluntary codes of conduct on transparency across the online advertising value chain, to be developed by 18 February 2025 and applied by 18 August 2025

  • Article 47: Codes of conduct for accessibility

    Article 47 of DSA Chapter III Section 6 - requires the Commission to encourage voluntary codes of conduct promoting accessibility for persons with disabilities on online platforms, with the same 18 February 2025 development and 18 August 2025 application deadlines; criticized for making accessibility voluntary rather than mandatory

  • Article 48: Crisis protocols

    Article 48 of DSA Chapter III Section 6 - enables the Board to recommend that the Commission initiate voluntary crisis protocols for extraordinary circumstances affecting public security or public health, allowing coordinated platform responses during emergencies while raising concerns about restrictions on freedom of expression and access to information