- Article 11 of DSA Chapter III Section 1 - requires providers to designate points of contact for authorities
- Article 12 of DSA Chapter III Section 1 - requires providers to enable users to contact them directly
- Article 13 of DSA Chapter III Section 1 - requires non-EU providers to designate legal representatives in the Union
- Article 14 of DSA Chapter III Section 1 - requires providers to have clear, accessible terms and conditions including content moderation policies
- Article 15 of DSA Chapter III Section 1 - requires all intermediary services to publish annual transparency reports
- Article 16 of DSA Chapter III Section 2 - THE CRITICAL provision establishing standardized mechanisms for users to report illegal content to hosting providers
- Article 17 of DSA Chapter III Section 2 - requires hosting providers to explain content moderation decisions to affected users
- Article 18 of DSA Chapter III Section 2 - requires hosting providers to notify law enforcement when they suspect serious criminal offences threatening life or safety
- Article 19 of DSA Chapter III Section 3 - exempts micro and small platforms from most platform-specific obligations to reduce compliance burden
- Article 20 of DSA Chapter III Section 3 - THE CRITICAL provision requiring platforms to provide appeals mechanisms for content moderation decisions
- Article 21 of DSA Chapter III Section 3 - establishes independent, certified bodies to resolve disputes between users and platforms when internal appeals fail
- Article 22 of DSA Chapter III Section 3 - creates status for expert entities whose illegal content reports receive priority processing by platforms
- Article 23 of DSA Chapter III Section 3 - requires platforms, after prior warning, to suspend abusive users who repeatedly post manifestly illegal content or file frivolous complaints
- Article 24 of DSA Chapter III Section 3 - requires platforms to report user numbers, dispute statistics, and content moderation decisions to enable VLOP designation and public accountability
- Article 25 of DSA Chapter III Section 3 - THE anti-dark patterns provision prohibiting deceptive or manipulative interface design that impairs users' free and informed decisions
- Article 26 of DSA Chapter III Section 3 - requires platforms to clearly label ads, identify advertisers and payers, explain targeting parameters, and prohibits profiling using sensitive personal data
- Article 27 of DSA Chapter III Section 3 - requires platforms to disclose recommendation algorithm parameters, explain their significance, and provide users with alternative non-personalized options
- Article 28 of DSA Chapter III Section 3 - requires platforms accessible to minors to implement appropriate privacy, safety, and security measures, and prohibits targeted advertising to minors
- Article 29 of DSA Chapter III Section 4 - exempts micro and small marketplace platforms from trader verification obligations unless designated as VLOPs
- Article 30 of DSA Chapter III Section 4 - requires marketplaces to verify trader identities and information before allowing sales, the 'Know Your Business Customer' provision
- Article 31 of DSA Chapter III Section 4 - requires marketplaces to design systems helping traders identify illegal products and comply with EU product safety and consumer protection laws
- Article 32 of DSA Chapter III Section 4 - the 'right to information' provision requiring marketplaces that become aware of an illegal product or service to inform consumers who purchased it, identifying the trader and relevant means of redress
- Article 33 of DSA Chapter III Section 5 - defines VLOPs/VLOSEs as platforms with 45+ million average monthly EU users, triggering heightened obligations in Articles 34-43
- Article 34 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to conduct annual systemic risk assessments analyzing impacts on illegal content, fundamental rights, elections, public health, and minors
- Article 35 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to implement reasonable, proportionate, effective mitigation measures addressing systemic risks identified in Article 34 assessments
- Article 36 of DSA Chapter III Section 5 - establishes the crisis response mechanism allowing the Commission, on the Board's recommendation, to require VLOPs/VLOSEs to assess and address serious threats to public security or public health in the Union
- Article 37 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to undergo annual independent audits, at their own expense, verifying compliance with Chapter III obligations and with commitments under codes of conduct and crisis protocols, with audit and implementation reports transmitted to the provider's Digital Services Coordinator and the Commission
- Article 38 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs using recommender systems to provide at least one option for each system that is not based on profiling
- Article 39 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to maintain publicly accessible, searchable advertising repositories containing the advertisements displayed on their platforms, with metadata retained until one year after an advertisement was last presented, enabling unprecedented public scrutiny of online advertising ecosystems
- Article 40 of DSA Chapter III Section 5 - establishes the framework for VLOPs/VLOSEs to provide data access to Digital Services Coordinators, the Commission, and vetted researchers, enabling independent scrutiny of algorithmic systems, content moderation practices, and systemic risks that were previously opaque, transforming platform oversight from trust-based to evidence-based
- Article 41 of DSA Chapter III Section 5 - requires VLOPs/VLOSEs to establish an independent compliance function, with one or more compliance officers monitoring compliance with the Regulation and reporting directly to the management body
- Article 42 of DSA Chapter III Section 5 - imposes enhanced transparency reporting obligations on VLOPs/VLOSEs, including reports at least every six months on content moderation and publication of average monthly active recipient numbers
- Article 43 of DSA Chapter III Section 5 - establishes annual supervisory fees charged by the Commission to VLOPs/VLOSEs to cover EU-level supervision costs, calculated proportionate to platform size (average monthly active recipients) with a cap at 0.05% of annual worldwide net income, creating sustainable funding for regulatory oversight while facing legal challenges regarding calculation methodology
- Article 44 of DSA Chapter III Section 6 - requires the Commission to support development and implementation of voluntary European and international standards covering key DSA compliance areas including electronic submissions, user communications, APIs, auditing, and advertisement repository interoperability, facilitating standardized technical implementation
- Article 45 of DSA Chapter III Section 6 - encourages the Commission and Board to facilitate voluntary codes of conduct addressing DSA compliance areas, creating flexible co-regulatory mechanisms that allow industry self-regulation within the legal framework, though with de facto binding effects for VLOPs through risk mitigation obligations
- Article 46 of DSA Chapter III Section 6 - requires the Commission to encourage and facilitate voluntary codes of conduct addressing advertising transparency across the online advertising value chain, with target dates of February 18, 2025 for development and August 18, 2025 for application
- Article 47 of DSA Chapter III Section 6 - requires the Commission to encourage voluntary codes of conduct promoting accessibility for persons with disabilities on online platforms, with the same February 18, 2025 development and August 18, 2025 application target dates, though criticized for making accessibility voluntary rather than mandatory
- Article 48 of DSA Chapter III Section 6 - enables the Board to recommend voluntary crisis protocols for extraordinary circumstances affecting public security or public health, allowing coordinated platform responses during emergencies while raising concerns about potential restrictions on freedom of expression and information access
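The Article 43 supervisory-fee cap summarized above can be made concrete with a short calculation. The sketch below is illustrative only: the actual methodology is fixed by a Commission delegated act, and the simple proportional-allocation rule and the function name are assumptions, not the legal formula. Only the 0.05% ceiling on annual worldwide net income comes from the Regulation itself.

```python
# Illustrative sketch of the Article 43 supervisory-fee cap (DSA, Regulation (EU) 2022/2065).
# ASSUMPTION: costs are allocated purely in proportion to average monthly active
# recipients (MAU); the real methodology is set out in a Commission delegated act.

def supervisory_fee(platform_mau: int, total_mau: int,
                    total_supervision_cost: float,
                    worldwide_net_income: float) -> float:
    """Allocate the EU-level supervision cost by MAU share, capped at
    0.05% of the provider's annual worldwide net income (the explicit
    ceiling in Article 43)."""
    proportional_share = total_supervision_cost * platform_mau / total_mau
    cap = 0.05 / 100 * worldwide_net_income   # 0.05% ceiling from Article 43
    return min(proportional_share, cap)

# Hypothetical numbers: a platform with 90M of 300M supervised MAU,
# EUR 45M total supervision cost, EUR 10bn worldwide net income.
fee = supervisory_fee(90_000_000, 300_000_000, 45_000_000, 10_000_000_000)
print(fee)  # the 13.5M proportional share is capped at 5M (0.05% of 10bn)
```

The cap is what the legal challenges mentioned under Article 43 turn on: a provider with a large user base but low (or negative) net income pays far less than its proportional share would suggest.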