Chapter 3 | Additional Obligations for Very Large Platforms
1. The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2. Where significant systemic risk within the meaning of Article 34(1) emerges and concerns several very large online platforms or very large online search engines, the Commission may invite the providers of those services, as well as civil society organisations and other relevant parties to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
3. The Commission and the Board shall aim to ensure that codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and the Board on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
4. The Commission and the Board shall assess whether codes of conduct meet the aims specified in paragraphs 1, 2 and 3 and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. The Commission and the Board shall encourage regular review of the codes and, where appropriate, may invite the signatories to amend the codes in the light of those conclusions.
Understanding This Article
Article 45 establishes codes of conduct as a key co-regulatory mechanism in DSA implementation, bridging voluntary industry self-regulation and binding legal obligations. While formally voluntary, codes create de facto binding obligations for the largest platforms through multiple accountability mechanisms, representing a sophisticated regulatory approach that balances flexibility, industry expertise, and regulatory authority.
The Commission and the Board's role (paragraph 1) is facilitative: they 'shall encourage and facilitate' the drawing up of codes, but participation remains voluntary for providers. Whereas Articles 46 and 47 direct codes toward specific identified areas (online advertising and accessibility), Article 45's general mandate allows discretionary code development for emerging issues not specifically identified in the DSA but where industry coordination would benefit compliance. Facilitation includes: convening stakeholders (platforms, civil society, experts, Member States), providing secretariat support for code development processes, offering technical expertise and analysis, publishing codes and monitoring frameworks, and promoting code adoption through guidance and public pressure.
Codes address the 'proper application of this Regulation' - any DSA compliance area where industry-developed standards, best practices, or procedures would facilitate implementation. This broad scope enables codes covering: content moderation methodologies (balancing illegal content removal with free expression), systemic risk assessment approaches (identifying and quantifying risks), mitigation measure effectiveness standards (what constitutes adequate risk mitigation?), crisis response procedures (coordinating during emergencies), transparency reporting formats (standardizing data collection and reporting), advertising practices (enhancing transparency beyond minimum requirements), child safety measures (protecting minors from harm), and accessibility implementation (serving users with disabilities). Codes complement the law by providing detailed implementation guidance where legislation sets principles but leaves methodology to platforms.
The 'voluntary' characterization masks significant practical binding force created through multiple mechanisms. First, Article 35(1)(h) on systemic risk mitigation explicitly identifies compliance with relevant Article 45 codes as among the ways VLOPs/VLOSEs may comply with risk mitigation obligations. While phrased permissively ('may'), platforms struggling to demonstrate adequate mitigation face pressure: if a code exists establishing industry-standard approaches, a platform that does not participate or comply risks being found inadequate. Second, Recital 104 warns that refusing a Commission invitation to participate in a code 'could be taken into account' when determining DSA infringement. This creates reputational and regulatory pressure: refusal signals non-cooperation, potentially influencing enforcement decisions. Third, Article 37 audit requirements: auditors must assess compliance with code commitments where platforms participate. Code participation triggers audit scrutiny, creating accountability for commitments. Fourth, public and civil society pressure: codes negotiated with civil society input create public expectations; platforms failing to meet code commitments face criticism, campaigns, and media scrutiny even if the code is legally voluntary. Fifth, competitive dynamics: if major platforms participate in a code setting industry standards, non-participants risk appearing less responsible, losing competitive positioning on trust and safety.
This multi-layered accountability creates a de facto binding nature while preserving the legal fiction of voluntariness. Platforms can in theory refuse to participate or fail to comply, but the practical consequences make this difficult for VLOPs facing intense scrutiny. The regulatory model resembles 'enforced self-regulation': industry develops standards under regulatory facilitation and supervision, with regulatory consequences for non-participation or non-compliance creating enforcement pressure without formal legal obligation. This balances industry expertise (codes leverage platform knowledge of technical implementation) with accountability (regulatory oversight and consequences prevent purely cosmetic self-regulation).
Codes offer regulatory advantages over binding legislation: Speed - codes can be developed and updated faster than legislative processes, responding to emerging issues rapidly; Flexibility - codes can be tailored to different platform types, sizes, and risk profiles; Detail - codes can provide technical specificity impractical in legislation; Evolution - codes can be revised as technology and risks evolve, without legislative amendments; Legitimacy - multi-stakeholder development incorporating civil society, experts, and industry creates broader buy-in than top-down regulation; Expertise - codes leverage industry technical knowledge and operational experience. However, codes carry risks: Capture - industry dominance in code development could lead to weak, industry-friendly standards; Opacity - code negotiation processes may lack transparency compared to legislative procedures; Enforceability - the voluntary nature complicates enforcement if participants backslide; Fragmentation - multiple codes across jurisdictions could create compliance complexity; Lowest common denominator - the need for consensus may produce weak standards that avoid controversial requirements.
Article 45 codes complement Articles 46-47's specific codes (online advertising, accessibility) and Article 48's crisis protocols, creating comprehensive co-regulatory framework addressing both specific identified areas and emerging issues requiring flexibility.
Key Points
Commission and Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level
Codes address proper DSA application across various areas including content moderation, systemic risk mitigation, crisis response
Formally voluntary, but Article 35(1)(h) points to compliance with relevant codes as a means of risk mitigation for VLOPs
Recital 104 warns refusal to participate in Commission-invited codes may be considered when determining DSA infringements
Codes enable flexible, adaptive approaches to emerging challenges faster than legislative updates
Major codes include Code of Practice on Disinformation (integrated July 2025) and Code of Conduct on Hate Speech
Code participants include platforms, advertisers, civil society, fact-checkers, and researchers
Article 37 audits must assess compliance with code commitments where platforms participate
Bridges self-regulation and binding law, enabling industry expertise while maintaining regulatory oversight
Allows rapid response to emerging issues (AI-generated content, new platform types, novel risks) through code updates
Practical Application
For Commission and Board (Facilitating Code Development): The Commission identifies code needs through regulatory experience: recurring compliance challenges, emerging risks lacking clear mitigation approaches, and areas where industry coordination would benefit. The process: (1) Stakeholder convening: the Commission invites participants including platforms (VLOPs and smaller platforms), civil society organizations, subject matter experts, relevant industry associations, and Member State representatives as observers. Broad participation ensures diverse perspectives, guarding against industry capture. (2) Facilitation: the Commission provides secretariat support (organizing meetings, drafting documents, facilitating discussions), technical expertise (research on effective practices, international comparisons, impact analysis), and regulatory guidance (clarifying how the code relates to DSA obligations and what commitments might satisfy requirements). (3) Negotiation and drafting: stakeholders negotiate code content through an iterative process: identifying specific commitments (concrete, measurable actions participants will take), defining metrics and measurement methodologies, establishing transparency and reporting requirements, creating accountability mechanisms (audits, public reporting, dispute resolution), and specifying participation conditions and withdrawal procedures. (4) Adoption and publication: once consensus is reached, participants formally sign the code, the Commission publishes the code and supporting documentation, and the Commission and Board promote adoption, encouraging broader participation. (5) Monitoring and evolution: establish a monitoring framework tracking code implementation, collect annual reporting from participants, facilitate revisions addressing implementation challenges or evolving circumstances, and assess code effectiveness by evaluating whether objectives are achieved.
Example: development of the Code of Practice on Disinformation involved an extensive multi-stakeholder process over several years, producing detailed commitments on demonetizing disinformation, transparency in political advertising, empowering users through content labeling, empowering the fact-checking community through partnerships, data access for researchers, and reducing manipulative behaviors through account authenticity measures. Commission facilitation enabled agreement despite competing interests, and ongoing monitoring holds participants accountable.
For Platforms (Strategic Code Participation): Platforms must decide whether to participate in codes and how to implement commitments. Strategic considerations: (1) Participation decision: benefits of participation include regulatory alignment (demonstrating compliance effort that satisfies Article 35 risk mitigation expectations), competitive positioning (participating platforms are seen as more responsible), influence on standards (participants shape code content affecting the entire industry), and regulatory goodwill (cooperation builds a positive relationship with the Commission and DSCs). Risks include binding commitments (codes create specific obligations), audit scrutiny (compliance is assessed in Article 37 audits), and reputational damage in the event of non-compliance (public failure to meet commitments). Generally, VLOPs should participate in relevant codes given the de facto pressure and benefits, while smaller platforms may selectively participate where relevant to their services. (2) Implementation: participating platforms must operationalize commitments by assigning responsibility (designating teams responsible for each commitment), allocating resources (engineering, policy, moderation, measurement), implementing measures (building systems, training moderators, establishing procedures), measuring compliance (tracking metrics, documenting actions, preparing reports), and reporting (publishing transparency reports, participating in monitoring frameworks). (3) Continuous improvement: codes typically require continuous improvement, not just minimal compliance. Platforms should benchmark against peers, adopt evolving best practices, and invest in innovation addressing code objectives. (4) Stakeholder engagement: code implementation often involves external stakeholders (fact-checkers, civil society, researchers). Platforms should maintain partnerships, incorporate feedback, and demonstrate responsiveness to stakeholder concerns. (5) Documentation for audits: Article 37 auditors will assess code compliance.
Platforms should maintain comprehensive documentation: commitment-by-commitment implementation records, metrics and measurement methodologies, evidence of actions taken, explanations of any shortfalls or delays, and improvement plans addressing identified gaps. Example: Meta's implementation of the Disinformation Code requires demonetization policies blocking ad revenue to disinformation publishers, advertising transparency through the Ad Library, fact-checking partnerships in multiple languages, researcher data access programs, label systems for manipulated media, reduced distribution of borderline content, and reporting on enforcement actions and metrics - each requiring substantial operational changes, ongoing investment, and continuous monitoring and reporting.
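As an illustration only, the commitment-by-commitment documentation described above lends itself to a structured record per commitment. The sketch below is hypothetical: the `CommitmentRecord` class, its field names, and the sample values are invented for illustration - neither the DSA nor Article 37 prescribes any data format.

```python
from dataclasses import dataclass, field

@dataclass
class CommitmentRecord:
    """Hypothetical per-commitment audit record (illustrative only)."""
    commitment_id: str
    description: str
    metrics: dict = field(default_factory=dict)    # KPI name -> measured value
    evidence: list = field(default_factory=list)   # actions taken, reports, policy docs
    shortfalls: str = ""                           # explanation of gaps or delays
    improvement_plan: str = ""                     # plan addressing identified gaps

    def audit_summary(self) -> str:
        # Flag a gap whenever a shortfall explanation is recorded.
        status = "gap identified" if self.shortfalls else "on track"
        return f"{self.commitment_id}: {status}; {len(self.evidence)} evidence items"

# Example record for a single (invented) demonetization commitment.
record = CommitmentRecord(
    commitment_id="demonetisation-1",
    description="Block ad revenue to repeat disinformation publishers",
    metrics={"demonetised_accounts": 1240},
    evidence=["policy doc v3", "Q2 enforcement report"],
)
print(record.audit_summary())  # prints: demonetisation-1: on track; 2 evidence items
```

Keeping each commitment as one record like this makes it straightforward to generate the commitment-by-commitment summaries, metrics, and shortfall explanations that auditors would review.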
For Civil Society (Leveraging Codes for Accountability): Civil society organizations participate in code development and use codes for ongoing platform accountability. (1) Code development participation: civil society should actively engage in code negotiations, bringing user perspectives, demanding strong commitments that address real harms, resisting industry attempts to weaken provisions, ensuring meaningful transparency and accountability mechanisms, and securing a civil society role in monitoring. Effective participation requires technical expertise, sustained engagement, and coordination among organizations. (2) Monitoring compliance: once codes are adopted, civil society monitors platform implementation by analyzing platforms' transparency reports, conducting independent assessments of commitments, documenting failures to implement or shortfalls, raising concerns with the Commission and Board, publishing scorecards or evaluations, and campaigning in the media on non-compliance. This creates public accountability pressure complementing regulatory oversight. (3) Providing feedback: civil society should provide constructive feedback on code implementation, identifying effective practices worth expanding, implementation gaps requiring attention, needed revisions reflecting experience, and emerging issues requiring new commitments. This feedback informs code evolution, ensuring continued relevance. (4) Research and analysis: academic and advocacy researchers should study code effectiveness, examining whether commitments are being implemented, whether implementation is achieving intended outcomes (reducing disinformation spread, protecting minors, etc.), unintended consequences or gaps, and comparative effectiveness across platforms. Research builds the evidence base for code assessment and revision.
Example: civil society organizations monitoring the Disinformation Code documented implementation gaps (limited fact-checking coverage in smaller languages, weak enforcement of manipulative-behavior policies), provided feedback that improved subsequent code iterations, and conducted research demonstrating the code's limited effectiveness in some areas while recognizing progress in others.