Article 86

Representation

Understanding This Article

Article 86 enables collective representation, addressing the practical reality that individual users often lack the resources, technical expertise, legal knowledge, or sustained motivation to enforce DSA rights effectively against sophisticated, well-resourced platforms. Individual users face asymmetric power dynamics: platforms employ extensive legal teams, technical experts, and compliance professionals, while individual users typically lack comparable expertise and resources. Individual complaints may be ignored or dismissed with perfunctory responses; users may not recognize when platform conduct violates DSA requirements; and users may fear retaliation or account termination if they assert their rights too aggressively. Article 86 addresses these challenges by enabling qualified representative bodies—consumer protection organizations like BEUC (the European Consumer Organisation), digital rights groups like EDRi (European Digital Rights), national consumer associations, and industry-specific advocacy organizations—to exercise DSA rights on behalf of multiple users, aggregating individual grievances into systematic compliance complaints that carry greater weight and regulatory impact. Three qualifying criteria ensure representative legitimacy: (1) not-for-profit status prevents commercial motivations—representatives serve the public interest rather than private profit; (2) proper constitution under Member State law demonstrates organizational stability, legal accountability, and regulatory oversight; and (3) statutory objectives that include a legitimate interest in DSA compliance ensure mission alignment—organizations must have such compliance as a core purpose documented in their founding documents.
Platforms must prioritize representative complaints in recognition of their systematic importance: a representative submission potentially affects thousands or millions of users beyond any individual complainant, suggests systematic compliance failures requiring comprehensive remediation, comes from an organization with the expertise to identify and document complex violations, and carries reputational and regulatory escalation risk if ignored. Priority processing means expedited review, substantive assessment by senior compliance personnel rather than frontline support staff, a comprehensive response addressing systemic issues, and serious consideration of remediation proposals.

Key Points

  • Service recipients may mandate qualified bodies to exercise DSA rights on their behalf
  • Qualifying criteria: (a) operates not-for-profit, (b) properly constituted under Member State law, (c) statutory objectives include legitimate interest in DSA compliance
  • Platforms must implement priority processing for representative entity complaints without undue delay
  • Preserves Directive 2020/1828 representative action rights enabling court-based collective redress
  • Enables collective enforcement through consumer protection and civil society organizations
  • Addresses individual users' resource, expertise, and motivation constraints in exercising DSA rights
  • Representative bodies can aggregate individual grievances into systematic compliance complaints
  • Not-for-profit requirement prevents commercially motivated representation
  • Proper constitution requirement demonstrates organizational stability and legitimacy
  • Statutory objectives requirement ensures mission alignment with regulatory compliance
  • Priority processing recognizes systematic importance of representative complaints potentially affecting many users
  • Organizations like BEUC, EDRi, national consumer associations qualify as representative bodies
  • Complements individual complaint mechanisms with collective representation pathways
  • Representative complaints often better documented and more comprehensive than individual submissions
  • Platforms are incentivized to address systematic issues identified by representative bodies rather than resolving complaints one by one
  • Creates accountability pathway through organized civil society oversight

Practical Application

Civil Society Systematic Enforcement: The European Consumer Organisation (BEUC) receives 500+ individual complaints from users across 15 EU Member States alleging that YouTube (a designated VLOP) employs dark patterns in violation of Article 25. Complainants report: default settings that automatically enable personalized advertising without clear consent mechanisms; interface design that uses visual manipulation to steer users toward privacy-invasive options; complex multi-step processes required to disable data collection, while enabling it takes a single click; and misleading button labeling suggesting that privacy-protective options are unavailable or inferior. Individual users attempted to complain directly to YouTube but received perfunctory automated responses or no response at all, and their complaints lack detailed technical documentation and legal analysis. Rather than each user filing a separate complaint (resource-intensive and easily ignored), BEUC exercises its Article 86 collective representation rights. BEUC compiles a comprehensive systematic complaint documenting: the 500+ individual user reports, organized by dark pattern category and jurisdiction; technical analysis by UX designers and behavioral psychologists explaining how the interface manipulations exploit cognitive biases; legal analysis demonstrating violations of Article 25's prohibitions on deceiving or nudging users toward choices detrimental to their interests; comparative analysis showing that YouTube's practices diverge from other platforms' more transparent consent mechanisms; and a quantitative impact assessment estimating that millions of EU users are affected. BEUC files the complaint with YouTube and with the Irish DSC (YouTube's establishment coordinator). Under Article 86(2), YouTube must process BEUC's complaint with priority and without undue delay.
YouTube's response: assign the complaint to senior Legal Affairs and Trust & Safety executives (not frontline support); conduct a comprehensive internal review of consent flows and interface design; engage BEUC representatives in substantive discussions about the concerns and potential remediations; respond within 30 days (versus the 60-90 day timeframes typical for individual complaints) with a detailed explanation of the investigation findings and proposed remedial measures; implement interface modifications addressing the identified dark patterns; and report remediation completion to BEUC and the Irish DSC. The Irish DSC treats BEUC's complaint seriously given the organization's expertise and representative capacity: it initiates its own investigation, references BEUC's technical and legal analyses, confirms the dark pattern violations, and issues an Article 52 enforcement decision requiring YouTube to maintain the remediated interfaces and pay administrative fines. The Commission takes note of BEUC's complaint when considering a broader Article 35 investigation of YouTube's systemic risk mitigation. Results: broad remediation benefiting all EU YouTube users, not just the 500 original complainants; systematic compliance improvement rather than individualized complaint resolution; and validation of Article 86 collective representation as an effective enforcement pathway. The scenario demonstrates the mechanism's value: individual users with limited resources leveraged BEUC's expertise and organizational capacity; BEUC aggregated individual complaints into systematic documentation carrying regulatory weight; YouTube was incentivized to address systemic issues comprehensively; and regulators benefited from civil society investigation and analysis.