Systemic Risk Assessments

Requirements for very large platforms and search engines to assess and mitigate systemic risks arising from their services.

Systemic Risk Assessment and Mitigation

Providers of very large online platforms (VLOPs) and very large online search engines (VLOSEs) must identify, analyze, and mitigate systemic risks their services may pose.

Categories of Systemic Risks

Providers must assess risks relating to:

  • Illegal Content Dissemination: Amplification and spread of illegal content
  • Fundamental Rights: Adverse effects on freedom of expression, privacy, non-discrimination, rights of the child
  • Electoral Processes: Manipulation or interference with democratic processes
  • Civic Discourse: Negative effects on civic discourse, including polarization and disinformation campaigns
  • Gender-Based Violence: Facilitation or amplification of violence against women
  • Protection of Minors: Negative effects on minors' physical and mental well-being and development
  • Public Health and Security: Threats to public health, safety, or security
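
The category list above can be modeled as a simple completeness check for an assessment exercise. This is a hypothetical sketch, not tooling mandated by the rules: the enum names, the 1–5 severity/probability scale, and the `risk_score` matrix are all illustrative assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class RiskCategory(Enum):
    """Systemic risk categories a provider must assess (from the list above)."""
    ILLEGAL_CONTENT = auto()
    FUNDAMENTAL_RIGHTS = auto()
    ELECTORAL_PROCESSES = auto()
    CIVIC_DISCOURSE = auto()
    GENDER_BASED_VIOLENCE = auto()
    PROTECTION_OF_MINORS = auto()
    PUBLIC_HEALTH_AND_SECURITY = auto()


@dataclass
class RiskAssessmentEntry:
    """One assessed risk: category, hypothetical 1-5 scores, evidence notes."""
    category: RiskCategory
    severity: int          # 1 (low) .. 5 (critical) -- scale is an assumption
    probability: int       # same hypothetical 1..5 scale
    evidence: list[str] = field(default_factory=list)

    @property
    def risk_score(self) -> int:
        # A common severity-by-probability matrix; not prescribed by the rules above.
        return self.severity * self.probability


def uncovered_categories(entries: list[RiskAssessmentEntry]) -> set[RiskCategory]:
    """Return the categories the assessment has not yet addressed."""
    covered = {entry.category for entry in entries}
    return set(RiskCategory) - covered
```

A check like `uncovered_categories` makes it easy to verify that every mandated category appears in the assessment before it is finalized.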

Assessment Requirements

Risk assessments must be conducted:

  • At least annually
  • Before deploying new functionalities with significant impact
  • Based on all available evidence
  • Taking into account how design, algorithms, and terms and conditions may contribute to risks
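
The two timing triggers above ("at least annually" and "before deploying new functionalities with significant impact") can be expressed as a due-date check. This is an illustrative sketch under an assumed 365-day interval; the function name and signature are not drawn from the rules themselves.

```python
from datetime import date, timedelta

# "At least annually" interpreted as a 365-day interval -- an assumption.
ANNUAL_INTERVAL = timedelta(days=365)


def assessment_due(last_assessment: date,
                   today: date,
                   deploying_significant_feature: bool) -> bool:
    """True if a new systemic risk assessment is required.

    Triggered either by the annual cadence or by a planned deployment of
    functionality with significant impact (which must be assessed beforehand).
    """
    if deploying_significant_feature:
        return True  # assess *before* deployment, regardless of cadence
    return today - last_assessment >= ANNUAL_INTERVAL
```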

Risk Mitigation Measures

Platforms must implement reasonable, proportionate, and effective mitigation measures such as:

  • Adapting content moderation or recommender systems
  • Adapting terms and conditions
  • Adapting decision-making processes
  • Deploying technical solutions
  • Increasing resources for content moderation
  • Conducting targeted communication campaigns

Oversight

Risk assessments and mitigation plans are subject to independent auditing and review by the European Commission.