Content Moderation Under the DSA
Content moderation refers to the practices, actions, and processes that providers use to identify and address illegal content or content that violates their terms and conditions.
Statement of Reasons
When providers of hosting services remove or disable access to content, suspend accounts, or otherwise restrict information provided by a user, they must give the affected user a clear and specific statement of reasons (Article 17 DSA) including:
- The ground relied on: illegal content or incompatibility with the terms and conditions, with a reference to the legal provision or contractual clause concerned
- Facts and circumstances relied on
- Information about redress mechanisms available
- Where applicable, information about the use of automated means in detecting the content or taking the decision
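The required elements above can be pictured as a simple machine-readable record. The sketch below is purely illustrative: the class and field names are assumptions for this example, not the schema of any official system such as the Commission's DSA Transparency Database.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Ground(Enum):
    """The two grounds Article 17 distinguishes (illustrative labels)."""
    ILLEGAL_CONTENT = "illegal_content"
    TERMS_VIOLATION = "terms_violation"

@dataclass
class StatementOfReasons:
    """Illustrative record of the elements a statement of reasons must carry."""
    ground: Ground                         # illegal content vs. terms violation
    facts_and_circumstances: str           # facts and circumstances relied on
    redress_options: list[str]             # available redress mechanisms
    automated_detection: bool = False      # content detected by automated tools?
    automated_decision: bool = False       # decision itself taken by automated means?
    legal_reference: Optional[str] = None  # provision of law or contractual clause relied on
```

Separating `automated_detection` from `automated_decision` mirrors the fact that a notice may be machine-flagged but human-decided, or vice versa.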
Internal Complaint System
Online platforms must provide an internal complaint-handling system allowing users to contest moderation decisions for at least six months after the decision (Article 20 DSA). The system must:
- Be easily accessible and user-friendly
- Enable electronic submission of complaints
- Process complaints in a timely, non-discriminatory manner
- Ensure decisions are not taken solely by automated means, with review by appropriately qualified staff
- Provide decisions within a reasonable timeframe
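One concrete consequence of the six-month rule is a simple admissibility check at complaint intake. The sketch below assumes a 180-day window as a stand-in for "at least six months"; the names and the window length are illustrative, not prescribed by the DSA.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Complaint:
    """Illustrative complaint submitted against a moderation decision."""
    complaint_id: str
    decision_id: str
    submitted_at: datetime
    text: str

def is_admissible(complaint: Complaint, decision_date: datetime,
                  window_days: int = 180) -> bool:
    """Accept complaints lodged within the window following the decision.

    180 days is an illustrative stand-in for the DSA's 'at least six months'.
    """
    return complaint.submitted_at <= decision_date + timedelta(days=window_days)
```

A real intake pipeline would also route admissible complaints to qualified staff, since decisions may not rest solely on automated means.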
Out-of-Court Dispute Settlement
Users can also refer disputes with platforms about content moderation decisions to out-of-court dispute settlement bodies certified by a Digital Services Coordinator (Article 21 DSA).
Safeguards Against Misuse
Platforms may suspend, for a reasonable period and after issuing a prior warning, users who frequently provide manifestly illegal content, and may likewise suspend the processing of notices and complaints from those who frequently submit manifestly unfounded ones (Article 23 DSA).
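The warn-then-suspend sequence above can be sketched as a small escalation policy. The threshold and return labels below are policy choices a platform would set itself; the DSA prescribes the prior warning and proportionality, not specific numbers.

```python
from dataclasses import dataclass

@dataclass
class MisuseTracker:
    """Illustrative per-user escalation state for manifestly unfounded notices."""
    unfounded_count: int = 0
    warned: bool = False

    def record_unfounded(self, threshold: int = 3) -> str:
        """Record one manifestly unfounded notice and return the next action.

        The threshold of 3 is an assumed policy value, not a DSA requirement.
        """
        self.unfounded_count += 1
        if self.unfounded_count < threshold:
            return "none"
        if not self.warned:
            self.warned = True
            return "warning"           # prior warning must precede suspension
        return "temporary_suspension"  # time-limited, as Article 23 envisages
```

Keeping the warning as an explicit state transition makes it easy to audit that no suspension was ever issued without one.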