Frequently Asked Questions (FAQ)
Common questions about the Digital Services Act, platform obligations, and user rights.
General Questions
What is the Digital Services Act (DSA)?
The Digital Services Act (Regulation (EU) 2022/2065) is a comprehensive EU regulation that establishes harmonized rules for intermediary services and online platforms operating in the European Union. Adopted on October 19, 2022, and fully applicable since February 17, 2024, it creates a safer online environment by:
- Establishing liability rules for intermediary services
- Setting due diligence obligations for platforms
- Requiring transparency in content moderation and advertising
- Imposing enhanced obligations on very large platforms
- Protecting users' rights and fundamental freedoms online
Who does the DSA apply to?
The DSA applies to all intermediary services offered to recipients in the EU, regardless of where the provider is established:
- Mere Conduit Services: ISPs, network providers
- Caching Services: Services that store copies of data temporarily
- Hosting Services: Cloud storage, web hosting
- Online Platforms: Social media, marketplaces, app stores, content-sharing platforms
- Online Search Engines: Services that index and retrieve web content
The regulation creates a tiered system with more obligations for larger and more impactful services.
When did the DSA come into effect?
The DSA has staggered application dates:
- November 16, 2022: Entry into force
- August 25, 2023: Application for Very Large Online Platforms and Search Engines (4 months after first designations)
- February 17, 2024: Full application for all other providers
How is the DSA different from the GDPR?
While both are EU regulations, they have different focuses:
- DSA: Regulates intermediary services and online platforms, focusing on content moderation, illegal content, platform obligations, and transparency
- GDPR: Regulates personal data processing and privacy protection
The two regulations are complementary. DSA obligations must be implemented in compliance with GDPR requirements where personal data is involved.
Platform Obligations
What are the main obligations for all intermediary services?
All intermediary services must:
- Designate points of contact for authorities and users
- Designate a legal representative in the EU (if not established in EU)
- Publish clear and accessible terms and conditions
- Publish annual transparency reports on content moderation activities
- Comply with orders from authorities to act against illegal content
What additional obligations apply to hosting services?
Hosting services must additionally:
- Provide notice and action mechanisms for reporting illegal content
- Provide statements of reasons when removing content or suspending accounts
- Notify law enforcement of suspected criminal offences involving a threat to a person's life or safety (e.g., child abuse, terrorist threats)
What additional obligations apply to online platforms?
Online platforms have numerous additional obligations including:
- Internal complaint-handling systems for users
- Engagement with out-of-court dispute settlement bodies
- Priority treatment for trusted flagger notices
- Measures against misuse (repeat offenders)
- Transparency about advertising (who paid, why you see it)
- Transparency about recommender systems
- Protection of minors from profiling-based advertising
- Prohibition of dark patterns
Marketplaces have additional trader verification ("Know Your Business Customer") obligations.
What are Very Large Online Platforms (VLOPs) and what extra obligations do they have?
VLOPs and Very Large Online Search Engines (VLOSEs) are platforms or search engines with 45 million or more average monthly active users in the EU, roughly 10% of the EU population. They face enhanced obligations:
- Annual systemic risk assessments (illegal content, fundamental rights, election integrity, etc.)
- Implementation of risk mitigation measures
- Crisis response mechanisms for public emergencies
- Independent annual compliance audits
- Provide users with non-profiling recommendation options
- Maintain public advertisement repositories
- Provide vetted researchers with access to platform data
- Appoint independent compliance officers
- Pay supervisory fees
- Accept direct supervision by the European Commission
Illegal Content & Content Moderation
What is illegal content under the DSA?
Illegal content is any information that, in itself or in relation to an activity it refers to, does not comply with Union law or with the law of a Member State. This includes but is not limited to:
- Child sexual abuse material
- Terrorist content
- Illegal hate speech
- Non-consensual sharing of intimate images
- Intellectual property infringements
- Counterfeit goods and trademark violations
- Non-compliant or unsafe products
- Defamation and libel
- Consumer protection law violations
- Other content prohibited by applicable law
Note: The DSA doesn't regulate content that is harmful but not illegal. However, platforms may remove such content under their terms of service.
How do I report illegal content?
Hosting services must provide easy-to-access notice and action mechanisms. A valid notice should include:
- Clear explanation of why you believe the content is illegal
- Exact location of the content (URL or other identifier)
- Your name and contact details
- Statement confirming good faith belief
- Supporting evidence or documentation (if applicable)
Platforms must process your notice and inform you of their decision.
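The elements of a valid notice listed above can be thought of as a simple record with a completeness check. The sketch below is purely illustrative: the field and class names are hypothetical and do not reflect any platform's actual reporting API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record mirroring the elements of a valid notice described
# above; field names are illustrative, not any platform's real schema.
@dataclass
class IllegalContentNotice:
    explanation: str            # why the notifier believes the content is illegal
    content_location: str       # exact URL or other identifier
    notifier_name: str
    notifier_contact: str
    good_faith_confirmed: bool  # statement confirming good-faith belief
    evidence: Optional[str] = None  # supporting documentation, if applicable

    def is_complete(self) -> bool:
        """True when all core elements are present and good faith is confirmed."""
        return all([self.explanation, self.content_location,
                    self.notifier_name, self.notifier_contact,
                    self.good_faith_confirmed])

notice = IllegalContentNotice(
    explanation="The listing offers counterfeit trademarked goods",
    content_location="https://example.com/listing/123",
    notifier_name="Jane Doe",
    notifier_contact="jane@example.com",
    good_faith_confirmed=True,
)
print(notice.is_complete())  # True
```

A notice missing any of the core fields (for instance, an empty explanation) would fail the check, which is why well-designed reporting forms mark those fields as required.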
What happens when my content is removed?
When a platform removes content, suspends your account, or otherwise restricts your content, they must provide a statement of reasons including:
- Whether the decision was based on illegal content or terms violation
- Facts and circumstances relied upon
- Information about available redress mechanisms
- Whether automated tools were used in the decision
You have the right to challenge this decision through internal complaints and out-of-court dispute settlement.
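The required elements of a statement of reasons can likewise be expressed as a small checklist. The keys below are hypothetical labels for the elements listed above, not the schema of any actual moderation system or database.

```python
# Hypothetical checklist for the statement-of-reasons elements described
# above; the key names are illustrative only.
REQUIRED_ELEMENTS = {
    "ground",                   # "illegal_content" or "terms_violation"
    "facts_and_circumstances",  # what the decision relied upon
    "redress_options",          # internal complaint, dispute settlement, courts
    "automated_tools_used",     # whether automation contributed to the decision
}

def missing_elements(statement: dict) -> set:
    """Return the required elements absent from a statement of reasons."""
    return REQUIRED_ELEMENTS - statement.keys()

statement = {
    "ground": "terms_violation",
    "facts_and_circumstances": "Post reported as spam by multiple users",
    "automated_tools_used": True,
}
print(missing_elements(statement))  # {'redress_options'}
```

In this example the statement omits the redress information, so a user reading it would not learn how to challenge the decision, which is exactly what the requirement is meant to prevent.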
Can I challenge a platform's moderation decision?
Yes. You have multiple options:
- Internal Complaints: File a complaint through the platform's internal system (platforms must respond within reasonable time)
- Out-of-Court Dispute Settlement: Engage a certified dispute settlement body if internal complaints don't resolve the issue
- Complaint to Authority: Lodge a complaint with your national Digital Services Coordinator
- Court Action: File legal proceedings in accordance with national law
What are trusted flaggers?
Trusted flaggers are entities (not individuals) awarded special status by Digital Services Coordinators based on demonstrated:
- Particular expertise and competence
- Independence and accountability
- Diligence and accuracy in flagging
Their notices receive priority processing by platforms. They must publish regular reports on their activities.
Transparency & User Rights
What transparency requirements apply to advertising?
Online platforms must ensure that advertisements are:
- Clearly marked as advertising
- Labeled with who paid for the ad
- Accompanied by information about why you're seeing that specific ad (targeting criteria)
Additionally:
- Platforms cannot show targeted ads based on sensitive data (race, politics, religion, health, sexual orientation)
- Platforms cannot show profiling-based ads to users they are aware are minors
- VLOPs must maintain public repositories of all advertisements
What are recommender systems and how do they affect me?
Recommender systems are algorithms that determine what content you see and in what order. Under the DSA:
- Platforms must clearly explain the main parameters of their recommender systems
- You must be informed about options to modify or influence recommendations
- VLOPs must provide at least one non-profiling option (not based on tracking your behavior)
- This non-profiling option must be equally easy to access

What are dark patterns and why are they prohibited?
Dark patterns are interface designs that manipulate or deceive users into making decisions they wouldn't otherwise make. Examples include:
- Making cancellation much harder than sign-up
- Hiding or obscuring choices
- Pre-selecting options that benefit the platform
- Using urgent language or visual tricks to push decisions
- Repeatedly asking for the same decision after refusal
The DSA prohibits these practices to protect user autonomy and informed choice.
Enforcement & Penalties
Who enforces the DSA?
DSA enforcement involves multiple authorities:
- Digital Services Coordinators: One per Member State, primary enforcement authority for most providers
- European Commission: Directly supervises and enforces against VLOPs and VLOSEs
- European Board for Digital Services: Ensures consistent application across the EU
- Other Competent Authorities: May be designated for specific tasks by Member States
What are the penalties for non-compliance?
Penalties depend on the type of provider and violation:
- VLOPs and VLOSEs: Fines of up to 6% of global annual turnover
- Periodic Penalty Payments: Up to 5% of average daily turnover for ongoing non-compliance
- Other Providers: Member States determine penalties, which must be effective, proportionate, and dissuasive
Factors affecting fine amounts include severity, duration, intentional character, cooperation with authorities, and previous infringements.
How can I lodge a complaint about a platform?
You can lodge a complaint with your national Digital Services Coordinator if you believe a platform is violating DSA obligations. Contact details for all coordinators are available on the European Commission website and in our Resources section.
Additional Resources
Where can I find more information?
The Resources section of this website collects official materials, and the European Commission's DSA pages provide the full regulation text, guidance documents, and contact details for national Digital Services Coordinators.
Is this legal advice?
No. This website provides educational information about the DSA. It does not constitute legal advice. For specific compliance questions or legal guidance about your obligations or rights under the DSA, please consult qualified legal professionals or digital services compliance experts.