Chapter 2: Liability of Providers of Intermediary Services
1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service, on condition that the provider:
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in such a way that it could lead consumers to believe that the information, or the product, service or activity that is the object of the transaction, was provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Understanding This Article
Article 6 establishes the liability framework for hosting services - the broadest and most important category of intermediary services. Hosting includes everything from basic web hosting and cloud storage to complex platforms like social media, video sharing sites, and online marketplaces.
The core principle is conditional exemption: hosting providers aren't liable for user-uploaded content UNLESS they have actual knowledge of illegality. 'Actual knowledge' means clear, specific awareness - general knowledge that 'some content might be illegal' isn't sufficient. Once providers gain actual knowledge (typically through valid notices, court orders, or own-initiative discovery), they must 'act expeditiously' to remove or disable access. What counts as 'expeditious' depends on circumstances - CSAM requires immediate removal; complex copyright disputes might allow brief review periods.
Paragraph 2 addresses situations where users act under provider authority or control. If a platform employs content creators, commissions specific content, or exercises editorial control, it loses exemption protection for that content - it's effectively the publisher, not merely a hosting service.
Paragraph 3 creates special marketplace liability. When platforms present products so that consumers believe the platform itself sells them (like Amazon mixing its own products with third-party sellers), the platform can be liable under consumer protection law even without actual knowledge. This prevents platforms from profiting as apparent sellers while disclaiming responsibility.
Key Points
Hosting providers not liable for stored content unless they have actual knowledge
Must act expeditiously to remove illegal content once aware
Exemption lost if content provider acts under hosting provider's authority
Special marketplace liability for misleading presentations
Applies to cloud storage, web hosting, social media platforms
Knowledge requirement protects passive hosting without editorial control
Practical Application
For Cloud Storage: Dropbox, Google Drive, and OneDrive store user files but typically don't review content. They're not liable for illegal files until notified. However, if they scan for and detect illegal content (like CSAM using hash matching), they have actual knowledge and must act.
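The hash-matching step mentioned above can be sketched as follows. This is a minimal illustration, not any provider's actual pipeline: the hash set and function names are hypothetical, and real CSAM-detection systems use perceptual hashes (such as PhotoDNA) that survive re-encoding, whereas plain SHA-256 shown here only matches byte-identical files.

```python
import hashlib

# Hypothetical database of hashes of files already confirmed illegal.
# Illustrative only: production systems use perceptual hashing, not SHA-256.
KNOWN_ILLEGAL_HASHES = {
    hashlib.sha256(b"known-illegal-sample").hexdigest(),
}

def matches_known_illegal(data: bytes) -> bool:
    """Return True if the upload matches a known-illegal hash.

    A match gives the provider 'actual knowledge' under Article 6(1),
    so the content must be removed or blocked expeditiously.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_ILLEGAL_HASHES

def handle_upload(data: bytes) -> str:
    if matches_known_illegal(data):
        return "blocked"   # actual knowledge -> must act expeditiously
    return "stored"        # no knowledge -> liability exemption applies
```

The key legal point the sketch captures: before the scan matches, the provider stores without liability; the moment it matches, the exemption is conditional on expeditious action.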
For Social Media: Facebook, Twitter, and Instagram host billions of user posts. They're not liable for each post, but must remove illegal content once properly notified. Where automated content moderation gives the platform actual knowledge (e.g., a match against known terrorist propaganda), the removal obligation is triggered just as it would be by a valid notice.
For Video Platforms: YouTube hosts user videos. When copyright holders send valid takedown notices (under the DSA's notice-and-action mechanism in the EU, or the DMCA in the US), YouTube gains actual knowledge and must expeditiously remove infringing videos. Where YouTube's Content ID system itself gives it actual knowledge of an infringement, the same removal obligation applies.
For Marketplaces: If Amazon presents third-party products without clearly identifying the seller, or if the presentation suggests Amazon sells the item, Amazon could be liable under consumer protection law even without knowledge of product defects. Clear seller identification and disclosure of third-party status protects against this liability.
For Web Hosting: Companies like GoDaddy and Bluehost hosting customer websites aren't liable for website content. However, if validly notified that a hosted site distributes malware or illegal content, they must act expeditiously - typically by suspending the site or requiring the customer to remove offending content.
Expeditious Action Examples: For CSAM, immediate removal (minutes to hours) is required. For complex trademark disputes, 24-48 hours for initial review may be acceptable. For court orders, compliance within the time specified by the court is required. Platforms should document response procedures and timelines.
Loss of Protection: If a hosting provider reviews and approves content before publication, selects content for promotion, or pays users to create specific content, it exercises editorial control and loses hosting exemption for that content.