Article 28

Online protection of minors

1. Providers of online platforms accessible to minors shall take appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service.

2. Providers of online platforms shall not present advertisements on their online interfaces based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.

3. The obligations set out in this Article shall not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor.

4. The Commission may issue guidelines to assist providers of online platforms in complying with the obligations set out in paragraphs 1 and 2.

Understanding This Article

Article 28 establishes special protection for minors using online platforms, recognizing children's particular vulnerabilities and developmental needs. However, it does so carefully to avoid requiring platforms to collect extensive age data, balancing child protection with privacy.

Paragraph 1 requires 'appropriate and proportionate measures' - deliberately flexible phrasing that acknowledges that different platforms pose different risks. TikTok (popular with teens) needs more robust protections than LinkedIn (a professional network). Gaming platforms need measures addressing in-game interactions; video platforms need measures governing viewing recommendations.

'High level of privacy, safety and security' encompasses three dimensions:

  • Privacy - protecting minors' personal data and limiting data collection
  • Safety - protecting minors from harmful content, predators and exploitation
  • Security - protecting accounts from takeover and protecting minors from malware and scams

Platforms must address all three holistically.

Paragraph 2 prohibits targeted advertising to minors based on profiling. While adults can be shown ads based on behavioral targeting and analysis of their personal data, minors cannot. If TikTok knows with reasonable certainty that a user is 14 years old, it cannot show that user ads based on their interests, behavior or profile - only contextual ads (ads matched to the content currently being viewed, not to a user profile).

'Aware with reasonable certainty' is the key qualifier - platforms are not strictly liable for every minor who slips through. But if a platform knows a user is a minor (the user stated their age, or passed age verification), or should reasonably know (the user behaves or presents as a child), the prohibition applies.
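The combined effect of paragraphs 2 and 3 can be sketched as a simple decision rule. This is an illustrative sketch only - the names (`User`, `is_known_minor`, `select_ad_basis`), the signals and the two-way contextual/profiled split are hypothetical simplifications, not anything prescribed by the Regulation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    """Only data the platform ALREADY holds - under Article 28(3) it
    need not collect extra data just to establish a user's age."""
    declared_age: Optional[int] = None  # age stated at signup, if any
    flagged_as_minor: bool = False      # e.g. user presents/behaves as a child

def is_known_minor(user: User) -> bool:
    """'Aware with reasonable certainty' that the user is a minor."""
    if user.declared_age is not None and user.declared_age < 18:
        return True
    return user.flagged_as_minor

def select_ad_basis(user: User) -> str:
    """Article 28(2): no profiling-based ads for known minors.
    Contextual ads (matched to the content being viewed) remain allowed."""
    if is_known_minor(user):
        return "contextual"  # based on the current content only
    return "profiled"        # behavioral targeting permitted for adults

# A 14-year-old who stated their age gets contextual ads only;
# where age is simply unknown, the prohibition cannot bite user-by-user.
print(select_ad_basis(User(declared_age=14)))      # contextual
print(select_ad_basis(User(declared_age=35)))      # profiled
print(select_ad_basis(User()))                     # profiled
```

Note that the unknown-age case falls back to profiled ads only because paragraph 3 relieves the platform of collecting new age data; the paragraph 1 duty of general safety measures still applies regardless.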

Paragraph 3 is the data minimization safeguard. Article 28 does not require platforms to implement mandatory age verification or to collect identity documents. Platforms should not gather more data about users just to comply with Article 28. If a platform does not currently verify ages, it need not start doing so - but if existing data tells it that certain users are minors, the protections must apply to them.

This creates a nuanced compliance picture: platforms with existing age data must protect the minors they have identified; platforms without age data may continue without invasive verification, but must still implement general safety measures for likely minor users wherever the service is accessible to children.

Key Points

  • Platforms accessible to minors must implement appropriate privacy, safety, security measures
  • Measures must be proportionate to platform type and risks
  • Prohibits targeted advertising to minors based on profiling of their personal data
  • Platforms need not collect additional data to determine whether a user is a minor
  • Balances child protection with data minimization principles
  • Commission may issue implementation guidelines

Practical Application

For TikTok (Youth-Focused Platform): TikTok knows many of its users are minors. It must implement: (1) Age-appropriate default privacy settings (private accounts for under-16s); (2) Restricted DM functionality for minors; (3) Proactive content filtering to keep sexual or violent content from reaching minors; (4) Robust reporting tools for predatory behavior; (5) For identified minors (users who stated an age under 18), no targeted ads - only contextual ads based on the video being watched, not the user's profile or history.

For Instagram: Instagram has age data (users enter a birthdate at signup). For users under 18: (1) Default private accounts; (2) Adults cannot message minors unless the minor follows them; (3) Minors cannot be tagged by adults they don't follow; (4) Sensitive content filter set to most restrictive; (5) For identified minors, show only contextual ads (e.g., a fashion ad alongside fashion content, not ads based on the user's shopping history or interests).
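The age-based defaults described for TikTok and Instagram above can be sketched as a settings profile keyed to a user's declared age. The field names and the 16/18 thresholds mirror the examples in this commentary, not any figures fixed by the Regulation, so treat this purely as an illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyDefaults:
    private_account: bool
    dms_from_strangers: bool
    sensitive_content_filter: str  # "standard" or "most_restrictive"
    profiled_ads: bool

def defaults_for_age(declared_age: int) -> PrivacyDefaults:
    """Age-appropriate defaults in the spirit of Article 28(1) and (2).
    Thresholds are illustrative: 16 for private-by-default, 18 for ads."""
    if declared_age < 18:
        return PrivacyDefaults(
            private_account=declared_age < 16,  # private by default for under-16s
            dms_from_strangers=False,           # strangers cannot message minors
            sensitive_content_filter="most_restrictive",
            profiled_ads=False,                 # Article 28(2) prohibition
        )
    return PrivacyDefaults(
        private_account=False,
        dms_from_strangers=True,
        sensitive_content_filter="standard",
        profiled_ads=True,
    )
```

A frozen dataclass is used deliberately: defaults should be derived from age at account setup, not silently mutated later by the ad system.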

For YouTube: YouTube has age-restricted content system and knows some users are minors (under-13s require parental consent via Family Link). For minors: (1) Age-restricted content blocked; (2) Comments disabled on minor-created content; (3) Autoplay to age-appropriate content only; (4) No targeted ads based on watch history for identified minors - only contextual ads matching current video topic.

For Gaming Platforms: Platforms like Roblox with substantial minor user bases must: (1) Robust chat filtering and moderation; (2) Restrictions on in-game purchases for minors; (3) Parent/guardian controls and oversight tools; (4) Safety education integrated into platform; (5) No behavioral ad targeting for minors - promotional content based on game genre only, not player behavior patterns.

For Platforms Without Age Data: A platform that doesn't collect ages must still implement the general protections of paragraph 1 if it is accessible to minors. If a platform hosts content likely to attract children (games, cartoons, teen topics), it should implement age-appropriate safety measures even without verifying each user's age: content moderation, reporting tools, privacy-protective defaults. But without knowing that specific users are minors, the targeted-ad prohibition cannot be applied user by user.

For Adult-Only Platforms: Platforms that verify users are 18+ and exclude minors through age gates can be deemed not 'accessible to minors' under Article 28(1), potentially reducing their obligations. However, the age verification must be effective - self-declaration ('Are you 18?') is likely insufficient. More robust verification (ID checks, age-estimation technology) is needed to confidently exclude minors.

For Commission Guidelines: Per paragraph 4, Commission might issue guidance: 'Appropriate measures for social platforms include: default private accounts for minors, restricted messaging, proactive content filtering, safety education. For gaming: chat moderation, spending limits, reporting tools. Proportionate means larger platforms and higher-risk services need more comprehensive measures.'