
The European Commission has launched a formal investigation into Snapchat under the Digital Services Act (DSA), signaling a heightened regulatory focus on how social media platforms protect underage users. This probe underscores the shift from reactive content moderation to proactive systemic risk management, carrying significant financial and reputational implications for global tech enterprises.
On March 25, 2026, the European Commission opened formal proceedings to determine whether Snapchat has breached the DSA with respect to the protection of minors. The investigation centers on five critical areas: inadequate age assurance, the grooming and recruitment of minors for criminal activities, unsafe default account settings, the dissemination of illegal or age-restricted products (such as drugs and vapes), and non-transparent reporting mechanisms for illegal content.
Regulators suspect that Snapchat’s reliance on user self-declaration for age verification is insufficient to prevent children under 13 from accessing the platform or to ensure an age-appropriate experience for teenagers. Furthermore, the Commission is concerned that the platform’s design—specifically features like "Find Friends" and default push notifications—may expose minors to harmful contact and exploitation. This action follows preliminary information gathered from various national authorities, highlighting a coordinated European effort to enforce high safety standards across the digital landscape.
This investigation marks a pivotal moment in the enforcement of the DSA, moving beyond general content concerns to specific, systemic design flaws that impact child safety. For international enterprises and public sector organizations, the probe highlights that "compliance" now requires more than just updated terms of service; it demands "privacy by design" and robust, verifiable age-assurance technologies. The Commission has made it clear that relying on "self-declaration" is no longer a viable defense under Article 28 of the DSA.
The potential repercussions for Snapchat are severe, with the DSA allowing for fines of up to 6% of global annual turnover if a platform is found to be in breach. This case also signals that the EU is prepared to use its "upstream" regulatory powers to force platforms to re-engineer their core functionalities—such as recommendation algorithms and default privacy settings—to mitigate risks before harm occurs. Organizations operating in the EU must now treat child safety and age verification as high-priority compliance risks, subject to intense regulatory scrutiny and significant legal exposure.
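To make the financial exposure concrete, the 6%-of-turnover cap can be sketched as a simple calculation. The revenue figure below is purely hypothetical and for illustration only; it is not Snapchat's actual turnover.

```python
# Illustrative only: the DSA caps fines at 6% of a company's total
# worldwide annual turnover. The revenue figure used below is hypothetical.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(global_annual_turnover: float) -> float:
    """Upper bound on a DSA fine for a given worldwide annual turnover."""
    return global_annual_turnover * DSA_MAX_FINE_RATE

# A hypothetical platform with $5 billion in global annual turnover
# faces a maximum fine of $300 million:
print(f"${max_dsa_fine(5_000_000_000):,.0f}")
```

Even for a mid-sized platform, the ceiling runs into the hundreds of millions, which is why regulators describe the DSA's fining powers as dissuasive by design.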
The EU Commission’s investigation into Snapchat serves as a stark warning to all digital platforms: reactive compliance is dead. Under the DSA, regulators now expect proactive, systemic risk management, especially where vulnerable users are concerned.
For organizations, this signals a mandatory shift toward true Privacy by Design. It is no longer just about updating your terms of service; it is about examining how your core features, default settings, and algorithms fundamentally operate.
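One way to think about "privacy by design" for default settings is as age-dependent configuration applied at account creation, not as an opt-in buried in a menu. The sketch below is a hypothetical model, not Snapchat's actual settings schema; all field names and the age-18 threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class AccountDefaults:
    """Hypothetical default-settings model for a social platform account."""
    discoverable_by_strangers: bool
    push_notifications_enabled: bool
    location_sharing_enabled: bool
    public_profile: bool

def defaults_for(age: int) -> AccountDefaults:
    """Apply stricter, safety-first defaults for minors (privacy by design).

    The under-18 threshold is an illustrative assumption; real obligations
    depend on the applicable jurisdiction and risk assessment.
    """
    is_minor = age < 18
    return AccountDefaults(
        discoverable_by_strangers=not is_minor,
        push_notifications_enabled=not is_minor,
        location_sharing_enabled=False,  # off for everyone by default
        public_profile=not is_minor,
    )

# A minor's account starts locked down; an adult's does not.
assert defaults_for(15).discoverable_by_strangers is False
assert defaults_for(30).public_profile is True
```

The design point is that safety is the starting state for minors, and any loosening of those defaults is an explicit, auditable choice rather than the other way around.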
Review your organization’s privacy program to ensure it meets "privacy by design" standards, and use this Privacy Program Assessment Checklist to learn how to strengthen your data privacy, security, and governance posture with best practices and technology.