Investigation Opened After Finding Children at Risk on Snapchat

Technology|28/3/2026

  • European Commission launches investigation into Snapchat
  • Concerns grow over child safety and age-restricted products

The European Commission has announced a full-scale investigation into Snapchat, owned by the American tech company Snap, following warnings that the platform may not be doing enough to prevent child exploitation and the sale of illegal products.

The probe is being conducted under the Digital Services Act, which requires major online platforms to take stricter measures against harmful or illegal content, with potential fines of up to 6% of their global annual revenue for non-compliance.

Henna Virkkunen, the EU’s Executive Vice-President for Tech Sovereignty, Security and Democracy, said in a statement: “From luring children and exposing them to illegal products to account settings that reduce the safety of minors, it appears that Snapchat has neglected the high safety standards required under the Digital Services Act for all users.”

Snapchat responded by stating that it continuously reviews and strengthens its safety measures. A spokesperson said: “We have fully cooperated with the Commission so far—engaging proactively and transparently, acting in good faith to meet the high safety standards mandated under the Digital Services Act—and we will continue to do so throughout the investigation.”

The Commission also highlighted that Snapchat’s current content management tools are insufficient to prevent the spread of information directing users toward illegal products, such as controlled substances, or age-restricted items like e-cigarettes and alcohol.

The Commission’s investigation absorbs a case opened by Dutch authorities last September concerning the sale of e-cigarettes to minors on Snapchat.

Regulators also flagged the platform’s self-declared age verification tool, which they deem inadequate, as well as default account settings and reporting mechanisms whose design may mislead users.