Instagram to Notify Parents When Teens Search for Harmful Content

Technology|26/2/2026

  • Instagram to Alert Parents if Teens Search for Self-Harm Content
  • The move aims to protect children amid rising government pressure worldwide.

Instagram announced that, starting next week, it will begin alerting parents if their teenagers repeatedly search for content related to self-harm.

The company said in a statement that the alerts will initially be sent to users enrolled in the "Teen Accounts" trial in the UK, the US, Australia, and Canada, with plans to expand the service to the rest of the world later.

It added: "These alerts are designed to strengthen our ongoing efforts to protect teens from content that could harm them on Instagram," and emphasized its strict policies against content that promotes or glorifies self-harm.

The platform already blocks such searches and directs users to support resources, and parents can add an extra layer of monitoring with their teen's consent.

The move follows growing international pressure: Australia has banned social media for under-16s, and other countries, including the UK, Spain, Greece, and Slovenia, are considering similar restrictions.