Instagram to warn parents when teens search for suicide terms
SAN FRANCISCO, California - Instagram will begin notifying parents when their teenage children repeatedly search for content related to suicide or self-harm, platform owner Meta announced Thursday, as the company faces mounting legal pressure over its handling of young users.
The alerts, rolling out in the coming weeks in the United States, Britain, Australia and Canada, will be triggered when a teen makes multiple searches for such terms within a short period of time.
The alerts will be expanded to other regions later in 2026.
Parents using Instagram's parental supervision tools will receive notifications via email, text or WhatsApp, as well as through the app itself, along with expert resources to help them navigate potentially difficult conversations with their children.
Instagram already blocks searches for terms associated with suicide and self-harm, directing users instead to help lines and support organizations. The new alerts are designed to flag cases where teens persistently attempt such searches despite those restrictions.
Meta said it consulted with its Suicide and Self-Harm Advisory Group in setting the threshold for alerts, adding that it had deliberately erred on the side of caution even if that meant some notifications might be sent without genuine cause for concern.
The announcement came as the company faces mounting legal pressure over the use of its platforms by young people.
Meta CEO Mark Zuckerberg testified this month at a landmark trial in California over accusations that his company and others deliberately caused addiction in minors, the first time such a case has reached a jury.
Meta is also contending with a sweeping global push to restrict children's access to social media: Australia banned under-16s from platforms in December, and countries including France, Denmark, Spain and the UK are racing to introduce similar measures. — Agence France-Presse