
Instagram to start parent alerts for teen suicide, self-harm searches

Mark Zuckerberg, CEO of Meta Platforms Inc., leaves the Los Angeles Superior Court on Wednesday, February 18, 2026 in Los Angeles, California, United States.

Kyle Grillot | Bloomberg | Getty Images

Instagram said Thursday that it will warn parents when teens repeatedly search for terms related to suicide and self-harm, as its parent company Meta faces scrutiny in multiple trials.

“These alerts are designed to let parents know if teens are repeatedly trying to search for this content and provide them with the resources they need to support their teens,” the company said in a statement.

The parental control feature comes as the social media company faces allegations that the design and functionality of apps such as Instagram have detrimental effects on the mental health of young users.

Experts have described these cases, along with related lawsuits involving companies such as Google's YouTube, TikTok and Snap, as the social media industry's "big tobacco" moment, as courts weigh the alleged harms of their products and alleged efforts to mislead the public about those adverse effects.

Instagram alerts will be available next week in the US, UK, Australia and Canada.

The company said in a blog post that parents will receive alerts if teens repeatedly search for “phrases that encourage suicide or self-harm, phrases that suggest a teen wants to harm themselves, and terms like ‘suicide’ or ‘self-harm'” over a “short period of time.”

The company called the feature "the right starting point" as it works out the right threshold for when to send an alert. Meta said parents may receive alerts that do not indicate a real cause for concern, but that it will continue to listen to feedback on the feature.

Alerts will be delivered to parents via email, text, WhatsApp or Instagram.

The alert feature requires both parents and teens to sign up for Instagram’s parental control tools.

Parents who receive the alerts will see a message explaining their teen’s Instagram search habits and have the option to view additional resources for help, the company said.

Meta said it plans to eventually issue similar parental warnings “for certain AI experiences” aimed at notifying parents “if a teenager attempts to engage in certain types of conversations with our AI about suicide or self-harm.”

The planned AI-related parental warnings come amid growing concern that AI chatbots from various tech companies, such as OpenAI and Meta, are engaging in questionable and potentially harmful mental health-related conversations with users.

Meta offers its own AI chatbots and is working on a powerful new AI model codenamed Avocado that will be released later this year, CNBC reported.

Meta CEO Mark Zuckerberg testified in Los Angeles Superior Court last week as part of a hearing in which a plaintiff alleged he became addicted to social media apps like Instagram when he was underage.

During his testimony, Zuckerberg reiterated his view that mobile operating system and app store owners such as Apple and Google are better positioned to verify users' ages than app developers like Meta.

On the age verification front, the Federal Trade Commission said Wednesday that it would not bring enforcement actions related to the Children's Online Privacy Protection Rule, or COPPA Rule, against "certain website and online service operators" who collect user data that could be used to inform age verification technologies.

The FTC said the policy statement is part of a broader review of the COPPA Rule as it relates to age verification.

Legal filings released last week as part of a separate Meta-related lawsuit in New Mexico detail internal messages from employees discussing how the company’s encryption efforts could make it more difficult for reports of child sexual abuse material to be disclosed to authorities.

Meta has denied the allegations in both the California and New Mexico cases.

Last week, CNBC reported that the National Parent Teachers Association had not renewed its funding relationship with Meta due to several legal challenges the company has faced regarding digital safety for children.

If you are having suicidal thoughts or are in distress, call the 988 Suicide and Crisis Lifeline for support and assistance from a trained counselor.

WATCH: Worst outcome in Meta LA trial is structural changes to its apps: Big Technology's Kantrowitz
