After a 14-year-old Instagram user’s suicide, Meta apologizes for (some of) the self-harm and suicide content she saw

By Sophie Mellor
September 27, 2022, 11:11 AM ET
Meta’s head of health and well-being, Elizabeth Lagone, arrives at Barnet Coroner’s Court, north London. Beresford Hodge—PA Images/Getty Images

A senior executive at Meta apologized on Monday for allowing a British teenager to view graphic Instagram posts related to self-harm and suicide before she took her own life.

Meta’s head of health and well-being, Elizabeth Lagone, told an inquest at North London Coroner’s Court into the circumstances surrounding the death of Molly Russell that the teenager had “viewed some content that violated our policies and we regret that.”

Russell was a 14-year-old girl from Harrow, London, who killed herself in November 2017 after viewing a large amount of content relating to anxiety, depression, and self-harm on Meta’s Instagram and on Pinterest. In the last six months of her life, Russell had saved 16,300 images on her Instagram account, 2,100 of which were linked to depression and suicide.

Lagone told the court, “We are sorry that Molly saw content that violated our policies, and we don’t want that on the platform,” but stopped short of condemning all the controversial content on Russell’s Instagram.

Lagone argued it is not “a binary question” whether the self-harm and depression-related material viewed by Russell—which was found to comply with Meta’s policies—was safe for children to see, telling the court that “some people might find solace” in knowing they were not alone.

The senior coroner in the inquest, Andrew Walker, interrupted the proceedings to ask Lagone, “So you are saying yes, it is safe...?” Lagone replied, “Yes, it’s safe.”

Nuanced and complicated content

After going through a large number of posts relating to suicide and self-harm that were saved, liked, and shared by Molly in the final six months of her life, Lagone argued that most were “by and large admissive,” because many of the people posting were recounting their own experiences with mental health struggles and potentially making a cry for help.

Lagone argued Instagram had heard “overwhelmingly from experts” that the company should “not seek to remove [certain content linked to depression and self-harm] because of the further stigma and shame it can cause people who are struggling,” noting the content was “nuanced” and “complicated.”

The Russell family’s barrister, Oliver Sanders, grew heated in response to Lagone’s answers, asking, “Why on earth are you doing this?... You’ve created a platform that’s allowing people to put potentially harmful content on it [and] you’re inviting children on to the platform. You don’t know where the balance of risk lies.”

“You have no right to. You are not their parent. You are just a business in America,” Sanders argued.

At the time of Russell’s death, Instagram allowed graphic posts referencing suicide and self-harm on its platform, which the company claimed created a space for people to seek help and support. In 2019, it U-turned on this policy and banned all graphic images of self-harm, noting at the time that “collectively it was advised [by mental health experts] that graphic images of self-harm—even when it is someone admitting their struggles—has the potential to unintentionally promote self-harm.”

Social media platforms have so far operated in a regulatory wild west that has prioritized fast growth, engagement, and time spent viewing content over safety features. But momentum is now building for greater oversight of how giant tech companies’ algorithms spread content that may be harmful to the users who engage with it.

One of the most notable whistleblowers in the space, Frances Haugen, leaked a huge trove of internal documents, which became known as the Facebook Papers and showed that the social media giant Meta—then still called Facebook—disregarded user safety in the pursuit of profit.

The company’s internal research found that Instagram was especially harmful to a large portion of young users and was having a profoundly negative impact on teenage girls.

Ian Russell, the father of Molly Russell, told the inquest last week that he believes algorithms on the social media platforms pushed his daughter towards graphic and disturbing posts and contributed to her death.

A Meta spokesperson told Coins2Day that between April and June 2022, the company took action on 98.4% of suicide or self-harm content identified on Instagram before it was reported—up from 93.8% two years earlier.

“We’ve never allowed content that promotes or glorifies suicide and self-harm. Since 2019 alone, we’ve updated our policies, deployed new technology to remove more violating content, shown more expert resources when someone searches for, or posts, content related to suicide or self-harm, and introduced controls designed to limit the types of content teens see,” the spokesperson said.

Update, Sept. 27, 2022: This article was updated after publication with a statement from a Meta spokesperson.

