THE HEARTBROKEN dad of a teen who killed herself after looking at vile social media posts has warned that other kids are still being targeted.
Molly Russell, 14, viewed thousands of disturbing posts in the months leading up to her death in 2017.
Her dad Ian Russell is now campaigning for better internet safety after Molly was able to view suicide and self-harm content online.
He says social media companies are still pushing "harmful content to literally millions of young people" and that "little has changed" since Molly took her life.
Mr Russell said: "This week, when we should be celebrating Molly's 21st birthday, it's saddening to see the horrifying scale of online harm and how little has changed on social media platforms since Molly's death.
"The longer tech companies fail to address the preventable harm they cause, the more inexcusable it becomes.
"Six years after Molly died, this must now be seen as a fundamental systemic failure that will continue to cost young lives.
"Just as Molly was overwhelmed by the volume of the dangerous content that bombarded her, we've found evidence of algorithms pushing out harmful content to literally millions of young people.
"This must stop. It is increasingly hard to see the actions of tech companies as anything other than a conscious commercial decision to allow harmful content to achieve astronomical reach, while overlooking the misery that is monetised with harmful posts being saved and potentially 'binge watched' in their tens of thousands."
Suicide prevention charity the Molly Rose Foundation said it had found harmful content at scale and still prevalent on Instagram, TikTok and Pinterest.
It said that on TikTok, some of the most viewed posts referencing suicide, self-harm and highly depressive content had been viewed and liked more than one million times.
Last September, a coroner ruled that Molly, a schoolgirl from Harrow, north-west London, died in November 2017 from "an act of self-harm while suffering from depression and the negative effects of online content".
The charity's report was created in partnership with data-for-good organisation The Bright Initiative, and saw the Foundation collect and analyse data from 1,181 of the most engaged-with posts on Instagram and TikTok that used well-known hashtags around suicide, self-harm and depression.
It warns of a clear and persistent problem with readily available harmful content, noting that many of the harmful posts it analysed were also being recommended by the platforms' algorithms.
The report noted that while its concerns around hashtags were mainly focused on Instagram and TikTok, its concerns around algorithmic recommendations also applied to Pinterest.
The charity said it was concerned that the design and operation of social media platforms was sharply increasing the risk profile for some young people, given the ease with which they could find large amounts of potentially harmful content by searching for hashtags or by being recommended content along a similar theme.
It said platforms were also failing to adequately assess the risks posed by features that enable users to find similarly themed posts, and claimed that commercial pressures were increasing the risk as sites compete to grab the attention of younger users and keep them scrolling through their feeds.
Following the coroner's recommendations, Instagram's parent company Meta said it supported more regulation of social media.
In response, a Meta company spokesperson said: "We're committed to making Instagram a safe and positive experience for everyone, particularly teenagers, and are reviewing the Coroner's report.
"We agree regulation is needed and we've already been working on many of the recommendations outlined in this report, including new parental supervision tools that let parents see who their teens follow, and limit the amount of time they spend on Instagram.
"We also automatically set teens' accounts to private when they join, nudge them towards different content if they've been scrolling on the same topic for some time and have controls designed to limit the types of content teens see.
"We don't allow content that promotes suicide or self-harm, and we find 98 per cent of the content we take action on before it's reported to us.
"We'll continue working hard, in collaboration with experts, teens and parents, so we can keep improving."
Pinterest has said it will consider the recommendations made "with care".
A Pinterest spokesperson said: "Our thoughts are with the Russell family.
"We've listened very carefully to everything that the coroner and the family have said during the inquest.
"Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner's report will be considered with care.
"Over the past few years, we've continued to strengthen our policies around self-harm content, we've provided routes to compassionate support for those in need and we've invested heavily in building new technologies that automatically identify and take action on self-harm content.
"Molly's story has reinforced our commitment to creating a safe and positive space for our Pinners.”
The Sun has contacted TikTok for comment.