A TROUBLED teen carves into her bloodied arm, a suicidal brunette screams into the camera and a girl slashes her sweet smile with a blade.
These are just a few of the sickening posts on Instagram that a Sun Investigation found with just a few clicks.
This week devastated father Ian Russell slammed the social media giant — owned by Facebook — for contributing to the suicide of his 14-year-old daughter Molly.
Molly, who had shown “no obvious signs” of serious mental health problems, was found dead in her bedroom in November 2017.
After her death her family found out she had been viewing scores of Instagram posts normalising and even romanticising self-harm and suicide.
Mr Russell, of Harrow, North West London, said: “We are very keen to raise awareness of the harmful and disturbing content that is freely available to young people online.
How to get help
Childline: 0800 1111 or visit
Samaritans: 116 123 call FREE from any phone, even a mobile without credit, or email [email protected] or visit
“Not only that, but the social media companies, through their algorithms, expose young people to more and more harmful content, just from one click on one post.”
Even a quick search on Instagram confirms Mr Russell’s fears.
And the language users adopt on the site is designed to fool any worried parent trying to keep an eye on their child’s wellbeing.
Hashtags of girls’ names used on the site might appear innocuous, but are actually code names for severe mental health issues.
TROUBLED TEENS WALLOW IN HATRED OF THEMSELVES
Ana stands for anorexia, Annie means anxiety, Bella means borderline, Sophie means schizophrenia and Sue stands for suicidal.
Sickeningly, thousands of images and videos from children’s cartoons are being used on Instagram to glorify anxiety, depression, self-harm and suicidal behaviour.
There is also a wealth of posts bearing the hashtag #sadsimpsons, for example. Parents who check their children’s search history are unlikely to be worried by this — but the content is terrifying.
In one bleak black and white video, a sad Homer Simpson has a noose around his neck, next to the hashtags #hatemyself and #ritzen — a German word for self-harm.
In another, Homer is seen plunging from a high-rise building.
And in a sick take on AA Milne’s beloved book characters, Eeyore’s grey body is seen hanging from a tree as Winnie the Pooh and Tigger stare on. The caption reads: “I can’t believe he actually killed himself.”
And while searches for #selfharm or #suicide will bring up a warning screen to deter users from viewing images, users can simply press “view images anyway”.
Once past the warning, users face an onslaught of gory images and videos resembling scenes from horror movies. Except these are all real, and many have garnered dozens of “likes” and appreciative comments.
There are also memes featuring text with grim, hopeless messages including: “People told me, ‘Just kill yourself’. I’m trying” and “How to kill yourself”.
The British trade body for advertisers, ISBA, has raised concerns about adverts appearing alongside Instagram posts.
Instagram has more than two million advertisers including brands such as H&M, Deliveroo, Nike, Domino’s and Sainsbury’s.
The minimum age to sign up to the site is 13 years old, but this is impossible to enforce. And while Instagram claims to be diligent about removing graphic posts, some of the worst we saw have been up for at least ten days.
The online comments are even more disturbing. Troubled teens wallow in self-hate, calling themselves ugly, fat and unlovable. One girl wrote under a video of a noose: “I wish I could hang myself but I am so fat I can’t even do that.”
Under her comment, a poster wrote: “Dying is the answer. People making fun of u. I legit have tried and I’m 12. Life is bad.”
Many posts from those in distress draw comments designed to humiliate or upset them even further, in a practice known as “roasting”.
How to protect your children
CHILDLINE offers plenty of advice and information for parents about tackling online problems their children may face. It has devised “TEAM” – a four-point strategy which can help you discuss any internet issues with your kids:
TALK to your child regularly about what they are doing online and how to stay safe. Let them know they can come to you or another trusted adult if they are feeling worried or upset by anything they have seen.
EXPLORE your child’s online activities together. Understand why they like using certain apps, games or websites and make sure they know what steps they can take to keep themselves safe.
AGREE your own rules as a family when using sites, apps and games.
MANAGE your technology and use the privacy and parental control settings available to keep your child safe.
If you are worried that your child may be self-harming:
TALK to your GP: They can treat their injuries and refer your child to specialists such as therapists who will work with your child to discuss their thoughts and feelings and how this is affecting their behaviour.
SPEAK to your child’s school: The person in charge of child protection for the school should be able to provide a named member of staff who your child can go to if they are struggling with low mood or wanting to harm themselves.
TELL your child about Childline: Childline’s free 24/7 service allows young people to talk to specially trained counsellors about the emotions they may be feeling.
Now Molly’s family is campaigning for social media sites to review content and make it harder for teens to view damaging content.
The teen’s story has chilling similarities to the case of Milly Tuomey, 11. Before taking her own life in January 2016, the Dublin youngster scrawled “beautiful girls don’t eat” across her body in pen and posted haunting diary entries on Instagram detailing her plan to die.
Milly’s mother Fiona Tuomey, who founded the Healing Untold Grief Group, told The Sun: “Suicide is a complex issue which cannot be attributed to just one factor.
“Social media is an integral part of young people’s communication.
“It’s time governments held these global companies to account. Redirecting people to help sites is simply not good enough.
“The social media giants have the power and technology to stop this.”
The UK has the highest self-harm rate of any country in Europe — and the majority of those affected are aged between 11 and 25.
‘SEEING OTHERS SELF-HARM NORMALISES IT’
Instagram addict Nicole Simone, 21, a barmaid from Dover, Kent, began self-harming at 13 and blames social media for making her mental health issues worse.
She said: “I follow some really dark accounts and have looked at self-harm posts. It just makes my mental state worse and pushes me to want to hurt myself. Seeing other people hurting themselves normalises self-harm.”
Meanwhile in York, psychology student and recovering anorexic Talia Sinnott, 21, said: “Instagram only encouraged my negative thoughts, progressing my illness to the point where I was hospitalised, weighing less than 6st.
“I was naive and clueless and came across ‘pro anorexia’ pages. They taught me how to cheat my parents into thinking I was fine.”
What the NSPCC demands
THE NSPCC’s Wild West Web campaign is demanding that the Government regulates social media to make the internet safe for young people. It wants:
A REGULATOR to hold social networks to account
REPORTS by social networks on the risks on their sites
POWERS to force social networks to tackle grooming
Children and young people with worries can contact the free and confidential Childline 24 hours a day, 365 days a year on 0800 1111 or visit childline.org.uk
Andy Burrows, NSPCC’s associate head of child safety online, told The Sun yesterday: “We call on the Government to introduce new laws that force social networks to protect children from harmful content and abuse online, and to fine them when they fail.”
Last night Instagram launched an investigation into The Sun’s findings — although a spokeswoman insisted some of the images could be BENEFICIAL to vulnerable users.
She said: “We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and will remove it.
“Mental health is a complex issue and we work closely with experts who advise us on our approach.
“They tell us the sharing of a person’s mental health journey or connecting with others who have battled similar issues can be an important part of recovery.
“This is why we don’t remove certain content and instead offer people looking at, or posting it, support messaging that directs them to groups that can help.”