APP SHAME

Instagram blasted over dangerous ‘eating disorder hashtags’ after AI detection system failed

A machine learning gaffe left at-risk Instagram users exposed to dangerous hashtags

INSTAGRAM has been criticised after an artificial intelligence gaffe meant shocking hashtags promoting eating disorders were easily accessible.

Almost a dozen hashtags being used to share “unhealthy and dangerous attitudes towards food and body image” were found circulating on the site – without any warnings.

Alamy
Instagram has been accused of failing to protect its most vulnerable users

When you search for terms like “anorexia”, “proanorexia” or “bulimia” on Instagram, you’ll be met with a warning message asking if you need help.

This policy has been in place since 2016, to protect users looking for sensitive topics on the site.

But an investigation by Sky News uncovered similar hashtags being used to share rogue content, which had slipped through Instagram’s filter systems.

“The hashtag search terms were slight variations or different spellings of others that have been flagged up,” the report explains.

Getty - Contributor
Hashtags used to promote dangerous content around eating disorders were freely accessible in the Facebook-owned app

Speaking to Sky News, Daniel Magson, a former bulimic and vice chair of Anorexia & Bulimia Care charity, said: “It is incredibly dangerous and a real health risk.

“It’s not a safe space at all and these communities are promoting things like ‘these are the best places to dine with private toilets for afterwards’.

“They promote the best ways to injure or self-harm and that should not be allowed.”

Normally searching for terms around eating disorders launches a pop-up.

It reads: “Can we help? Posts with words or tags you’re searching for often encourage behaviour that can cause harm and even lead to death.

“If you’re going through something difficult, we’d like to help.”

Users can then choose to “See Posts Anyway”, or select “Get Support”.

The latter takes you through to a page that recommends talking to a friend, chatting with a helpline volunteer, or getting tips and support on how to “support yourself”.

Instagram has been using artificial intelligence and machine learning systems for the last six months to root out rogue hashtags relating to sensitive topics.

The systems find content that’s likely to be sensitive by comparing posts and hashtags against material already identified as sensitive. Matching posts and hashtags are then flagged up to Instagram’s moderators.
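The report doesn’t detail how Instagram’s systems work internally, but the basic idea of catching slight variations or alternative spellings of blocked terms can be illustrated with a simple fuzzy-matching sketch. Everything here is illustrative – the blocklist, the character substitutions and the similarity threshold are assumptions, not Instagram’s actual setup:

```python
from difflib import SequenceMatcher

# Illustrative blocklist -- not Instagram's real one.
BLOCKED_TERMS = {"anorexia", "proanorexia", "bulimia"}

def normalize(hashtag: str) -> str:
    """Strip the '#', lowercase, and undo common character swaps (e.g. '0' -> 'o')."""
    subs = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "@": "a"})
    return hashtag.lstrip("#").lower().translate(subs)

def looks_blocked(hashtag: str, threshold: float = 0.85) -> bool:
    """Flag hashtags whose normalised form closely resembles a blocked term."""
    tag = normalize(hashtag)
    return any(
        SequenceMatcher(None, tag, term).ratio() >= threshold
        for term in BLOCKED_TERMS
    )

print(looks_blocked("#anorexia"))   # exact match -> True
print(looks_blocked("#an0rexiaa"))  # variant spelling -> True
print(looks_blocked("#fitness"))    # unrelated -> False
```

A real system would combine signals like this with the post’s text, images and reporting history, which is presumably why simple spelling variants were still able to slip through for a time.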

According to Sky News, Instagram described these systems as a “work in progress”, and has now added warnings to the offending hashtags.

Supporting someone with an eating disorder

Here's the official advice from the NHS

  • If your friend or relative has an eating disorder, such as anorexia, bulimia or binge eating disorder, you will probably want to do everything you can to help them recover.
  • You’re already doing a great job by finding out more about eating disorders and how to try to support them – it shows you care and helps you understand how they might be feeling.
  • Getting professional help from a doctor, practice nurse, or a school or college nurse will give your friend or relative the best chance of getting better. But this can be one of the most difficult steps for someone suffering from an eating disorder, so try to encourage them to seek help or offer to go along with them.
  • You can support them in other ways, too:
  • Keep trying to include them – they may not want to go out or join in with activities, but keep trying to talk to them and ask them along, just like before. Even if they don’t join in, they will still like to be asked. It will make them feel valued as a person.
  • Try to build up their self-esteem – perhaps by telling them what a great person they are and how much you appreciate having them in your life.
  • Give your time, listen to them and try not to give advice or criticise – this can be tough when you don’t agree with what they say about themselves and what they eat. Remember, you don’t have to know all the answers. Just making sure they know you’re there for them is what’s important. This is especially true when it feels like your friend or relative is rejecting your friendship, help and support.

In a statement, an Instagram spokesperson said: “We care deeply about making Instagram a place where people feel empowered, inspired and comfortable to express themselves.

“Every day, millions of people use Instagram to strengthen relationships with friends and build communities of support, particularly around body image.

“Instagram was created to foster a safe, kind and supportive community and we’re committed to keeping it so.”

In a separate statement sent to The Sun, an Instagram spokesperson said: “We care deeply about the wellbeing of people who use Instagram and do not tolerate content that encourages eating disorders.

“We urge people who see this kind of content to use our in-app reporting tools so we can swiftly review it and we prioritise all reports related to eating disorders.”

They continued: “We also recognise this is a complex issue and we want people struggling with their mental health to be able to access support on Instagram when and where they need it.

“We therefore go beyond simply removing content and hashtags and take a holistic approach by offering users looking at or posting certain content the option to access tips and support, talk to a friend or reach out directly to PAPYRUS UK or the Samaritans.

“Experts we work with tell us that communication is key in order to create awareness, and that coming together for support and facilitating recovery is important.”

Content that promotes eating disorders is against Instagram’s Community Guidelines, and Instagram says it is removed when found.

Instagram says it works with relevant charities, and uses a mix of user-reporting and computerised flagging systems to quickly find inappropriate content.

Instagram’s eating disorder gaffe mirrors a similar hashtag blunder uncovered by The Sun earlier this year.

We exposed secret Instagram sex hashtags that were being used to share hardcore porn videos around the Facebook-owned app.

The Sun found shocking videos depicting full sex with genitals in clear view, while others showed oral sex or masturbation.

One clip even showed a bestiality scene involving an adult woman and a horse – which is illegal to distribute in the UK.

Others didn’t necessarily depict nudity, but included male ejaculation or close-up crops on hardcore sex scenes – leaving genitals just out of shot.

Importantly, Instagram is aimed at users aged 13 and over, so this content was highly inappropriate.


Speaking to The Sun at the time, Andy Burrows, the NSPCC’s Associate Head of Child Safety Online, said Instagram wasn’t doing enough to protect kids.

“Instagram’s rules ban pornography but clearly its moderation systems aren’t removing content that it should. Instagram should proactively filter out content which breaks its own rules.

“Young people on Instagram should never be exposed to this kind of adult content, some of which includes bestial themes.

“Following the NSPCC’s Wild West Web campaign, Government announced it will bring in new safety laws for social networks. The new Digital Secretary Jeremy Wright must make sure these laws are fit for purpose, and are backed by an independent regulator with teeth.”

Facebook, which owns Instagram, is doubling the number of people working across safety and security to 20,000 by the end of 2018, including a team of 7,500 content reviewers.

For more information about anorexia and getting help, visit .

Do you think Instagram needs to do more to clean up its act? Let us know in the comments!



