The dangerous apps parents NEED to know about as groomers target bored kids in lockdown
WHILE millions of Brits may have shut their doors in lockdown, an army of predators are finding dangerous new ways to target vulnerable children in their homes.
Shocking figures revealed by the NSPCC show that perverts are exploiting a flood of poorly designed mobile phone apps and websites to pursue abuse over live video streams and anonymous chat forums.
In new data released today, the charity found that 5,372 grooming offences recorded in the three years to this March took place largely on Facebook, Snapchat and Facebook-owned apps such as Instagram and WhatsApp.
Disturbingly, however, a quarter of grooming offences where the means of communication was known (1,677 cases) took place on lesser-known apps that parents may not even be aware of.
And child protection groups warn that with children bored in lockdown, more will be seeking out new platforms that allow them to meet strangers and make new friends.
Andy Burrows, head of child safety online policy at the NSPCC, says more must be done to avoid a repeat of the spring, when cases of online child abuse rose a record 400 per cent worldwide.
“What we saw was a perfect storm of online child abuse,” he tells Sun Online. “Lots of the big platforms had to scale back their moderation resources, more children were spending time online and at the same time abusers saw this as an opportunity.
“Earlier in the year was the highest risk for online child abuse that the NSPCC has ever seen.
“We’ve had years of failed self-regulation and a failure to consider child safety in how services and games are designed and run.”
Here, we reveal some of the apps children and parents may not know about that could be exploited by vile predators.
Omegle
The NSPCC and O2's joint online safety site highlights Omegle, a free chat website that lets unverified users talk one-on-one with strangers, as particularly dangerous.
“Some of the smaller apps – Omegle is a very good example – have extremely high risk functionality,” says Burrows.
“They use things like live streaming and video chat functions, and seemingly have very poor moderation arrangements.”
Specifically, he points out that the site gives users the ability to filter potential chat partners by interest.
“It’s a very poor design choice which groomers can exploit, because not only can they talk to children, they can talk to them based on their interests,” he says.
“We know that very often how the grooming process works is that you’re able to develop a relationship with children and earn their trust – here, not only can you talk on a live stream, but you can identify what their passions and hobbies are.”
Earlier this year, Davina McCall issued a warning to fans after coming across explicit content on the site.
The TV presenter said she went on Omegle with her daughter Tilly, 16, to see what it was like — and within minutes men sent images of themselves performing sex acts.
“I sat down off camera, she was on camera – out of the five people that she had, bearing in mind it’s 10 o’clock in the morning, two of them were just pictures of men’s groins, with a s****y, w***ing,” Davina said.
“It was the most explicit, horrific thing I have ever… I was like, ‘Oh my God.’
“Obviously Tilly thinks it’s really funny, but if it was a 14-year-old girl I would be genuinely really, really worried.”
Kik
Kik has been described as one of the most worrying apps on the market, and was the subject of a 2018 BBC report into its alleged links to child abuse cases.
While it operates similarly to many messaging apps, its ‘Meet New People’ feature lets users chat anonymously to random people around the world.
Last year, the child protection head for the National Police Chiefs’ Council (NPCC) Chief Constable Simon Bailey said “children are at risk” on the app because it allows users to join anonymously without giving a phone number.
The number of referrals police received connected with the app over child abuse images rose from a monthly average of 48 in 2016 to 195 in 2018.
Earlier this summer, Graham Mead was jailed for 30 months at Guildford Crown Court for a number of offences, including making and taking indecent images of children, after 2,500 Kik chat logs were found on his phone.
KidsChat
Online chat forum Kids Chat was also singled out by the NSPCC as particularly open to abuse.
Although designed exclusively for teenagers to talk to each other, the site requires no proof of age and users can sign up anonymously, allowing predators to easily rejoin even after being banned.
One 12-year-old girl recently told Childline how she was bombarded with explicit images after signing up to the site.
“I am 12 and I don’t have social media, but I wanted to get online and chat to people since my friends had done it and told me it would be fun,” she said.
“It started off fine with the occasional ‘hi’ and then men started sending d*** pics and saying really personal things.
“I haven’t told anyone because if my parents found out they’d both freak out.”
Whisper
Whisper is a ‘secret sharing’ app where users can anonymously post intimate confessions for strangers to read.
However, it has been criticised over its location settings, which allow users to locate people by their area, making it easier for predators to lure kids into meeting them in real life.
Paedophile hunters snared one offender after tricking him on Whisper.
Matthew Jack, who had been released from prison just weeks earlier following a similar sting, arranged to meet what he thought was a 15-year-old girl in Gateshead, but had in fact been talking to the vigilante group Dark Justice.
MeetMe
It’s marketed as a way to ‘meet, chat, and have fun with new people’, but MeetMe has been frequently described as a dating app posing huge risks for kids.
No proof of identity is required at sign-up, meaning both teenagers and predators can easily falsify their age.
The app also has a “Match” feature where users can “secretly admire” others, and allows them to search locally and meet up with each other in person.
MeetMe was just one of 15 increasingly popular apps highlighted by police last year in a warning to parents.
A police spokesman said: “While some of these are completely innocent and we hope your children never experience bullying, harassment or exploitation, time and time again we see people use these apps to target, abuse and manipulate young people – be that sexual, criminal or online bullying.”
Tellonym
Tellonym is one of a number of anonymous Q&A apps, where users can post ‘truthful’ statements about their life and ask for feedback.
In 2018, it was reported that two schools had issued warnings to parents over the app, raising concerns about cyberbullying and abuse.
They claimed the app allowed “inappropriate postings, comments and photographs which have caused upset and distress to young people”.
The schools added that the police had become involved over a number of cases.
In response, Tellonym said it was “held to high standards by Apple and, by default, Google” and directed users to its safety guidance.
Ghislaine Bombusa, head of digital at Internet Matters, says children are seeking out apps like Tellonym as they strive for “honest feedback” on important issues in their lives.
“For us, it’s about helping parents to advise their children on where to get that advice safely – whether that be Childline or other organisations.
“A lot of the time, it’s the desire to get advice without a sense of fear or shame that is driving kids to use these apps, which might be more risky than others.”
Monkey
The Monkey app, which is pitched at teenagers, lets users talk face-to-face with random people for 15 seconds and add extra time to the call if they both agree.
In an investigation, CNBC claimed the app shared many worrying similarities with the website Chatroulette, which has been criticised for allowing perverts to easily connect with young people on randomised video calls.
The news site said that among the 25 Monkey users its journalist chatted with, one was engaging in a sexually explicit act while another showed genitals.
Co-founder Isaiah Turner told the site in response to the problems: “Monkey is taking this seriously and being proactive to solve it.”
How to keep your kids safe online
Online platforms can be dangerous for under-18s if their use is not supervised, as many criminals target children online. Security experts at Specops Software have provided a guide on how to protect your children online.
Discuss online safety from an early age
You need to educate both yourself and your child about the different types of dangers they could be facing online, how to spot them and what to do if they’ve been targeted. It is very important that you teach them to be open with you and inform you if they feel like something is wrong.
Know your child’s behaviour
When children become victims of online bullying or other kinds of threats, you will be able to notice a change in their behaviour if you pay attention. Are they more apathetic? Are they in a bad mood after using their device? These are the things that parents need to keep an eye on.
Set rules and always stick to them
As an adult, you should be in control. Set boundaries on what your child can and cannot watch online. Make sure you take steps like talking to your internet provider to block any pornographic content. It is also important to limit their screen time.
Use parental blocks
Most internet providers offer parental controls to help you keep your children protected. It’s important not to neglect these and to make good use of them. They will save you the time and stress of constantly checking certain sites, which will be blocked automatically, and can ensure your child spends only a set amount of time on their device.
‘Do your research before it becomes an issue’
The NSPCC says the responsibility must be with tech companies, not just parents, to ensure children stay safe online.
The charity is calling on the Government not to water down the upcoming Online Harms Bill, which proposes an independent regulator to hold tech companies like Instagram, Snapchat and TikTok to account for neglecting child safety on their platforms.
The white paper, which could mean hefty fines if harmful material is not removed swiftly, was unveiled last year following the death of 14-year-old Molly Russell, who killed herself after viewing online images of self-harm.
However, Bombusa says there are still steps parents can take to ensure their children know the risks of exploring social media.
“What we say to parents is to talk to your child about what they’re doing and if you hear about an app you don’t know, do a bit of research,” she says.
“Have that knowledge and conversation ahead of time, even before they’ve decided to use it, so you can prevent a situation where the child’s downloaded it, seen something inappropriate and then it becomes an issue.”