Google Image ‘safe’ search shows PORN STARS when you search for ‘hot’
The machine learning gaffe has led to children being exposed to very inappropriate pictures
GOOGLE is almost exclusively showing pictures of women in their underwear when you search for the term "hot", which campaigners have called "concerning".
This happens even when Google's Safe Search is turned on, which the site says should "filter explicit results".
You'd expect that Google Images would show pictures of sunny beaches, ice creams, and roaring fires when searching for the term "hot".
But instead, users are greeted with racy pics of near-nude women, and even photos of former porn star Mia Khalifa and active adult actress Samantha Saint.
The issue came to light after a school worker posted about the problem on Google's official Search Help Forums, in a thread spotted by the Search Engine Roundtable site.
"We are a primary school with students aged between 5 years and 12 years old," wrote educator Shona Poppe.
"We had an 8-year-old search 'hot' as a query to translate the word into Māori.
"The child ended up with a complete page of soft pornography, even though we have Safe Search turned on via our filtering service and on our computers.
"Is there some way we can make this more child friendly?"
The Sun was able to confirm the issue, which presented itself even with Safe Search turned on – and in Incognito Mode.
It could be linked to recent changes to Google's Image search system.
The company completely revamped Google Images yesterday, as part of its 20th birthday celebrations.
One Twitter user noted: "Google has been making [changes] to their image search engine.
"Now, when you go to Google Image Search and search for the term 'hot' - the only images that come up are pictures of women in underwear, lingerie, swimsuits etc.
"No photos of hot coffee, desserts, fireplaces. Whoops."
When you search for the term "hot", Google now also offers recommended searches – based on the original term.
The vast majority of these link out to even more inappropriate content.
One prominent recommendation is for "Croatian", which leads you to a page filled with Croatian women in their underwear.
Another is "Disney" – a term that could easily be clicked by a child – which takes you to a page filled with sexualised images of Disney characters.
Google's Safe Search is designed to protect children.
It uses machine learning technology and computer vision to work out what inappropriate pictures look like.
It then applies that to other images, tagging them as inappropriate and hiding them from Safe Search results.
But it's clearly failed in this instance, and could put children at risk.
"We know that innocent online searches can sometimes lead to not-so-innocent results, but it’s concerning if children are stumbling upon semi-nude images even with safe filters switched on," an NSPCC spokesperson told The Sun.
“Parents and schools deserve peace of mind that children are safe to go online and that’s why the Government needs to introduce tough regulation to hold tech giants to account.”
Google Safe Search – how does it work?
Here's what Google has to say...
"You can filter explicit search results on Google, like pornography, with the Safe Search setting."
"Safe Search isn't 100% accurate. But it can help you avoid explicit and inappropriate search results on your phone, tablet, or computer."
"When Safe Search is on, it helps block explicit images, videos, and websites from Google Search results."
"When Safe Search is off, we'll provide the most relevant results for your search and may include explicit content when you search for it."
"We do our best to keep the Safe Search filter as thorough as possible, but sometimes explicit content, like porn or nudity, makes it through."
Worse still, the majority of women depicted in the "hot" search results were white or very light-skinned.
Google has been accused of showing "racist" image results before.
For instance, a TIME article written in March this year complained that Google only showed pictures of white people when you searched for "professor style". This is still largely true today.
And in 2016, Twitter users complained about Google showing mugshots in search results for "three black teenagers", but fun stock images for "three white teenagers".
If you search for the terms "doctor" or "lawyer", you're also presented almost exclusively with pictures of white people.
We've asked Google for comment and will update this story with any response.
Do you think Google needs to do more to protect users? Let us know in the comments!