Google employs ZERO staff in the UK to monitor hate-filled content which spreads across the web
Senior MPs said the web giant's response was 'not good enough' as it emerged only 200 Google staff work on removing hate posts
GOOGLE was blasted tonight after revealing it has just 200 directly employed staff monitoring content on YouTube – and none in the UK.
The cross-party Home Affairs Select Committee said the staggering figures helped to explain why so many hate-filled videos were still online.
Labour chair Yvette Cooper demanded the web giant detail its staffing numbers after asking why it had taken so long to remove propaganda posted by far-right extremists National Action.
In a reply released today, Nicklas Lundblad, vice-president for public policy in Europe, Middle East and Africa, said there were 200 Google staff across the US, Ireland, Singapore and India.
He insisted a further 4,000 agency staff “worked on content moderation” for the company – based with outsourcing giants such as Accenture and Concentrix in countries from Poland to the Philippines.
But Ms Cooper stormed: “Google’s response just isn’t good enough. This incredibly rich and powerful global company has a huge responsibility to stop its platforms being used for crime, extremism and damage to young people.
“Yet in most cases it doesn’t even employ its own staff to work on tackling illegal or abusive content, it contracts the problem out to others instead.”
Ms Cooper demanded answers in March when YouTube failed to remove four propaganda videos posted by National Action.
Google’s counter-terror chief William McCants at the time said the videos had been flagged but its review team had wrongly decided not to remove them.
But the committee chair noted that MPs had reported the same 2016 National Action speech eight times to YouTube over the past year – after spotting that it had been reposted.
In the same letter, Mr Lundblad said four “reviewers” had been sent on training courses.
“With an open platform such as YouTube, we can never promise that content which violates our policies or is illegal will not be uploaded.
“But our systems have delivered demonstrable progress on getting violative videos down quickly, often upon upload or before they’ve received many views.”
But in a statement Ms Cooper last night said: “We raised those illegal videos repeatedly over twelve months with Google and YouTube top executives, yet we still found them on the platform.
“Google have already admitted to us that their content moderators weren’t sufficiently sensitive to far-right extremism and terror threats in the UK.
“Now we learn why, if none of them are based here.”