Facebook’s naivety over political content ignores bigger assaults on democracy
Facebook has 2.6 billion users, give or take, which, if they all lived in the same country, would make it the biggest nation on earth.
So, in some ways, it’s not a big surprise that the governance of Facebook judders along with the oddities of a rather badly run country.
It’s an organisation that runs itself with the same tech lag you’d expect from a sluggish, analogue government.
It is constantly catching up with the habits of tech users, such that by the time it makes new rules, those people have already moved on, and it's behind again.
You’d think they’d know better, yet it’s not a big surprise that Sir Nick Clegg is involved.
He is, of course, the very personification of political honesty and transparency and now a Silicon Valley Titan at Facebook.
This week he led the announcement of how the social media giant is going to deal with the issue of political interference on the platform ahead of the November US elections.
Forgive us for being underwhelmed.
For one thing, decisions made under pressure and in haste at a corporate HQ are rarely the smartest.
The move was designed to be seen as constructive positioning ahead of the US election campaign, even though it is, to all intents and purposes, already underway.
And it’s an urgent and speedy reaction to the issues of the last election in… erm… 2016.
So it's hard not to speculate that the timing and content were triggered by an open letter at the start of the month from dozens of former Facebook employees, criticising the way the company (hasn't) handled political inaccuracies on its platforms.
“Facebook’s leadership must reconsider their policies regarding political speech, beginning by fact-checking politicians and explicitly labeling harmful posts,” the letter suggested.
It wasn't all bad as Clegg did the rounds – promising that his employers will put even greater effort into their already effective tools to drive voter registration now, and voter turnout during the election itself.
That’s a good thing.
The second part of the company's answer was to allow users to turn off all social issue, electoral and political ads – removing them from all Facebook products, including Instagram.
In other words, place the emphasis on the user, not the platform (when was the last time you turned off any sort of content feed on there?), while continuing to take the money for the ads in the first place.
They will show who paid them for a political campaign, and will begin tracking ad spend on a candidate-by-candidate basis, to "help you understand how much advertisers and candidates are spending to reach voters".
Which is all well and good, but rather sidesteps the problem.
In a perfect demonstration of that tech lag, the announcement came in the same week that the US research company Graphika 'revealed' Secondary Infektion: a massive misinformation campaign with "high operational security abilities, deep resources and strategic patience".
The campaign was a six-year, Russian-coordinated attempt to create tensions between European countries and the US, delivered in seven languages on 300 different online platforms – from Facebook, YouTube and Twitter down to fringe blogging sites.
And, famously, on Reddit, where Jeremy Corbyn was dragged in, rather willingly.
It was big and it was effective (have you noticed any political chaos recently? A teensy bit?) and it has absolutely nothing to do with verified ads and transparent funding.
Facebook’s willingness to target the visible problem is fine – Twitter have done the same with their half-hearted fact-checking on Donald Trump – but it’s like tackling an ice cube and ignoring the iceberg.
Effective disinformation doesn't come from the top down, from ads by recognisable groups. It comes from the ground up: disinformation and conspiracy theories on small blogs and sites, which are then artificially inflated by coordinated and well-resourced campaigns (the sort available to a large country at the far end of Europe).
That’s the kind of disinformation that looks legitimate and ‘organic’ – and can be effectively targeted to groups and demographics who are willing to accept such ‘facts’.
While Secondary Infektion might have finished, it would be extraordinarily naive to think that the skills and resources which went into it would not be redeployed in the forthcoming election in the most powerful nation on earth.
Clegg asserted that Facebook have learned a lot since the 2016 presidential election and the 200 elections across the world since then, and that their moderation teams have been significantly boosted.
But they do seem to have rather missed the point – and ignored the fact that significant numbers of those moderators wanted to walk out of the company over Black Lives Matter only last week.
Social media will be a major battleground for both parties in the US elections, but also for external, malicious agencies.
Nothing that has been announced this week indicates anything other than at some point in 2021, Clegg will be talking about ‘the lessons we have learned’ from 2020’s forthcoming assault on democracy.