<https://www.theguardian.com/media/commentisfree/2019/jun/02/social-platforms-facebook-debate-regulation>
[...]
Facebook had fielded Monika Bickert to answer questions from Anderson Cooper on CNN about the video. Bickert, who heads Facebook’s product policy and counter-terrorism work, is its most thoughtful and knowledgeable executive. Her talking points were familiar: Facebook’s approach is to flag something as false and to reduce its reach through algorithmic throttling, but not to remove material altogether unless immediate harm flows from it. Hence terrorist content and scientifically inaccurate claims about vaccinations are gone, but Holocaust denial and a clearly misleading clip that affects the democratic process are not. “We are not in the news business,” Bickert told Cooper. “We are in the social media business.”
Engineers and lawyers in Silicon Valley are often mocked for their lack of facility with communication, but over the years they have been remarkably effective at constructing a whole language that casts a gauzy blanket of euphemism over a heap of ugly truths. They have eradicated all meaning and distinction between dramatically different social, cultural and political interactions by labelling them “content”, they have made the diseased metaphor of “virality” a virtue, and they have rebranded salespeople as “influencers”. Even the phrase “social media” is a flattering label that hides a multitude of sins.
To be clear, the business Facebook is in is advertising. And the “social media business” is not only, increasingly, the news business; it is also the politics business, the public health business, the terrorism business, the education business, the everything-in-bytes business.
In fact, the approach Bickert described, flagging material as false, reducing its reach and telling people who have seen it that it is wrong, is not a bad strategy; it exceeds the measures that many media outlets, who would readily criticise Facebook, take with their own material. It is a necessary measure but not a sufficient one. On other matters, Facebook is on shakier ground. The stated policy of its founder, Mark Zuckerberg, is that people should decide for themselves what they believe, up to and including Holocaust denial, a position arguably incompatible with removing material that creates the potential for immediate harm.
These are broad philosophical questions that cannot be reduced to rules for robots to execute. There is not yet a computational way to decide correctly that one slowed-down video is an attack on democracy and another is a piece of satire. What Facebook proposes, in part, is the assembly of an independent content body to advise on such matters. I am fairly sure that any diverse council would split over the Pelosi clip: editors, whose job is essentially to publish sparingly and to stop incorrect information reaching the public domain, would say take it down; those who think about free speech in a longer-term or more philosophical dimension would say keep it up.
Facebook, Twitter, YouTube and whoever else outsources decision-making on these issues will eventually have to internalise them too, and develop a recognisable culture, set of rules, appeals process and communications system capable of responding quickly to information crises. The disruption that even minor calls about malicious material create is too much for any corporate system to withstand unless it has been designed to deal with them. The inevitable outcome will be a narrowing of the type of material promoted and circulated.
The platforms are no longer innovators in terms of speech, publication and the public sphere; they are incumbents, gatekeepers, publishers, however you want to describe them. In other words, they are done with disruption, and will therefore inevitably become perpetuators of incumbency themselves. As the experiment of levelling access to tools and speech comes to a shuddering halt, the platforms will increasingly favour the establishment, whether that is a narrow band of mainstream media suppliers, a smaller number of high-follower influencers, a tiered system of political actors or a particular class of advertisers.
[...]