Long read in the New Yorker:
"There are reportedly more than five hundred full-time employees
working in Facebook’s P.R. department. These days, their primary job is
to insist that Facebook is a fun place to share baby photos and sell old
couches, not a vector for hate speech, misinformation, and violent
extremist propaganda. In July,
Nick Clegg,
a former Deputy Prime Minister of the U.K. who is now a top flack at
Facebook, published a piece on AdAge.com and on the company’s official
blog titled “Facebook Does Not Benefit from Hate,” in which he wrote,
“There is no incentive for us to do anything but remove it.” The
previous week, Guy Rosen, whose job title is vice-president for
integrity, had written, “We don’t allow hate speech on Facebook. While
we recognize we have more to do . . . we are moving in the right
direction.”
(...)
It would be more accurate to say that the company is moving in several
contradictory directions at once. In theory, no one is allowed to post
hate speech on Facebook. Yet many world leaders—
Rodrigo Duterte, of the Philippines;
Narendra Modi, of India;
Donald Trump;
and others—routinely spread hate speech and disinformation, on Facebook
and elsewhere. The company could apply the same standards to demagogues
as it does to everyone else, banning them from the platform when
necessary, but this would be financially risky. (If Facebook were to ban
Trump, he would surely try to retaliate with onerous regulations; he
might also encourage his supporters to boycott the company.) Instead,
again and again, Facebook has erred on the side of allowing politicians
to post whatever they want, even when this has led the company to weaken
its own rules, to apply them selectively, to creatively reinterpret
them, or to ignore them altogether.
(...)
In retrospect, it seems that the company’s strategy has never been to
manage the problem of dangerous content, but rather to manage the
public’s perception of the problem. In Clegg’s recent blog post, he
wrote that Facebook takes a “zero tolerance approach” to hate speech,
but that, “with so much content posted every day, rooting out the hate
is like looking for a needle in a haystack.” This metaphor casts
Zuckerberg as a hapless victim of fate: day after day, through no fault
of his own, his haystack ends up mysteriously full of needles. A more
honest metaphor would posit a powerful set of magnets at the center of
the haystack—Facebook’s algorithms, which attract and elevate whatever
content is most highly charged. If there are needles anywhere
nearby—and, on the Internet, there always are—the magnets will pull them
in. Remove as many as you want today; more will reappear tomorrow. This
is how the system is designed to work.
Ciao,
F.
--
http://www.forbes.com/sites/federicoguerrini/
http://reutersinstitute.politics.ox.ac.uk/publication/newsroom-curators-and-independent-storytellers
My latest book: Content Curation (Italian)
http://www.amazon.it/Content-Curation-Federico-Guerrini/dp/8820366126