The disturbing role Dublin plays in turning a blind eye to hate speech

According to its founder, Mark Zuckerberg, Facebook has an ambitious mission: to connect the world.

Its business model, however, is a little cruder: connecting the world to advertisers. It achieves this with a largely libertarian, lightly policed approach to the content that world chooses to share, cracking down on images of nudity, say, while leaving hate speech largely unchecked. Even to the casual observer, this has long seemed less like consistent terms of service than the quirks of a faulty algorithm, which is also a polite way of describing Mark Zuckerberg.

Facebook's content scrubbers split hairs over what counts as offensive, using fascinatingly flimsy justifications

In practice, Facebook employs people to review any content its users report as objectionable, and gives them a policy to help decide whether to ignore it, delete it, or mark it as disturbing – thus prefacing it with a warning. That last option is abbreviated to MAD, which makes it all the stranger to hear a distorted but unmistakable Dublin voice referring to "the MAD stuff" – a category that may include a video of a man eating live baby rats, or of another man violently beating a small child.

Why isn't such material automatically removed? asks Inside Facebook: Secrets of the Social Network (Channel 4, Tuesday, 9pm), an undercover exposé of Facebook's content review operation, based in Dublin. "For a better user experience," comes the answer, in numbing corporate jargon. Forget the Nuremberg defence. They're only following policy.

With hidden cameras planted inside a corporate training module, Facebook's content scrubbers are seen splitting hairs over whether material can be considered offensive, using fascinatingly flimsy justifications. A video of a brutal beating, for example, can be ignored if it carries a caption condemning the beating – for how could anyone take pleasure in something that has been publicly deplored?

"The extremes are the most valuable," says Roger McNamee, an early investor turned sceptic. Disturbing content is more likely to be shared, whether in support or in distress, thus reaching more eyes, which in turn meet more advertising.

If Facebook is serious about "the experience", it seems odd to grant "shielded" status to extremely popular users

Richard Allan, Facebook's vice-president of global policy, protests that extreme content "is not the experience we are trying to provide", and nobody would doubt him, but the documentary illustrates the grave consequences of this radically desensitised approach. That video of a two-year-old child being physically abused, for example, is still online six years later, under the defence of "raising awareness". (A psychologist flatly rejects the same argument when it is used to leave images of self-harm online.) The NSPCC's response is more unanswerable: "This child is being re-abused with every click."

If Facebook is serious about "the experience", it seems odd to grant "shielded" status to extremely popular users, such as the notoriously racist Britain First, which had about a million followers before it was finally removed from the platform. "Obviously they have a lot of followers, so they're generating a lot of revenue for Facebook," says that Dublin voice again.

Facebook should mark those words, and this disturbing programme, as just that – MAD stuff born not of some loosely defined concept of free speech, but of a gutless corporate culture that turns a blind eye to the darkest parts of the world it connects.

Dislike.
