Facebook: It seems even the company does not know how its moderation system works | Trade | Technology and Science | Social Networks




Facebook has been criticized for how it handles pages on its platform that thrive on speculation and misinformation, such as the conspiracy-theory website Infowars and others that spread fake news. But even after punishing its creator, Alex Jones, the social network has not escaped criticism. According to the site "TechCrunch", it is as if Facebook did not even know how its own content moderation system works.

The company removed four videos from Jones' Facebook profile and from the Infowars page for violating its community rules. This is Facebook's first move to curb the reach of this figure, who is responsible for spreading "alternative theories" (not to say deliberately foolish and malicious ones) about events such as September 11 or the San Bernardino shooting.

The action would seem to work in the social network's favor. But for the first six hours after it happened, it was as if Facebook itself did not know what it had done. The website "Cnet" reported that Jones had been suspended for 30 days for publishing four videos deemed inappropriate for the platform. But when "TechCrunch" contacted the social network, it was told that Jones had "only" received a warning, and that only a second warning would trigger a suspension.

After several hours and an exchange of e-mails, Facebook confirmed what "Cnet" had reported: Jones is suspended, and Infowars has only been warned. In short, the company eliminated the message, but it did not get rid of the messenger.

The initial inconsistency was due to the fact that Jones' personal account had already been warned in the past, which is why he was suspended for 30 days, according to Facebook. This, however, is the first time Infowars itself has been flagged for a violation. But the importance of the issue goes beyond this decision.

Jones' suspension sets an important precedent. Beyond Infowars, there are many sites on Facebook and across the Internet that spread misinformation and play on insecurities, nationalist pride and misinterpretation to profit from them. Yet no matter how much misinformation they sell, and no matter how infamous the opinions they spread, they deserve to have a voice on Facebook, according to the social network itself.

Yes, they have the right to a voice, but only as long as they publish their messages within the rules of the Facebook community. And, apparently, only when the social network feels it has no choice but to act.

"TechCrunch" suggests that if Jones' videos had not first been removed by YouTube, a rival platform to Facebook, they would have remained on the social network. Indeed, one of the removed pieces had even been approved as acceptable by the site's content moderation system, something now classified as an error, and one that might never have been rectified without the action of Google's video platform.

In the end, Facebook cannot convincingly deny that its confusion over such an important announcement raises doubts about how it organizes its teams when it comes to moderating a video.
