Facebook: Zuckerberg opposes removing posts denying the Shoah, and the controversy swells


In the midst of a content-moderation crisis, with false information spreading on Facebook and even leading to riots and violence in some countries, founder Mark Zuckerberg announced that he would not necessarily censor denialist messages.

Facebook founder and chief executive Mark Zuckerberg was at the center of a controversy on Thursday after announcing that the social network would not censor messages denying the existence of the Holocaust.

In an interview given Wednesday to the specialized site Recode, he said that Facebook would remove certain "fake news" likely to lead to violent acts, while stressing that he did not intend to censor remarks that, according to him, were held "sincerely".

"I am Jewish and there are people who deny the existence of the Holoc Auste. I find that very shocking. But at the end of the day, I do not think our platform should remove that kind of comment because I think there are some things that some people are wrong about. ".

In the face of a wave of criticism, Mark Zuckerberg then sent an email to Recode to clarify his statements.

"Of course, if a message crossed the red line advocating violence or hatred against a particular group, it would be removed " he wrote.

Rumors on Facebook at the root of violence

For several weeks, Facebook has been strongly criticized for letting articles, images or videos spread which, without containing a direct call to hatred, can be seen as an encouragement to violent acts.

The social network has been accused of spreading rumors that caused clashes, particularly in Burma and Sri Lanka. In Sri Lanka, the authorities even blocked access to the site in April, believing that it encouraged inter-religious violence.

Facebook promises to remove false content and photos

Shortly before the controversy, Facebook had announced that it would remove false information posted on the network and likely to lead to imminent violence.

"We are starting to implement this new policy in countries where we see examples where misinformation has […] resulted in violence " Tessa Lyons told Facebook, citing the case of Sri Lanka.

For example, the social network may remove inaccurate or misleading content, such as fake photos, created or shared to contribute to or exacerbate physical violence.

Facebook will rely on local organizations or specialized agencies to determine whether these publications are likely to result in imminent violence and therefore need to be withdrawn.
