Facebook has struggled with its role in spreading fake news and misinformation for years, but a wave of mob violence in India, Sri Lanka, and Myanmar has pushed the social network into a reactive policy shift.
Until now, Facebook has dealt with misinformation by demoting it in people's news feeds. This week, the company announced that it would begin removing inaccurate or misleading content created or shared "for the purpose of contributing to or exacerbating violence or physical harm."
At first glance, this seems a sensible policy. But even the slightest scrutiny reveals an incredibly complex and thankless task.
Moreover, any success will be undermined by the fact that much of the incendiary misinformation in Southeast Asia spreads over Facebook's sister platform WhatsApp, where encryption prevents the company from seeing the content at all.
The policy change will first be implemented in Sri Lanka, where pernicious lies spread on the platform, such as the claim that Muslims were putting sterilization pills into food meant for the Sinhalese majority, have fueled riots and the destruction of mosques and Muslim-owned businesses. The Sri Lankan government temporarily blocked Facebook's services in March in an attempt to defuse the situation.
Facebook said it was working with local civil society groups to identify content that could contribute to physical harm. Once the company has verified that the information is false and could contribute to "imminent" violence or threats to physical safety, Facebook will remove it.
Last month, the company said it was removing content falsely claiming that food was being poisoned. The company would not reveal exactly what content it had deleted, nor the names of the civil society groups it was working with. A representative of the Centre for Policy Alternatives, one of Sri Lanka's most vocal civil society groups, said: "This is not something we have been told about."
The policy announcement appears to have been timed to provide "news" for the dozens of non-US reporters Facebook had flown in from Europe, Asia, and Latin America for a press event at the company's Menlo Park headquarters on Wednesday.
This could explain why Facebook representatives were unable to answer questions about the specifics of the policy, which the company plans to roll out over the next few months. What is the threshold of violence? A punch? Arson? A lynching? Will Facebook retroactively remove hoaxes like Pizzagate, which bubbled up for months before someone fired a gun in a pizzeria? Will Facebook consult civil society groups on all sides of a conflict? If so, what will it do when there is no consensus?
Beyond the practicalities of implementation, one of the most glaring challenges is how Facebook will know whether its actions actually mitigate violence. "You will never know the harm you avoid," said Joan Donovan of Data & Society. "It's an immeasurable, unmeasurable victory."
Still, even in its unformed state, the update was welcomed by those who have grown frustrated with Facebook's inaction in the name of free speech. "The past six months have been a wake-up call in terms of seeing how rumors spread into real violence," said Claire Wardle, a researcher at Harvard's Shorenstein Center on Media, Politics and Public Policy who specializes in the spread of misinformation.
"That's not to say it's going to be easy. I hope they work closely with local civil society groups and that they hire moderation staff who speak the local languages," she added.
Civil society groups can identify content that may incite violence, but they lack access to Facebook's social graph.
"Facebook knows the ways in which this misinformation moves," Donovan said. "It knows there are hubs and spokes. It has to use its own data, and invest in groups that can help it understand the context of that data, to spot actual manipulation and thwart those accounts."
Even if Facebook suppresses misinformation on its main platform, its hands are tied with WhatsApp, where many of the most dangerous rumors in Southeast Asia spread. WhatsApp messages are end-to-end encrypted, which means Facebook cannot see or moderate their content.
In places like Sri Lanka and India, people use the messaging app differently from the typical American user. They join groups of 100 or more participants (WhatsApp caps groups at 256 members), which are used to broadcast and discuss local issues – and misinformation.
"The networks are like a honeycomb, with political operatives with 10-20 phones networked into all the different groups. They will light a fire in each of those groups, and then other political actors will forward the message to still other groups," Donovan said.
Pankaj Jain, who runs SM Hoax Slayer, an Indian fact-checking website dedicated to debunking fake news online, said WhatsApp was "obviously" worse than Facebook for spreading misinformation. He said this was partly because the messaging app was so easy to use and had the widest reach in rural communities, and partly because data charges were so low.
His third reason involves a core privacy feature. "People who create and distribute fake news know they can't be traced, so WhatsApp is their first choice for fake news."
Harssh Poddar, a senior police officer in the city of Malegaon in the Indian state of Maharashtra who has dealt with mob violence triggered by unfounded fears of child abductions, agreed. But he noted that Facebook's new policy could help by removing some of the same videos and sources that are shared via WhatsApp, nipping them in the bud.
Such removals would make a difference insofar as they might quell some of the rumors, he said. Poddar, who has run his own media literacy training in Malegaon, would like Facebook to be more responsive to requests from local law enforcement to help identify the sources from which doctored videos and fake news spread. WhatsApp spokesperson Carl Woog, speaking during the Facebook media event, called the violence in Sri Lanka and India "horrific".
"It was really terrible to watch, and it broke our hearts," he said. The company has run advertisements in the Indian media and worked with civil society groups to organize training sessions on misinformation and digital literacy. The app has also introduced a new label for forwarded messages, highlighting that a piece of content was not originally composed by the sender.
"We take this very seriously," said Woog. "But there are limits."
Without access to the content of WhatsApp messages, Facebook must rely on metadata, including phone IDs, IP addresses, and how messages circulate among members of different groups.
"We will see a lot of moves [from Facebook]," Wardle said, "as they've realized that WhatsApp is their Achilles heel."