As Facebook addresses its role in Myanmar's violence, think back to the first warnings | The Facebook Dilemma | FRONTLINE | PBS



Signs are posted in front of Facebook Inc.'s headquarters in Menlo Park, California, United States, on Tuesday, October 30, 2018. David Paul Morris / Bloomberg via Getty Images

On the eve of the U.S. midterm elections, Facebook has released an external report on its human rights impact in Myanmar, where the country's Rohingya Muslim minority has been the target of brutal violence that the United Nations has described as genocide.

The UN said social media – and Facebook in particular – was a significant factor, as the platform allowed hate speech and calls for violence against the Rohingya to spread across Myanmar. Facebook has admitted that it was slow to respond to those concerns. But this report provides a clearer picture of the company's impact on the ground.

The report, written by Business for Social Responsibility (BSR), found that Facebook was "directly linked" to harm in Myanmar when people used the platform in ways that violated its community standards – for example, to incite violence, spread misinformation or promote hate speech. Although it stated that the company had not caused or contributed to human rights abuses "through its own actions," the assessment found that Facebook's platform had been "useful" to those seeking to do real harm in Myanmar, and it laid out recommendations for the company to address the problem. The report also warned that the country's 2020 elections could present new risks.

In the past year, Facebook has reportedly removed problematic accounts in Myanmar, hired more language experts and improved its policies. "We agree that we can and must do more," said Alex Warofka, a product policy manager at Facebook, in a post on Facebook's blog. Warofka described Facebook's efforts to address five areas of "continued improvement" identified by BSR. He also noted the report's conclusion that "Facebook alone cannot bring about the broad changes needed to address the human rights situation in Myanmar."

But Facebook was warned early on about how it was being used to shape events on the ground in Myanmar and other countries. Last week, in The Facebook Dilemma, FRONTLINE explored how the company responded to warnings about the platform's role in spreading misinformation and hate speech, including fueling real-world violence in Myanmar. In the following excerpt from the documentary, David Madden, a tech entrepreneur then based in Myanmar, recounts giving a presentation at Facebook's headquarters in May 2015, warning that the country's Muslim minority was the target of hate speech on the platform.

"I did the analogy with what had happened in Rwanda, where radios had played a really key role in the execution of this genocide," Madden told FRONTLINE. "That's why I said," Facebook runs the risk of being in Myanmar what radio is in Rwanda "- that this platform can be used to foment hatred and incite violence."

Madden said he received an email stating that his concerns had been shared internally and taken "very seriously." But the violence intensified. As the film reports, Madden and other local activists held another meeting with Facebook in early 2017, warning that the platform's processes for dealing with content demonizing Muslims in the country were not working.

"I think, I think, that Facebook's main response has been:" We're going to have to go away, dig deeper and come back with something concrete, "Madden told FRONTLINE." The thing was, it's n & # 39; Never came. "

Watch the two-part FRONTLINE investigation of Facebook here.
