The report highlights another case in which Facebook is struggling to contain the rapid spread of hatred and misinformation on its platforms – and the impact of those failures on society.
"Facebook has become a means for those seeking to spread hate and cause harm," the report said, noting that posts on the platform have been linked to violence in Myanmar.
BSR gave a long list of recommendations for Facebook to implement. The nonprofit warned that Myanmar's 2020 general elections "will likely be a flashpoint for hate speech, harassment, misinformation, incitement to violence, and other actions designed to undermine the political process."
"Facebook would be well served by preparing for multiple contingencies now," it adds.
Warofka acknowledged the problems with how Facebook's platform has operated in Myanmar, but he underscored BSR's conclusion that Facebook "alone cannot make the major changes needed to address the human rights situation."
The report notes that the risks of Facebook-related human rights abuses are greatly heightened by the weak rule of law and cultural context in Myanmar, and by the country's recent emergence from decades of authoritarian rule in which freedom of expression was restricted.
Can Facebook repair itself?
The company has "invested heavily in people, technology and partnerships to examine and combat abuse of Facebook in Myanmar," said Warofka, adding that it had "made progress in implementing many" of the recommendations.
The report calls on Facebook to improve enforcement of its community standards and to deepen engagement with key stakeholders in Myanmar, including the government and civil rights groups.
Facebook also needs to increase transparency by preserving and publishing Myanmar-specific data so that the local and international community can assess progress more effectively, the report says.
Chief Executive Mark Zuckerberg promised in April to do more to help combat hate speech in the country after activists accused him of turning a blind eye.
In response to an open letter from a group of technology and nonprofit organizations, Zuckerberg said the social media giant would introduce technological improvements to filter hateful content.
"We now have a special product team that works to better understand specific local challenges and create the right tools to keep people safe," Zuckerberg said at the time.