Facebook founder and CEO Mark Zuckerberg speaks to attendees at the Viva Technology trade show at the Porte de Versailles exhibition center on May 24, 2018 in Paris, France.
Chesnot | Getty Images
In a new study describing how hate spreads online, researchers examined how hate groups thrive on social media even when they are banned, and proposed new strategies for dismantling them.
The study, published Wednesday in the journal Nature, describes a phenomenon that lawmakers and social media companies are struggling to understand and contain. Shortly after a gunman massacred more than 50 people at two mosques in Christchurch, New Zealand, copies of his first-person video multiplied across social media, appearing on Facebook, Twitter, YouTube and Reddit. Extremist groups are also known to migrate from mainstream platforms to sites with laxer moderation standards, such as 8chan, when their content is removed.
The researchers, a multidisciplinary team from George Washington University and the University of Miami, identified what they called "hate clusters" on Facebook and its Russia-based counterpart, VKontakte. They then traced the links between those clusters and adjacent hate clusters to which their members were explicitly connected. The researchers chose to focus on right-wing hate, noting that it is globally prevalent and tied to recent real-world violence, but said the method can be applied to any type of hate group.
Researchers led by Neil Johnson, professor of physics at George Washington University, have mapped how "hate clusters" move across platforms and regions.
Courtesy of Neil Johnson
The study found that hate clusters often regenerate and spread across platforms and around the world, even after they are banned.
For example, after Facebook banned the KKK, nearly 60 KKK clusters continued to exist on VKontakte, according to the researchers. And after the Ukrainian government banned VKontakte, the clusters were "reincarnated" on Facebook with "KuKluxKlan" written in Cyrillic, which, according to the authors, made it harder for English-language detection algorithms to catch them.
The researchers proposed a series of actions that social media companies could take to dismantle hate groups. The team is now working independently with one unidentified social media network on the problem, according to Neil Johnson, professor of physics at George Washington University and the study's lead author. The researchers are also developing software that governments and regulators could use to identify hate clusters.
The proposed solutions aim to eliminate the weakest players in the ecosystem and undermine hate clusters from within. Instead of attacking a highly vocal and powerful player, Johnson and his team suggest that social media platforms remove smaller clusters and randomly delete individual members. The researchers found that removing as little as 10 percent of a hate cluster's members could cause it to collapse.
"The bigger ones have power, they have people, they have money, so they will turn around and sue if they are attacked," Johnson said. "So, take care of the little ones, because they are the ones who will integrate in a few years the greatest ones."
The team also proposed organically engaging members of hate groups and exploiting ideological divisions between adjacent clusters. Two white supremacist groups they studied were, for example, deeply divided over whether Europe should be unified.
Johnson said his background in physics led him to take a holistic approach to the question of how hate spreads. He compared the online spread of hate, and the way it produces violent actors, to the process of boiling water.
"All molecules are equally good and bad," he said. "Bubbles create energy locally and some create molecules that have a lot more energy than others locally, and these are the ones that pop up above the water. is exactly what is happening. "