The video of the shooting at a synagogue and a Turkish restaurant included a "manifesto" containing racist and anti-Semitic comments.
"We have mobilized as quickly as possible to remove this content and we will suspend all accounts publishing or sharing images of this abominable act"said a spokesman for Twitch, pointing out that the company was taking "any act of violence" with "extreme seriousness".
Technology companies scrambled to prevent the footage from spreading, as they had in March after the attack in Christchurch, New Zealand, where the gunman live-streamed the murder of his 51 victims on Facebook.
Such attacks have led governments to press social networks to prevent the broadcast of violent acts on their platforms.
On September 23, Facebook announced additional efforts at a United Nations meeting with New Zealand Prime Minister Jacinda Ardern, who has championed the fight against online extremism.
Also last month, Amazon announced it had joined the Global Internet Forum to Counter Terrorism, an alliance formed to combat the most dangerous content on social networks.
Twitch, popular for live video game streaming, was acquired by Amazon in 2014 for $970 million and has about 15 million daily users.
The company explained that the account used by the Halle shooter was created "about two months before the streaming broadcast" and that there had been only one attempt to stream live before Wednesday's attack.
After the Christchurch massacre, Facebook and other networks highlighted the challenge of preventing the sharing of violent content, which is often altered slightly to evade detection by artificial intelligence.
"This video was not included in any recommendation or directory, but our research suggests that people coordinate and share the video through other online messaging services," said Twitch.
Facebook also recently announced efforts to work with police in London and elsewhere to obtain data and improve its detection algorithms.
Part of the difficulty is that artificial intelligence must be able to tell the difference between a real attack and a scene from a movie or video game.
"Until now, filtering algorithms did not detect live violence," said Jillian Peterson, professor of criminology at Hamline University, who suggested that social media companies could become "responsible" for their role. in the dissemination of violent content and hatred.
Research by Peterson and others suggests that potential shooters may be influenced by contagion when they see similar attacks.
"In many ways, these shootings are performances, meant for everyone to see," Peterson said.
Hans-Jakob Schindler of the Counter Extremism Project, a group that seeks to end online violence, said the latest live broadcast underscored the need for more aggressive action against social platforms.
"Online platforms need to take initiatives and prevent the manipulation of their services and, in turn, the parent companies must hold them accountable," said Schindler.
"This tragic incident demonstrates once again that an approach to self-regulation is not effective enough and unfortunately emphasizes the need for stricter regulation of the technology sector."