Why couldn't Facebook stop the livestream of the attack in New Zealand? | Chronic

Social networks allow their users to share pleasant, funny or emotional moments of their lives in real time. Sometimes those broadcasts involuntarily capture criminal acts. At other times, they are used deliberately by people who seek to do harm and whose actions affect millions around the world.

This was the case with Brenton Tarrant, the 28-year-old Australian who broadcast live the massacre he committed at the Christchurch mosque in New Zealand, an attack that put its load of hatred and xenophobia on display in real time. The broadcast lasted 17 minutes, and no filter of any kind blocked its spread across Mark Zuckerberg's social network.

During that span, as if it were a Counter-Strike game, the gunman can be seen shooting and killing people. He carries several weapons, including a shotgun and a semi-automatic rifle. What is most striking is that nobody cut off this raw live footage.

Facebook's security policies are strict, but they have flaws. Every day, photos, videos, profiles and groups flagged by other users or by an algorithm based on artificial intelligence are analyzed and filtered by Facebook.

Afterwards, a team of 20,000 moderators is responsible for ensuring compliance with the safety standards. About 7,500 of them are content reviewers who assess all the posts reported by users, in more than 50 languages, 24 hours a day.
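To picture how a pipeline of that shape is typically structured, here is a minimal sketch in Python, assuming a hypothetical two-stage design: an automated classifier scores each item first, and reported or borderline items land in a human review queue. The names, thresholds and decisions below are illustrative assumptions, not a description of Facebook's actual systems.

# Hypothetical sketch of a two-stage moderation pipeline; not Facebook's real system.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Post:
    post_id: str
    kind: str                  # e.g. "photo", "video", "live_stream"
    reported_by_users: bool = False
    risk_score: float = 0.0    # filled in by the automated stage


def automated_filter(post: Post, model_score: float, block_threshold: float = 0.9) -> str:
    """Stage 1: a model assigns a risk score; only high-confidence cases are blocked outright."""
    post.risk_score = model_score
    if model_score >= block_threshold:
        return "blocked"        # removed automatically
    if model_score >= 0.5 or post.reported_by_users:
        return "needs_review"   # escalated to human moderators
    return "allowed"


@dataclass
class ReviewQueue:
    """Stage 2: escalated or reported posts wait for a human reviewer."""
    pending: List[Post] = field(default_factory=list)

    def enqueue(self, post: Post) -> None:
        self.pending.append(post)

    def review_next(self) -> str:
        post = self.pending.pop(0)
        # A human applies the safety standards; here the decision is simulated.
        return "removed" if post.risk_score >= 0.5 else "kept"


if __name__ == "__main__":
    queue = ReviewQueue()
    live = Post("p1", "live_stream", reported_by_users=True)
    verdict = automated_filter(live, model_score=0.4)  # the model is not confident
    if verdict == "needs_review":
        queue.enqueue(live)
    print(verdict, queue.review_next())                # review happens after the fact

Even in this toy version, the gap the article points to is visible: review of escalated items happens asynchronously, so a live broadcast can run its full length before any human looks at it.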

However, the massacre at the Christchurch mosque got past all of these barriers. By the time the problem was reported by the New Zealand police, it was already too late: the video had been seen by Tarrant's supporters, who had downloaded it from the apps so they could play it again.

Not only was the video removed, but so were the terrorist's accounts on Facebook and Instagram. The same happened on YouTube and Twitter. "We are also removing any praise or support for the crime and the shooter or shooters as soon as we become aware of it, and we will continue to work directly with the New Zealand Police as their response and investigation continue," said Facebook's spokesperson for Australia and New Zealand, Mia Garlick, in a statement.

For its part, Twitter announced that it had suspended an account related to the shooting and was working to remove the video from its platform, according to statements a company spokesperson made to CNN. YouTube, owned by Google, removed the "shocking, violent and graphic content" from its video platform as soon as it was informed.

"We eliminate any praise or support for the crime and the shooter or probably the shooters"

It is clear that the controls imposed by social networks, at least for now, barely manage to stop the spread of images and videos that incite violence, contain explicit sexual content or carry abusive messages. The challenge will be to bring live broadcasts under control on Facebook and YouTube, as well as on the other platforms where this feature is available.
