When I learned that the New Zealand massacre had been broadcast live on Facebook, I immediately thought of Robert Godwin Sr. In 2017, Godwin was murdered in Cleveland, Ohio, and video of the killing was initially reported to have been shared through Facebook Live, at the time a relatively new feature of the social network. Facebook later clarified that the graphic video had been uploaded after the event, but the incident drew public attention to the risks of live-streamed violence.
After Godwin's murder, I recommended delaying broadcasts on Facebook Live, at least for users who had told the company they were under 18. That would allow adult users to report inappropriate content before children were exposed to it. Facebook Live has carried footage of murders, as well as other serious crimes such as sexual assault, torture and child abuse. Although the company has hired over 3,000 additional human content moderators, Facebook is no better at preventing appalling violence from streaming live online without any filters or warnings for users.
In the 24 hours after the New Zealand massacre, 1.5 million videos and images of the killings were uploaded to Facebook's servers, the company said. Facebook emphasized that 1.2 million of them "were blocked at upload." But as a researcher and educator on social media, what I heard was that 300,000 videos and images of a mass murder got through its automated systems and were visible on the platform.
The company recently released analytics, noting that fewer than 200 people watched the massacre live and that, remarkably, no user reported it to Facebook until after the massacre had ended. These details make very clear how heavily Facebook relies on users to report harmful content. They also suggest that people do not know how to report inappropriate content, or do not trust that the company will act on the complaint.
The video that remained after the live stream ended was viewed nearly 4,000 times, a figure that does not include copies uploaded to other sites, or to Facebook by other users. It is not known how many of those viewers were minors; children as young as 13 are allowed to create a Facebook account and could have encountered unfiltered footage of deadly hate. It is past time for the company to make good on the promise its founder and CEO, Mark Zuckerberg, made two years ago after Godwin's murder: "We will do everything in our power to prevent such tragedies."
A simple delay
In the television industry, broadcast delays of a few seconds are typical during live events. That time allows a moderator to review the content and confirm it is suitable for a broad audience.
Facebook relies on users as its moderators, and a live stream may not initially reach a wide audience the way television does. So its delay should be longer, perhaps a few minutes. Only then would enough adult users have seen the stream and had the opportunity to report its contents. Key users, including publishers and companies, could be allowed to broadcast live immediately after completing a training course, and Facebook could even let people request that a company moderator review future live streams.
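The delay-and-report mechanism described above can be sketched in a few lines of code. This is a minimal illustration of the general idea, not anything resembling Facebook's actual systems; the class name, the thresholds and the frame handling are all invented for the example: frames are held in a buffer for a fixed delay, and if enough viewer reports arrive while content is still inside that window, the stream is blocked before the flagged material ever airs.

```python
from collections import deque
import time


class DelayedStream:
    """Sketch of a broadcast-delay buffer: frames are held for
    `delay_seconds` before release, giving viewers and moderators
    time to flag the stream. All names here are illustrative."""

    def __init__(self, delay_seconds=120.0, report_threshold=3):
        self.delay = delay_seconds
        self.report_threshold = report_threshold
        self.reports = 0
        self.blocked = False
        self._buffer = deque()  # (arrival_time, frame) pairs

    def ingest(self, frame, now=None):
        """Called as frames arrive from the broadcaster."""
        now = time.monotonic() if now is None else now
        if not self.blocked:
            self._buffer.append((now, frame))

    def report(self):
        """Called when a viewer flags the stream; enough reports block it."""
        self.reports += 1
        if self.reports >= self.report_threshold:
            self.blocked = True
            self._buffer.clear()  # drop everything still inside the delay window

    def release(self, now=None):
        """Return the frames whose delay has elapsed and that may air."""
        now = time.monotonic() if now is None else now
        out = []
        while self._buffer and now - self._buffer[0][0] >= self.delay:
            out.append(self._buffer.popleft()[1])
        return out
```

The key design point is that reports act *inside* the delay window: content a viewer flags is discarded before it is ever released, which is exactly what a post-hoc takedown cannot do.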
Facebook has not yet taken even this relatively simple step, and the reason is clear: broadcast delays exist on television only because regulators penalized broadcasters for airing inappropriate content during live broadcasts. No comparable regulation exists for social media companies; they change only to boost profits or to quiet public outcry.
The question of whether and how to regulate social media is a political one, and many US politicians have developed deep ties to platforms such as Facebook. Some have used social media to collect donations, target supporters with advertising and help get themselves elected. Once in office, they continue to use social media to communicate with their supporters in hopes of being re-elected.
Federal agencies also use social media to communicate with the public and to shape opinion, at times even in violation of US law. In my view, Facebook's role as a tool for gaining, keeping and wielding political power considerably reduces the chances that politicians will rein it in.
US regulation is not coming soon
Congress has not yet taken any significant steps to regulate social media companies. Despite strong statements from politicians, and even calls for hearings in response to the New Zealand attack, US regulators are unlikely to lead the way.
EU officials are doing much of the work, particularly on privacy. New Zealand's government has also acted, banning the video of the mosque massacre, which means anyone who shares it could face a fine of NZ$10,000 and 14 years in prison. At least two people have already been arrested for sharing it online.
Facebook could – and should – act now
Much of the discussion of social media regulation has focused on antitrust and monopoly law as a way of forcing tech giants like Facebook to split into smaller companies. Even if that happened, it would be very difficult and slow: breaking up AT&T took a decade, from the 1974 lawsuit to the 1984 launch of the "Baby Bell" companies.
In the meantime, there will be many more dangerous and violent incidents that people will try to broadcast live. Facebook should evaluate its products' potential for misuse and shut them down when their effects are harmful to society.
No child should ever see the kind of "raw and visceral content" that has aired on Facebook Live, including mass murders. Nor should adult users have to witness such heinous acts, as research has shown that viewing graphic violence carries health risks, including post-traumatic stress.
That is why I am no longer recommending merely a live-streaming delay for teens; that was a call to protect children at a time when major changes to the platform seemed unlikely. But everyone deserves better, safer social media. I now call on Mark Zuckerberg to shut down Facebook Live in the interest of public health and safety. In my opinion, the feature should be restored only if the company can prove to the public, and to regulators, that its design is safer.
Safely managing live streaming means having more than enough professional content moderators to handle the workload. Those workers must also have access to appropriate mental health support and safe working environments, so that neither Facebook employees nor Facebook's contractors are unduly scarred by the brutal violence posted online.
This article first appeared on The Conversation.