UK announces intention to hold social media managers accountable for harmful content




LONDON – The UK government announced on Monday that it was considering holding social media managers personally liable for harmful content posted on their platforms, as part of a series of new proposals on online safety.

The plans, unveiled in a policy document that also includes the creation of an independent regulator, aim to combat all kinds of harmful content, ranging from the promotion of violence and suicide to the spread of misinformation and cyberbullying.

The issue has become increasingly urgent after Facebook's failure to swiftly stop the spread of footage of the March 15 attack by a self-proclaimed white supremacist on two mosques in New Zealand that killed 50 people.

Prime Minister Theresa May warned technology companies that they "have not done enough" to protect users and said her government intended to impose a legal "duty of care" on companies to keep people safe.

"For too long, these companies have not done enough to protect users, especially children and young people, from harmful content," she said in a statement.

"That is not good enough, and it is time to do things differently.

"Online businesses must begin to take responsibility for their platforms and help restore public confidence in this technology."

The proposed new laws will apply to any business that allows users to share or discover user-generated content or interact online.

This will include file hosting sites and discussion boards, as well as the most popular social media platforms, messaging services, and search engines.

Companies risk being severely punished if they do not meet the standards.

"We are conducting consultations on the power to impose substantial fines, block access to sites and possibly impose liability on individual members of management," the government said.

Under the proposals, a new regulator would have the power to require platforms and others to publish annual transparency reports.

These would detail the levels of harmful content on their sites and how they are addressing the problem.

The regulator will also be able to issue codes of practice that may compel companies to meet certain requirements, such as recruiting fact checkers, particularly during election periods.

"The era of self-regulation for online companies is over," said Digital Secretary Jeremy Wright, adding that the sector needed to "be part of the solution".

"Those that fail to do this will face tough action," he promised.

Proponents of stricter regulation of social media have welcomed the proposals.

"For too long social networks have failed to prioritize children's safety and left them exposed to grooming, abuse and harmful content," said Peter Wanless, head of the National Society for the Prevention of Cruelty to Children.

"It is high time they were forced to act under this legally binding obligation to protect children, with heavy penalties if they do not."
