Facebook sued by former content moderator for psychological trauma, PTSD




In this March 29, 2018 file photo, the Facebook logo appears on screens at the Nasdaq MarketSite in Times Square, New York. AP / Richard Drew

A former Facebook content moderator has sued the social media company, claiming that the harmful nature of her work caused her psychological trauma and post-traumatic stress disorder (PTSD).

Selena Scola, of San Francisco, California, was employed by Pro Unlimited Inc. and worked as a public content contractor for Facebook from June 2017 to March 2018.

According to the complaint filed in California Superior Court in San Mateo County, Scola witnessed "thousands of acts of extreme and graphic violence" from her workstation in Facebook's Silicon Valley offices. These included images, videos and live broadcasts of rape, murder, torture and the sexual abuse of children.

As Facebook users upload millions of images and videos every day, Scola's job was to "maintain a clean platform" by reviewing posts and removing those that violate Facebook's terms of use. Content moderators are reportedly "asked to review more than 10 million potentially rule-breaking posts per week."

According to the complaint, Facebook has ignored workplace safety standards, despite having helped draft such rules to protect content moderators like Scola from workplace trauma.

"Instead, the multibillion dollar company is demanding that its content moderators work in conditions known to cause and exacerbate psychological trauma," he said. "By demanding that its content moderators work in dangerous conditions that cause debilitating physical and psychological damage, Facebook is violating California law."

Filing on her own behalf and on behalf of other content moderators like her, Scola is seeking to "stop these illegal and unsafe practices" and to ensure that Facebook and Pro Unlimited provide content moderators with ongoing, on-site psychological treatment and support.

In early July, Facebook responded to concerns about the people who review objectionable content on its site. In a statement, the company acknowledged that reviewing such a large volume of content is not easy, as it had never been done before.

However, Facebook wrote that its teams working on safety and security had "doubled in size this year to 20,000 people."

Facebook added that it has a team of four clinical psychologists responsible for designing and delivering resiliency programs for content moderators, and that trained professionals are also available on site for individual and group counseling.

