Former content moderator sues Facebook for giving her PTSD – Quartz




A former Facebook content moderator is suing the company, alleging that exposure to countless disturbing images caused her post-traumatic stress disorder (PTSD), a condition frequently diagnosed in soldiers who have served in combat. She also alleges that Facebook does not provide adequate mental health care to its moderators, who screen some of the worst content on the internet every day.

Former moderator Selena Scola filed the lawsuit in San Mateo, California, on Friday, September 21. If the court agrees, it will become a class action, because, according to the filing, Scola's experience was "typical" of the legions of moderators Facebook employs. The complaint was first reported by Motherboard.

The complaint does not describe Scola's day-to-day work in detail because she is concerned about violating a non-disclosure agreement she signed with Facebook. Scola worked as a content moderator from June 2017 to March 2018 through Pro Unlimited, a Facebook contractor that is also named as a defendant in the lawsuit.

From the complaint, we know only that she was "exposed to thousands of images, videos, and live-streamed broadcasts of graphic violence," that she developed "debilitating PTSD," and that she continues to suffer from it today.

"Mrs. Scola's PTSD symptoms can be triggered when she touches a computer mouse, enters a cold building, observes violence on television, hears loud noises or is surprised, and her symptoms are also triggered when she starts. she recalls or describes graphic images to which she has been exposed as a content moderator.

Facebook had not commented at the time of publication.

Moderators can be at least partially shielded from the harmful effects of disturbing content, and, as the complaint notes, Facebook has helped create industry standards in this area (including limiting exposure time, distorting images in various ways, and teaching moderators coping strategies). But according to the complaint, "Facebook does not provide its content moderators with sufficient training or implement the safety standards it has helped to develop."

About 7,500 content moderators currently work on Facebook content. The company plans to increase that number as its user base grows and as it continues to face controversy over dangerous posts allowed on the platform. Every day, reviewers examine roughly a million pieces of content flagged by users or by artificial intelligence. Many of them are based abroad and are paid low wages.

Accounts from former content moderators at Facebook and other technology companies, along with investigations by undercover journalists, have shed light on the daily horrors moderators confront, including beheadings, sexual violence against children, and other graphic material.

Last year, a content moderator at Microsoft sued the company, claiming that after viewing images and videos of child abuse and murders, he suffered from severe substance abuse problems and hallucinations, and that spending time with his son triggered traumatic memories.

In the case against Facebook, Scola wants the company to establish a medical monitoring program to diagnose and treat content moderators for psychological trauma. The program would be supervised by a court and funded by Facebook.
