Selena Scola worked as a Facebook content moderator for less than nine months before developing post-traumatic stress disorder.
Scola, who was a contract employee of the social media giant, is now suing her former workplace for failing to protect her, and thousands of other moderators, from psychological trauma.
A lawsuit filed in the US state of California claims that Scola witnessed thousands of acts of graphic violence in her role, which involved removing posts that violated Facebook's terms of use.
The social media giant employs or contracts at least 7,500 moderators around the world, who review more than 10 million reports of potentially objectionable material every week, according to the lawsuit.
Scola claims that moderators are not adequately protected while viewing an endless stream of videos and images of "abuse, rape, torture, bestiality, decapitation, suicide, murder and other forms of extreme violence".
"Following a constant and total exposure to highly toxic and extremely disturbing images in the workplace, Ms. Scola has developed and is suffering from significant psychological trauma and post-traumatic stress disorder", says the lawsuit.
"Mrs. Scola's symptoms of post-traumatic stress disorder can occur when she touches a mouse, enters a cold building, watches television violence, hears loud noises, or is surprised, and her symptoms also occur when she is in the dark. she remembers or describes graphic images as a content moderator. "
The complaint also alleges that Facebook violated the law by ignoring the workplace safety standards it helped to create.
"Other technology companies have implemented these security standards, including providing moderators with strong and mandatory mental health counseling and supports; change the resolution, audio, size, and color of the devices. traumatic images and train PTSD moderators. "
"Instead, the multibillion dollar society requires its content moderators to work under conditions known to cause and exacerbate psychological trauma."
According to a blog post published earlier this year by Facebook's Vice President of Global Policy Management, Monika Bickert, the number of moderators has increased by 40% since 2017 and the company conducts a weekly audit of moderator decisions.
"Where mistakes are made, we follow up with team members to prevent them from happening again in the future," Bickert said.
READ MORE: Facebook reveals how good it is at moderating the content we see
Scola's lawyers say the psychological trauma and the cognitive and social problems moderators face are serious.
"They are ignored and the problems will only get worse – for society and for these individuals," said Steven Williams, of the law firm Joseph Saveri.
"This case is about protecting the people who protect the public. Content managers are human beings. They are not disposable. "
Korey Nelson, an attorney with co-counsel Burns Charest LLP, said they were seeking to bring the case as a class action because of "well-documented" evidence that repeated exposure to graphic images can have "profoundly negative effects".
"Facebook ignores its duty to provide a safe workplace and instead creates a revolving door of entrepreneurs who are irreparably traumatized by what they've seen on the job," Nelson said.
Scola has never been directly employed by Facebook. She was contracted through Pro Unlimited, also named in her claim.
Scola is asking that Facebook and Pro Unlimited establish a testing and medical treatment program for content moderators with PTSD.