A California woman sued Facebook on Friday for "exposing her to highly toxic, dangerous and harmful content during her job as a content moderator on Facebook."
Selena Scola was a content moderator at Facebook's headquarters in Menlo Park, California, from June 2017 to March of this year, according to the lawsuit. She worked for a contractor called Pro Unlimited, Inc., which helps Facebook remove content that violates its community standards. Facebook has about 7,500 content moderators around the world, tasked with removing hate speech, graphic violence, harmful images and video, nudity and sexual content, bullying, and a host of other content violating its policies.
Scola's attorneys claim that she developed post-traumatic stress disorder as a result of "constant and unconditional exposure to highly toxic and extremely disturbing images in the workplace" and allege that Facebook failed to provide a safe workplace. The case has been filed as a class action, but for the moment Scola is the only named plaintiff; the lawsuit refers to a potential class of "thousands" of current and former moderators in California.
The lawsuit does not currently include specific details about Scola's work, relying instead on press reports about how content moderation operates; Scola's lawyers told Motherboard that she would detail them in court proceedings. "This complaint does not include these [specifics] because Ms. Scola fears that Facebook may retaliate against her using an alleged non-disclosure agreement."
Content moderation is hard work. Several documentaries, investigations, and legal articles have noted that moderators work long hours, are exposed to disturbing and graphic content, and face the difficult task of determining whether specific content violates constantly evolving rules. Facebook prides itself on accuracy, and with more than 2 billion users, Facebook's moderators are asked to review millions of potentially infringing posts each day.
"An outsider may not fully understand. We aren't just exposed to graphic videos, you'll have to watch them closely, often repeatedly, for specific policy signifiers," said one moderation source. "Someone could be beaten in a video, and you might have to watch it a dozen times, sometimes with other people, while deciding whether or not the victim's actions would count as self-defense, or whether the aggressor is the same person."
The source said they were "not at all surprised" that Facebook is now facing a lawsuit. "It's something that some colleagues and former coworkers talk about."
Another Facebook moderation source told Motherboard: "I'm not surprised."
The lawsuit alleges that "Facebook is not providing its content moderators with sufficient training or implementing the safety standards it has helped to develop … Ms. Scola's PTSD symptoms may be triggered by loud noises or when she is startled. Her symptoms are also triggered when she recalls or describes graphic images to which she was exposed as a content moderator."
Facebook did not immediately provide Motherboard with comment on the details of the lawsuit. However, earlier this year, when we visited Facebook's headquarters, several high-level employees told us that the company was striving to make the work less stressful and potentially traumatic for its moderators. The company has put specific training protocols for content moderators in place, even though the lawsuit alleges that they are inadequate.
"This work is not for everyone, frankly, and we recognize that," Brian Doegan, Facebook's global training director for community operations, said in June. He said that new recruits are gradually exposed to graphic content, "not to expose you drastically all at once, but rather to have a conversation about what it is and what we are going to see."
Doegan said there are rooms in every office designed to help employees relax.
"What I admire is that at any point in this role, you have access to counselors, you have access to conversations with other people," he said. "There are actual physical spaces you can go into, if you just want to decompress, or if you want to play a game, or if you just want to, you know, be alone. That support system is quite robust, and it is consistent at all levels."
Carolyn Glanville, a spokeswoman for Facebook, told Motherboard in June that each office and content-moderation contractor provides mental health services, but that the types of services offered vary according to local cultural norms.
"While [in some countries] it's fine to just walk through the lobby to see a counselor and no one cares, in other cultures they don't want that, so they might do it after hours, where others may not know about it," she said.
The Scola lawsuit asks the court to create a "Facebook-funded medical monitoring program to facilitate the diagnosis and treatment of the plaintiff and the class for psychological trauma, including, but not limited to, PTSD."
A California judge will now decide whether the case has enough merit to move forward.