A former content moderator who worked under contract for Facebook has filed a lawsuit against the company, saying that being bombarded with thousands of violent images on her computer in Silicon Valley led her to develop post-traumatic stress disorder.
The former moderator, Selena Scola, says Facebook failed to protect her and other contractors as they viewed disturbing videos and photos of rape, suicide, beheadings and other killings, according to the complaint filed on Friday in San Mateo County Superior Court.
Ms. Scola, who worked for the company for nine months, said in her complaint that her post-traumatic stress disorder is triggered "when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises or is startled."
According to the lawsuit, 7,500 Facebook moderators around the world sift through 10 million potentially rule-breaking posts.
The company relies on its two billion users to report inappropriate content. Moderators then apply the hundreds of rules Facebook has developed to determine whether the content violates its policies.
"We recognize that this work can often be difficult," said Bertie Thomson, director of corporate communications at Facebook, in a statement. "That's why we take very seriously the assistance of our content moderators, starting with their training, the benefits they receive and ensuring that every person who reviews the content of Facebook is offered psychological support and wellness resources.
Ms. Scola is asking Facebook to establish a fund for a screening and treatment program through which current and former content moderators, including those employed by third parties, can receive medical testing and monitoring, including psychiatric treatment. She is also asking that Facebook pay her legal fees.
Ms. Scola's lawyers said their client was not currently giving interviews.
"What can cause PTSD has been the subject of countless articles and speculation long before it was an official diagnosis," said Dr. Elspeth Cameron Ritchie, retired psychiatrist and colonel, former advisor the Pentagon on mental health issues. "In the vast majority of people, just seeing violent images is not enough, but in some people it could be."
"People who operate drones and watch them explode are suffering from PTSD even if they are not in the same room," she added. "I'm not saying it could not happen, but we do not see it much."
Facebook employees receive in-house psychological support, according to Ms. Thomson.
"We also require that companies with whom we collaborate to review the content provide resources and psychological support, including on-site counseling, available at the applicant's place of work and other resources. well-being Thomson said in a statement.
In May 2017, Mark Zuckerberg, Facebook's chief executive, acknowledged that violent content was a problem for the company. He pledged to hire 3,000 more people to monitor more closely what was posted on Facebook, and said the company was building more tools to simplify how users report content. "If we're going to build a safe community, we need to respond quickly," he said.
Some of the violent acts posted or streamed on Facebook include:
• The murder in April of an 11-month-old girl in Thailand by her father, which he broadcast live before hanging himself
• The suicide of a 14-year-old who was living in a foster home in Florida
• The fatal shooting of Philando Castile by a Minnesota police officer
Many content moderators have publicly discussed the difficulty of their work.
"You go to work every morning at 9 am, turn on your computer and watch someone get their heads cut off," The Guardian told a man last year who chose to remain anonymous. "Every day, every minute is what you see. The heads are cut.