Content moderator says she developed PTSD from reviewing images posted on Facebook

SAN FRANCISCO – A former Facebook content moderator is suing the company, alleging that daily review of disturbing content caused her psychological and physical harm, according to a lawsuit filed Monday in a California superior court.

The complaint by former moderator Selena Scola, who worked at Facebook from June 2017 to March, alleges that she witnessed thousands of acts of extreme graphic violence from her workstation in Facebook's Silicon Valley offices, where she was responsible for enforcing the company's extensive rules prohibiting certain types of content on its systems.

Scola, who worked at Facebook through a third-party staffing company, developed post-traumatic stress disorder "as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace," the complaint says.

Facebook has not responded to a request for comment.

Facebook relies on thousands of moderators to determine whether posts violate its rules against violence, hate speech, child exploitation, nudity and misinformation. Many objectionable categories come with their own sub-lists of exceptions. The company employs 20,000 content moderators and other safety specialists in locations such as Dublin, Austin and the Philippines, a workforce it has expanded in response to criticism that it has not adequately addressed abuse of its services, including Russian election interference, illegal drug content and fake news.

The social network says it has developed artificial intelligence in recent years to detect problematic posts, but the technology is not yet sophisticated enough to replace the need for human review.

Facebook is under scrutiny from politicians and regulators, who have called its executives to two hearings on Capitol Hill this year and are considering new regulations that would hold companies more accountable for the content on their platforms.

The complaint also accuses Pro Unlimited, a staffing company based in Boca Raton, Florida, of violating California workplace safety standards.

Pro Unlimited did not respond to a request for comment.

The lawsuit does not detail Scola's specific experiences because she signed a nondisclosure agreement that limits what employees can say about their time on the job. Such agreements are the norm in the technology industry, and Scola fears retaliation if she violates hers, the lawsuit says. Her lawyers are considering challenging the NDA but are holding back further details until a judge weighs in.

The lawsuit notes that Facebook is one of the leading companies in an industry-wide consortium that has developed workplace safety standards for content moderation. The complaint alleges that, unlike its industry counterparts, Facebook fails to meet the very standards it helped develop.

In late 2016, two former content moderators filed a lawsuit against Microsoft, claiming that they had developed PTSD and that the company was not providing adequate psychological support.

Scola's lawsuit asks Facebook and its outsourcing companies to provide content moderators with on-site, ongoing mental health treatment and support, and to create a medical monitoring fund for current and former moderators.

Facebook has long been guarded about its moderation program. The guidelines moderators use to make decisions were kept secret until this year, when the company published portions of them. The company has declined to disclose where its moderators work, as well as details about their hiring practices, performance goals and working conditions.
