Ex-Facebook moderator sues Facebook over exposure to disturbing images




Facebook CEO Mark Zuckerberg in 2017

A Facebook content moderator, Selena Scola, sued the social media giant, claiming that her repeated exposure to graphic and disturbing images had given her post-traumatic stress disorder.

"In her office located in the Silicon Valley offices on Facebook, Ms. Scola has witnessed thousands of acts of extreme and graphic violence," says her trial.

The lawsuit quotes another Facebook moderator who told The Guardian last year: "You have to get to work every morning at 9am, turn on your computer, and watch someone have their head cut off." Some Facebook moderators are also exposed to child pornography.

"Scola has developed and continues to suffer from debilitating PTSD as a result of his work as a public content entrepreneur on Facebook," the lawsuit says. And, according to his lawyers, the situation goes against California workers' safety laws.

"We recognize that this work can often be difficult," said Facebook in an email. But the company argued that, contrary to the prosecution's allegations, Facebook provides extensive services to help moderators cope with the disturbing images they see at work.

This is not a problem limited to Facebook. Last year, content moderators at Microsoft filed a similar lawsuit against that company, making arguments much like those in this week's case against Facebook.

Facebook says it supports its moderators

The unpleasant reality is that someone has to review the disturbing images users submit to platforms like Facebook. As long as user-generated content platforms exist, some users will submit this kind of content, and no mainstream platform wants to expose its users to it.

But Facebook can take steps to make the job less traumatic. The company could warn prospective employees about the nature of the work, reduce the resolution of potentially graphic images, give workers shorter shifts, and let workers review less offensive content for a few hours after moderating the most extreme graphic images.

Scola's lawsuit notes that a Facebook-backed organization called the Technology Coalition has published best-practice recommendations for supporting moderators, including limiting workers' exposure time to disturbing images, offering counseling, and allowing moderators to switch to other tasks. According to Scola, Facebook has failed to implement these recommendations.

But Facebook says it has taken important steps to protect moderators.

"We take the help of our content moderators very seriously, starting with their training, the benefits they receive, and ensuring that every person who reviews Facebook's content is provided with psychological support and resources. welfare ".

Scola did not officially work for Facebook. Instead, she worked for a contractor called Pro Unlimited that managed moderators on Facebook's behalf. But Facebook says its contract with Pro Unlimited requires the contractor to "provide psychological support and resources, including onsite counseling, at the location where the plaintiff worked."

The problem with outsourcing moderation

Workers like Scola represent the hidden underbelly of major social platforms like Facebook and YouTube. The very existence of content moderation teams is awkward for companies like Facebook that like to present themselves as automated, purely neutral online services, says Sarah Roberts, a professor of information studies at UCLA.

"The existence of third human beings between users and platforms that made decisions about self-expression online from others was contrary to the premise" of platforms such as Facebook.

Companies like Facebook have developed a two-tier workforce. At the top are Facebook's official employees: engineers, designers, managers, and others who enjoy high pay, lavish benefits, and a great deal of respect and autonomy.

Content moderators rank lower in the Facebook job hierarchy. Many of these workers are not officially Facebook employees: they are instead employed by outsourcing firms, which severely limits their access to, and visibility within, Facebook's management.

And that, Roberts says, is the fundamental problem with Facebook's approach to moderation. Facebook has chosen to make content moderation low-paying, low-status work, she says. It is therefore not surprising that Facebook now faces allegations that it has not prioritized the psychological well-being of these same workers. The most important thing Facebook could do to help its moderators, she says, would be to make them formal employees so that they can enjoy the status that comes with full employment.

And this would benefit not only Facebook's thousands of moderators but also Facebook itself, because effective, efficient moderation matters for the company's long-term success.

There are many examples of Facebook facing backlash over moderation decisions. Roberts highlights a 2016 controversy in which Facebook moderators censored a famous Vietnam War photo of a naked girl fleeing a napalm attack; Facebook eventually reversed its decision. The same year, Facebook fired a team of contractors who curated its "trending" section after allegations that the team was biased against conservatives.

In a recent documentary, a British journalist went undercover as a Facebook moderator. "If you start censoring too much, people lose interest in the platform," a moderator told the reporter. "It's about making money at the end of the day."

Roberts argues that treating moderators as regular employees, with the better pay, job security, and working conditions that would come with that change in status, would allow Facebook to improve the quality of its moderation efforts. Facebook could likely hire higher-caliber workers for these jobs. Lower turnover would let Facebook invest more in their training. And bringing these workers into Facebook's official workforce would improve communication between moderators and company management, which could help executives become aware of moderation problems faster.

However, Facebook's moderation challenge stems largely from the sheer size and complexity of the task. With millions of pieces of content arriving every day, some mistakes are inevitable. Facebook has worked hard to ensure that its rules are applied consistently, but it is inherently difficult to get thousands of people to interpret a complex rulebook in exactly the same way. Raising the status of moderators should improve the quality of Facebook's moderation efforts, but it probably will not be a quick fix.
