We need to talk about the mental health of content moderators




Facebook moderators look at hundreds of examples of distressing content during their shifts. Credit: Shutterstock

Selena Scola worked as a public content contractor, or content moderator, for Facebook at its Silicon Valley offices. She left the company in March after less than a year.

In papers filed last week in California, Scola alleges that unsafe work practices led her to develop post-traumatic stress disorder (PTSD) after witnessing "thousands of acts of extreme and graphic violence".

Facebook acknowledged in a blog post published in July that moderation work is not easy. In the same post, Facebook's vice president of operations, Ellen Silver, described some of the ways the company supports its moderators: "All content reviewers, whether they work full time, as contractors or as employees of partner companies, have access to locally trained professionals for individual and group counseling."

But Scola says Facebook does not practice what it preaches. Previous reports on working conditions also suggest the support provided to moderators is not enough.

This is not the first time

Scola's court case is not the first of its kind. Microsoft has been fighting a similar case since December 2016, brought by two employees who worked on its child safety team.

In both cases, the complainants allege their employer did not provide adequate support, despite knowing the psychological dangers of the work.

Microsoft and Facebook dispute the claims.

How moderation can affect your mental health

Facebook moderators look at hundreds of examples of distressing content during each eight-hour shift.

They evaluate posts that include, but are not limited to, depictions of violent death (including suicide and murder), self-harm, assault, violence against animals, hate speech and sexual violence.

Studies in areas such as child protection, journalism and law enforcement show that repeated exposure to this type of content has serious consequences, including the development of PTSD. Workers also experience higher rates of burnout, relationship breakdown and, in some cases, suicide.

Are there any guidelines in the workplace?

Industries, including journalism, law and law enforcement, have invested a great deal of thought and money into best practices designed to protect workers.

In Australia, for example, people working in child safety opt in to the work rather than having cases assigned to them. They then undergo rigorous psychological testing to determine whether they can effectively compartmentalize the work. Once employed, they attend regular counseling sessions and are routinely rotated into other areas of investigation to limit their exposure.

The technology industry has similar guidelines. In fact, Facebook helped establish the Technology Coalition, which aims to eliminate the sexual exploitation of children online. In 2015, the coalition released its Employee Resilience Guidebook, which describes workplace health and safety measures for workers who regularly view difficult material. Although these guidelines are specific to workers who view child pornography, they are applicable to all types of distressing content.

The guidelines include "the provision of mandatory group and individual counseling sessions" with a trauma specialist, and the opportunity for moderators to opt out of viewing child pornography.

The guidelines also recommend limiting exposure to disturbing material to four hours, encouraging workers to switch to other projects for relief, and allowing workers to take time off to recover from trauma.

But it's not just about guidelines

Having available support does not necessarily mean that staff feel they can access it. Most of Facebook's moderators, including Scola, work under precarious employment conditions as external contractors employed by third-party companies.

Working under these conditions has been shown to have a negative impact on employee well-being. These workers are not only less likely to have access to support mechanisms, they also often fear they are at risk of losing their jobs. In addition, low wages can prevent employees from taking time off to recover from trauma.

Insecure work can also undermine a person's sense of control. As I mentioned earlier, moderators have little or no control over their workflow. They do not control the type of content that appears on their screens. They have little time to make decisions, often with little or no context. And they have no say in how those decisions should be made.

According to the filing and media reports on the employment conditions of Facebook moderators, employees are under tremendous pressure from the company to get through thousands of posts a day. They are also regularly audited, which adds to the stress.

Where to go from here?

Adequate workplace support for moderators is essential. Some sections of the industry already provide best-practice examples. In particular, the support given to those who work in online mental health communities, such as Beyond Blue in Australia, is exemplary and a good model.

We must also address the ongoing problem of precarious employment in an industry that asks people to put their mental health at risk every day. This requires good governance and industry representation. To that end, Australian online community professionals have recently partnered with the MEAA (Media, Entertainment & Arts Alliance) to push for better conditions for everyone working in the sector, including moderators.

As for Facebook, Scola's lawsuit is a class action. If it succeeds, Facebook may find itself compensating hundreds of moderators employed in California over the past three years. It could also set an industry-wide precedent, opening the door to claims from thousands of moderators employed across technology and media companies.

Provided by: The Conversation
