Content moderation doesn't have to be so traumatic




Lasting trauma doesn't have to be part of the job description for workers who view violent images or videos as part of their work. Content moderators, journalists, and activists often have to sift through horrific images, videos, and text to do their jobs, a task that can threaten their mental health and lead to post-traumatic stress disorder. Fortunately, experts say there are ways to minimize the harm of so-called vicarious or secondary trauma.

Post-traumatic stress disorder (PTSD) is caused by living through or witnessing a terrifying event. Symptoms include acute anxiety, flashbacks, and intrusive thoughts. Although most people think of PTSD in the context of war or physical involvement in a crisis, there has been growing acceptance in recent years that viewing traumatic events can also cause the condition. Pam Ramsden, a psychologist at the University of Bradford in the UK, presented a study at a 2015 conference of the British Psychological Society showing that a quarter of people who viewed distressing images of violent events developed symptoms of PTSD.

Similarly, a study by Roxane Cohen Silver, a psychologist at the University of California, Irvine, showed that more than six hours of media exposure to the Boston Marathon bombings (where exposure could come through any form of media) during the four weeks following the attack was linked to more stress than actually having been there. Even the latest version of the Diagnostic and Statistical Manual, the bible of psychiatric diagnosis in the United States, acknowledges that PTSD can occur when viewing graphic images is a requirement of the job. For example, Facebook content moderators working for a contractor in Arizona face serious mental health problems related to constantly viewing graphic content, as an investigation by The Verge revealed this week.

This doesn't mean that everyone who sees these images will be traumatized. Some people sift through traumatic content without being affected. "Not even one hundred percent of people who go to war develop PTSD, so there are differential risk factors," says Cohen Silver. "Certainly, there are ways to mitigate stress, and to take breaks and not look at something eight hours a day without a break."

While research in this area is sparse, the Dart Center, which supports journalists who cover violence, has created two tip sheets on best practices for handling traumatic imagery. Although some tips can be implemented by moderators themselves – for example, shrinking the image window, taking notes to minimize the need to repeatedly replay a sequence, and keeping "distraction files" of cute puppies to look at – many are only in the power of managers to implement.

"There must be plenty of opportunities to take breaks and confuse tasks, and offices where people can focus on something beautiful," says Elana Newman, research director at Dart Center and psychologist at University of Tulsa. "They need to regularly monitor their staff for mental health issues and provide these services." Newman and other experts agree that it ultimately falls to the company itself to even make these changes to protect workers.

Sam Dubberley manages Amnesty International's Digital Verification Corps, a group of volunteers who verify whether digital images are genuine and who therefore frequently view traumatic footage. (Dubberley has also worked with the Dart Center.) "I am a strong believer that change has to come from the top," he says. Dubberley has conducted his own research into what he calls the "drip drip" of constant misery that comes from looking at graphic images online and, through interviews with people sitting on these "digital frontlines," created a report with further suggestions for people at all levels of an organization.

According to his report, a healthier environment could mean that moderators use mindfulness tools, frequently check in on their own pace of work, and learn to recognize which images they feel confident handling. Perhaps most importantly, it means integrating trauma awareness training for everyone: telling everyone being hired that disturbing graphic content will be part of the job, and remembering that traumatic triggers differ from person to person. It also means developing a culture in which mental health is treated as seriously as physical health, with leaders checking in with staff, individually and in groups, about how they are doing.

Although efforts by journalists and human rights organizations to prevent secondary trauma are increasing, the situation is more complicated when it comes to content moderation. Sarah T. Roberts, a professor of information studies at UCLA, says awareness of these issues has grown over the past two years and that social media companies are conscious of them. For example, YouTube CEO Susan Wojcicki told SXSW last year that the company would limit moderators to four hours a day of viewing disturbing content.

At the same time, these companies respond to every form of controversy – whether it's "fake news" or problematic ads – by promising to hire more moderators, pushing more people into these positions even though we still don't have detailed research on how to protect them. The resources created by the Dart Center and Dubberley are general guidelines drawn from interviews, not the product of rigorous longitudinal studies of content moderation and secondary trauma. We don't have those studies because conducting them would require access that these companies are unlikely to grant.

"My immediate answer to [the YouTube news] "It's okay, that means you'll have to double your squad," and we have no evidence that four hours a day is the magic number, "says Roberts. "How do we know that four hours a day are achievable and that four and a half hours lead to a psychological crisis?"

Finally, the business model of content moderation makes it difficult to implement even common-sense suggestions. Because the commercial content moderation ecosystem is global and often contract-based, firms are often afraid of losing a contract to a competitor that can prove it is more efficient. "There are these countervailing forces," says Roberts. "On the one hand, we want workers' well-being and resilience, but on the other hand, performance is measured through productivity. When a person takes a 'wellness break,' she isn't engaging in those productive activities."

Ultimately, change needs to happen on many levels – not just distraction files or artificial intelligence, but also shifts in organizational culture and a closer look at business models. Facebook has committed to improving oversight of its outsourcing companies, but protecting content moderators will require an effort from the entire industry.
