A visit to Facebook: these people keep your timeline clean




Every day, so-called content moderators review two million messages, photos and videos on Facebook and Instagram. They look at child pornography, bullying and violent videos so that we do not have to see them in our timelines. RTL Z visited the Dutch people who do this work.

"We get used to violence," says a young woman content moderator we talk about on Facebook in Barcelona.

She and her colleagues face a daily flood of violent content. They have to judge messages, photos and videos against Facebook's house rules.

What comes in are messages that have been reported by Facebook and Instagram users, but also messages flagged by artificial intelligence.

Think of reports of bullying, nude videos, terror videos or other violent images.

The silliest things

What moderators get to see is a surprise every time. "People report the silliest things, for example that they do not like something their boyfriend has liked," says another moderator during a conversation with RTL Z and several other Dutch-speaking reporters, who were given a behind-the-scenes look at CCC for the first time.

CCC is one of the companies Facebook hires to assess everything people report. According to the social network, there is at least one such center in every time zone, so that it can intervene around the clock, anywhere in the world.

"Get used to it"

The photos, messages and videos that are reported can be very innocent. But moderators also see porn videos and images of suicidal people.

"It will not be easier to see the horrible things that are inserted in. But if you see it more often, you get used to it a bit," says the boy who heads the Dutch team.

When RTL Z arrives at the CCC building, there is nothing to suggest that work is being done here for Facebook. In one of Barcelona's tallest buildings, the company rents eight floors where a few hundred people keep Facebook, Instagram and Facebook Messenger clean.

The facade of the building shows no Facebook name or thumbs-up logo, and nothing in the lobby points to the company either. The entrance is drab, with a few colorless sofas, a nearly empty bookcase and some sad plants to liven the place up.

Image © RTL Z

"We're not talking about these places"

Upon entering, we are told clearly that we are not allowed to film, except at locations predetermined by Facebook.

"We're not talking about these places," says Facebook's Dave Geraghty, Facebook's director of global market operations. This is among other things to ensure the safety of employees, he said.

Geraghty cites the shooting at YouTube's headquarters in April as an example. "That was an attack on YouTube because someone disagreed with a decision to take a video offline. We take our security very seriously, which is why we do not disclose these locations."

CCC employees queue for the elevator.
Image © RTL Z

Figures

Only since this year has Facebook published figures on how often the company has to intervene in content on the social network. In total, 15,000 content moderators review more than 2 million messages every day.

In one year, Facebook had to intervene on more than 3.9 billion messages. Spam is by far the largest category (3.751 billion).

The most recent figures, covering July, August and September, show that Facebook acted on nearly 1.3 billion messages. The lion's share were spam messages that were removed (1.271 billion). In addition, action was taken on 30.8 million sexually explicit messages, 2.1 million cases of bullying and 15.4 million violent messages. 1.5 billion fake accounts were also taken offline.

Image © RTL Z

It remains human work

Facebook is also using more and more artificial intelligence. It can recognize images, videos and messages before a Facebook user even sees them. For example, it spots 99 percent of all terrorist messages before anyone reports them.

Moreover, the artificial intelligence automatically deletes a large number of messages itself. It has learned, for example, to recognize spam and delete it quickly. Photos that were deleted before are also easily intercepted when they are uploaded again, before a person has to look at them.

"One day we would like artificial intelligence to do all the repetitive work," says Siobhan Cummiskey, Public Policy Officer at Facebook. Nevertheless, the intervention in all these messages remains largely a matter of human work.

"We find that the content has to be seen by real people for more nuanced elements, such as hate propaganda and intimidation."

But that does not always go well. A moderator makes the wrong call on whether to remove a message one time in ten, CEO Mark Zuckerberg said two weeks ago. "We can always do more; we have invested a lot in technology," says Cummiskey.

A moderator at work
Image © RTL Z

Burnout, drink and drugs

By inviting the media for a look behind the scenes, Facebook wants to dispel misunderstandings. That is no unnecessary luxury, because over the past year several stories from former employees of centers like this one have come out.

And the picture they paint is not rosy. People become overwrought from the violent images they see, and alcohol and drug use is commonplace as a way to cope with the stress the work generates.

Two months ago, Facebook was sued by a former employee who developed PTSD from the violent images seen on the job. The complaint: the company does not provide enough support to cope with it.

Execution video

Earlier this year, former employee Sjarrel spoke out; he prefers not to mention his last name. He worked at a similar center in Berlin and saw very graphic images during his work. A number of them stayed with him, among them a violent rap video and an IS execution video.

"It was an image of a man wearing an orange jumpsuit who was handcuffed, trying to get away and jumping on an asphalt road."

The man is run over by a tank. "Afterwards, the camera kept zooming in on the body."

After watching the video, Sjarrel had to step outside to calm down. But when he returned, no manager checked on him. "That was the signal for me: I have to get out of here, because you cannot count on support here."

Psychological Assistance

The public policy officer finds it hard to hear that such stories are circulating. "Because we put a lot of time and energy into taking care of our content reviewers," says Cummiskey.

She points out that employees can get psychological help 24 hours a day; five people are available for this in Barcelona. In addition, they can always take a break when they are struggling, for example to play PlayStation or foosball in a large room specially set up for this, says Cummiskey.

During our tour of the building, we also get to see that room. At the moment we are filming there, two guys are playing a game of FIFA. We are allowed to film, but here too they must not be recognizable.

Two employees relax with a FIFA game.
Image © RTL Z

A different picture

The five content reviewers we speak with do not recognize the picture Sjarrel paints. The question is how freely they can speak: the CCC manager sits in on the conversation with the content moderators, and questions and answers have to be given in English.

The response of one of the employees is clear: "There are no drug problems here." Another adds: "I do not recognize that picture, nor what they are talking about."

During the group discussion with the content reviewers, the boss jumps in twice. Once to say that the number of people who leave is very small.

"People leave for better paying jobs, but it's not because of the nature of the work, they always feel that it makes sense, but free to say if it's different" he says.

One of the men, who has been doing this work for six months, adds that he feels appreciated. "We see a lot of sick things, but for our market it is not that bad compared to the others. We feel valued."

Of the 70 Dutch people working at CCC in Barcelona, only one has left to do something else.
