A visit to Facebook: these people keep your timeline clean




"You get used to violence," says a young woman content moderator we are talking about on Facebook in Barcelona

She and her colleagues face a daily flood of violent content. They have to judge messages, photos and videos against Facebook's house rules.

What comes in are posts that have been reported by Facebook and Instagram users, but also posts that have been flagged by artificial intelligence.

Think of reports of bullying, nude videos, terror videos or other violent images.

The most stupid things

What the moderators will get to see is a surprise every time. "People report the stupidest things, for example that they do not like that their boyfriend is getting likes," says another moderator during a conversation with RTL Z and several other Dutch-speaking journalists, who were given a look behind the scenes at CCC for the first time.

CCC is one of the companies that Facebook hires to assess everything people report. According to the social network, there is at least one such centre in every time zone, so that it can intervene around the clock, anywhere in the world.

"Get used to it"

The photos, messages and videos that get reported can be quite innocent, but moderators also see porn videos and images of people who are suicidal.

"It will not be easier to see the horrible things that are put in place, but if you see it more often, you'll get used to it a bit," says the boy who heads the Dutch team.

When RTL Z visits the CCC building, nothing suggests that work is being done for Facebook here. In one of Barcelona's tallest buildings the company rents eight floors, where around a hundred people keep Facebook, Instagram and Facebook Messenger clean.

The facade of the building shows neither Facebook's name nor its thumbs-up logo, and nothing in the lobby gives it away either. The entrance is drab: a few colourless sofas, a bookcase that barely reaches hip height and some sad-looking plants to cheer the place up.

"We do not talk about these places"

When we enter, it is made very clear to us that we are not allowed to film, except at locations predetermined by Facebook.

"We're talking about not having this place," says Facebook's Dave Geraghty, director of Facebook's global market operations. This is among other things to ensure the safety of employees, he said.

Geraghty cites the shooting at YouTube's headquarters in April as an example. "That was an attack on YouTube because someone did not agree with the decision to take a video offline. We take our safety very seriously; that is why we do not announce these locations," he says.

Figures

Only since this year has Facebook released figures on how often the company has to intervene on posts on the social network. In total, its roughly 15,000 content moderators review more than 2 million posts every day.

In one year, Facebook had to intervene on more than 3.9 billion posts. Spam is by far the largest category (3.751 billion).

The most recent figures, covering July, August and September, show that Facebook intervened on nearly 1.3 billion posts. The lion's share of those were spam messages, which were removed (1.271 billion). In addition, 30.8 million sexually explicit posts, 2.1 million cases of bullying and 15.4 million violent posts were dealt with. Another 1.5 billion fake accounts were taken offline.

Human work

Facebook also uses more and more artificial intelligence, which can recognise images, videos and posts before a Facebook user even gets to see them. It spots 99 percent of all terrorist content before anyone reports it, for example.

In addition, the artificial intelligence automatically deletes a large number of posts by itself. It has learned to recognise spam, for example, and removes it quickly. Photos that were removed earlier are simply intercepted before they are uploaded and before anyone gets to see them.
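That interception of previously removed photos generally comes down to matching new uploads against a database of fingerprints (hashes) of known banned images. The article does not describe Facebook's actual pipeline, so the sketch below is only a minimal illustration of the idea in Python, using an exact SHA-256 match and a hypothetical blocklist; real systems use perceptual hashes so that slightly edited copies are still caught.

```python
import hashlib

# Hypothetical blocklist: fingerprints of images that moderators removed earlier.
# In practice this would be a large, shared database, not an in-memory set.
REMOVED_IMAGE_HASHES: set[str] = {
    # Example entry only: this happens to be the SHA-256 hash of an empty file.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_previously_removed(image_bytes: bytes) -> bool:
    """Return True if this exact file was removed before and should be blocked at upload."""
    return hashlib.sha256(image_bytes).hexdigest() in REMOVED_IMAGE_HASHES

def handle_upload(image_bytes: bytes) -> str:
    """Block known-bad re-uploads before anyone can see them; publish everything else."""
    if is_previously_removed(image_bytes):
        return "blocked"
    return "published"

print(handle_upload(b""))          # "blocked": the empty file's hash is on the example blocklist
print(handle_upload(b"cat photo")) # "published": not on the blocklist
```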

"One day we would like artificial intelligence to do all the repetitive work," says Siobhan Cummiskey, Public Policy Officer at Facebook. Nevertheless, intervention in all these messages remains an important part of people's work.

"We find that the content has to be seen by real people for more nuanced elements, such as hate propaganda and intimidation."

That matters, because moderators get the decision on whether to delete a post wrong about one time in ten, CEO Mark Zuckerberg said two weeks ago. "We can always do more; we have invested a lot in technology," says Cummiskey.

Stress, drink and drugs

By inviting the media in for a look behind the scenes, Facebook hopes to clear up misunderstandings. Not an unnecessary effort, because over the past year several stories have come out from former employees of centres like this one.

And the picture they paint is not rosy. People are left overwhelmed by the violent images they see, and alcohol and drug use is said to be commonplace as a way of coping with the stress the work causes.

Two months ago, Facebook was sued by a former employee who developed PTSD from the violent images seen on the job. The complaint: the company does not offer enough support to deal with it.

IS video

Earlier this year, former employee Sjarrel, who prefers not to give his last name, spoke out. He worked at a similar centre in Berlin and saw extremely graphic images on the job. A number of those images have stayed with him, among them a violent rap video and a video from IS.

"It was an image of a man wearing an orange jumpsuit who was handcuffed, trying to escape and jump on an asphalt road."

The man is then crushed by a tank. "Afterwards, the camera kept zooming in on the body."

After seeing the video, Sjarrel had to step outside to calm down. But when he came back, no manager checked in on him. "That was the sign for me: I have to get out of here, because you cannot count on support here."

Psychological Assistance

The public policy officer finds it hard to hear that these kinds of stories are going around. "Because we put a lot of time and energy into caring for our content reviewers," Cummiskey says.

She points out that employees can get psychological help 24 hours a day; in Barcelona, five people are available for this. They can also take a break whenever they are struggling, to play PlayStation or table football in a large room set up specially for that purpose, according to Cummiskey.

During our tour of the building we get to see that room too. As we film there, two guys are playing a game of FIFA. We are allowed to shoot, but here too they must not be recognisable in the footage.

A different picture

The five content reviewers we speak to do not recognise themselves in the picture Sjarrel sketches. The question is how freely they can speak: a CCC manager sits in on the conversation with the content moderators, and questions and answers have to be given in English.

The response of one of the employees is clear: "There are no drug problems here." Another adds: "I do not recognise that picture at all, nor do I know what they are talking about."

During the group conversation with the content reviewers, the manager jumps in twice, for instance when it is said that the number of people leaving is very small.

"People leave for better paying jobs, but it is not because of the nature of the work, they always feel that it has meaning, but free to say if things are different, "he said.

One of the men, who has been doing this work for a full year now, adds that he feels appreciated. "We see a lot of sick things, but for our market it is not so bad compared to the others. We feel appreciated."

Of the 70 Dutch staff working at CCC in Barcelona, only one has left to do something else.
