Is Facebook winning the fake news war?

For the people hired by Facebook to fight fake news and misinformation, one doubt hangs over their work every day: is any of it working?

"Are we changing minds?" wondered a Latin American-based investigator who was speaking to the BBC.

"Does it have an impact? Is our work being read? I do not think it's hard to keep track of that. But it's not a priority for Facebook.

"We want to better understand what we are doing, but we are not able to."

More than two years after the scheme's creation, and on International Fact-Checking Day, multiple sources at agencies working on Facebook's global fact-checking initiative told the BBC they felt underused, poorly informed and often ineffective.

One editor explained how their team would stop working as they neared the payment cap – the maximum number of fact-checks in a month that Facebook is willing to pay for.

Others said they felt Facebook was not listening to their feedback on how to improve the tool they are given to sift through content flagged as potentially fake news.

"I think we see the partnership as important," said a publisher.

"But there are so many things that can be done without the contribution of both sides."

As the United States prepares for another gruelling presidential campaign, experts believe Facebook remains ill-equipped to counter fake news.

Despite this, Facebook says it is pleased with its progress so far, citing an external study suggesting that the amount of fake news shared on its platform is decreasing.

In the wake of Trump

Facebook requires its fact-checkers to sign non-disclosure agreements that prevent them from speaking publicly about certain aspects of their work.

To avoid identifying its sources, the BBC has kept them anonymous and has avoided citing certain figures that may be specific to individual contracts.

Facebook launched its fact-checking programme in December 2016, just over a month after Donald Trump was elected US president.

It was a victory that some believed had been helped along by misinformation spread on social media, chiefly Facebook.

At the time, founder and chief executive Mark Zuckerberg dismissed the notion as "crazy" – though he later told a congressional committee that he regretted using the term.

Facebook now has 43 fact-checking organisations working with it around the world, covering 24 different languages.

The groups use a tool built by Facebook to sift through content that has been flagged as potentially false or misleading.

Content is flagged either by Facebook's algorithm or by users who report articles they believe may be inaccurate.

Fact-checkers then review the claims in question and produce their own "explanatory article".

If the content is judged misleading or outright false, users who shared it are supposed to receive a notification, and the article is then displayed far less prominently.

Anyone who tries to post the content after it has been fact-checked is shown a pop-up message informing them of the fact-checkers' concerns.

For each explanatory article, Facebook pays a flat fee – in the United States, about $800 (£600), according to contracts seen by the BBC.

Fact-checkers in developing countries appear to receive about a quarter of that amount.

Payment cap

What has not been reported until now, however, is that in early 2019 Facebook introduced a payment cap: a monthly limit on explanatory articles, beyond which fact-checking agencies are not paid for their work.

The limit is generally set at 40 articles per agency per month, even if the group works across several countries.

That is only a fraction of the work waiting to be done: a screenshot of the Facebook tool, taken last week by a fact-checker in one Latin American country, showed 491 items pending verification.

Facebook confirmed what it called an "incentive structure" for payment, one that is raised during busy periods such as elections.

The company said the limit was set in line with the capacity of the fact-checking organisations, and that it was rarely exceeded.

However, some groups told the BBC they "would never have a problem" reaching the limit.

One editor said their staff would simply stop submitting work to the Facebook system as the cap approached, so as not to fact-check for free.

"We are still working on things, but we will stick to next month," they said.

Dissatisfaction

Image caption: Snopes no longer works with Facebook

Earlier this year, the US fact-checking site Snopes announced it was ending its work with Facebook.

"We want to determine with certainty that our efforts to help a particular platform are a net benefit to our community, our publication and our online staff," Snopes said in a statement.

The Associated Press, another major partner, told the BBC it was still negotiating a new contract with Facebook. However, the AP does not appear to have carried out any fact-checks directly on Facebook since the end of 2018.

The Snopes statement echoed the concerns of those still in the programme.

"We do not know how many people have been affected," said a publisher.

"I think we are missing very important information about people who regularly post false information on Facebook."

& # 39; Room to improve & # 39;

A Facebook spokesperson told the BBC the company was working to improve the quality of its fact-checking tools and to open up more of its data.

"We know we can always improve," the company said.

"We will therefore continue to discuss with our partners how we can be more efficient and transparent in our efforts."

The company said it had recently begun sending quarterly reports to the agencies.

These contain snapshots of their impact, such as the proportion of users who decided not to post content after being warned it was unreliable. A document seen by the BBC suggests that, in at least one country, the figure is more than half.

Image caption: The spread of fake news on the Facebook-owned messaging platform WhatsApp also raises concerns.

But the problem is changing rapidly.

Beyond Facebook's main network, its WhatsApp messaging app has been linked to a number of violent attacks, apparently fuelled by fake news shared in private groups.

Although fact-checking organisations are trying to debunk dangerous rumours on WhatsApp, Facebook has yet to provide them with a tool for the platform, though it is experimenting with ideas to help users report problematic content.

Openness

These challenges come as no surprise to those who have studied the effects of misinformation closely.

Claire Wardle, who leads First Draft, an organisation that supports efforts to combat online misinformation, said the only way for Facebook to truly get to grips with its problems was to give third parties better access to its technology.

"From the beginning, my frustration with the Facebook program is that it's not an open system," she told the BBC.

"With a closed system owned by Facebook, Facebook paying factual verifiers to do this work only for Facebook, I do not think this is the type of solution we want."

Instead, she suggested, Facebook should explore crowdsourcing fact-checks from a much wider pool of expertise – something Mark Zuckerberg appears to be considering. Such an approach would, of course, bring problems of its own, including attempts to game the system.

So for now at least, and despite their serious reservations, most of those fighting misinformation on Facebook remain committed to what risks becoming an increasingly Sisyphean task.
