Facebook makes changes in its ongoing attempt to limit misinformation




CNN – Facebook is doing a lot of little things to try to solve its biggest problems.

On Wednesday, the company announced more than a dozen updates to how it handles misinformation and other problematic content on Facebook, Instagram and Messenger. To promote the various efforts, the company hosted about twenty journalists at its Menlo Park headquarters, where employees from various Facebook products recapped the changes and answered questions.

For years, Facebook has come under fire for the spread of controversial content on its platform, such as election misinformation, anti-vaccination stories, violence and hate speech.

Facebook has been trying to more quickly remove items that violate its rules, and to "reduce" the distribution of content that does not explicitly break the rules but is still problematic, such as clickbait and misinformation.

"We do not remove information from Facebook just because it's wrong, we think we need to find a balance," said Guy Rosen, vice president of Facebook for Integrity. "When it's false information by real people, we aim to reduce the spread and provide context."

For example, Facebook said it would reduce the reach of groups that repeatedly share false information. When users in a group frequently share content deemed false by Facebook's third-party fact-checkers, that group's content will be pushed further down the News Feed, so fewer people will see it.

There will also be a new "click-gap" signal, which will affect where a link ranks in the News Feed. With this signal, Facebook hopes to limit the spread of websites that are disproportionately popular on Facebook compared with the rest of the web.

The company is also working with outside experts to identify new ways to fight false information on the platform, and the Associated Press is expanding its role in Facebook's third-party fact-checking program.

The company has often described its problem with problematic content as "adversarial": in Facebook's framing, it is fighting an enemy that learns and changes tactics. The package of changes announced Wednesday is its newest weapon.

Facebook's policy prohibits content that it believes could lead to "imminent physical violence." On Wednesday, employees defended the company's decision not to ban misinformation or anti-vaccination content outright from its products.

"When it comes to damage, it's really hard (…) to distinguish between content and what happens to offline people," said Tessa Lyons, head of the integrity of news feed on Facebook.

She said that some posts that appear to be anti-vaccination actually involve questions, requests for information and conversations about the subject.

"There is a tension between allowing expression, speech and conversation, and ensuring that people get authentic and accurate information.We do not think that a private company should make decisions regarding information that may or may not be shared online, "she said. .

Renee Murphy, a senior analyst at Forrester who covers security and risk, said that while Facebook's moves are positive, they do not go far enough to solve some of its bigger problems.

"Part of me says" awesome [this content] It will not go as far as usual, she says. The other party says, "I have no confidence in that." At the end of the day, what will all this do? How are they going to handle this? "

Facebook is also trying to be more transparent with users about how it makes decisions and why. As part of that effort, the company is adding a new section to its Community Standards site where users can see the updates Facebook makes to its policies each month.

Another update lets users delete their comments and other content posted to a Facebook group after they leave it.

Meanwhile, Facebook-owned Instagram is trying to stem the spread of inappropriate posts that do not violate its policies. For example, a sexually suggestive photo will still appear in the feeds of people who follow that account, but it may no longer be recommended on the Explore page or on hashtag pages.

Facebook also announced a few updates to its Messenger instant messaging service, including a Facebook-verified badge that will appear in chats to help fight scammers who impersonate public figures.

Another tool, a forwarding indicator, will appear in Messenger when a message has been forwarded by its sender. WhatsApp, another Facebook-owned app, has a similar feature, added as part of an effort to curb the spread of misinformation. WhatsApp has had serious problems with viral hoax messages on its platform, which have been linked to more than a dozen lynchings in India.

Forrester's Murphy believes the company should do more to address major issues such as violence being streamed and going viral on the platform. Last month, an alleged terrorist was able to live-stream video of a mass murder in New Zealand on Facebook. The company said its artificial intelligence systems had failed to catch the video, and that it removed 1.5 million videos of the attack in the first 24 hours.

"They have bigger problems, I'm sure [these updates] It will help sometimes, but there are bigger problems on foot, "she said. Facebook still has a lot to do. "

The-CNN-Wire™ & © 2018 Cable News Network, Inc., a Time Warner Company. All rights reserved.


