A group called Politics WatchDog posted the manipulated video of Pelosi, which was slowed down to give the false impression that her speech was impaired during a public event, at 1:29 p.m. Eastern Time on May 22.
But Facebook's response did not come until after 9 p.m. the following day. On May 23, about 32 hours after the video was posted, Facebook began downgrading it, after one of its fact-checking partners, LeadStories, published a fact check. Politifact, another Facebook partner, did not publish its own fact check until the next morning, May 24.
One reason for the delay: the fact-checkers had to compile their own reports, which meant finding experts in digital forensics who could verify that the video had been manipulated.
The social media giant has taken a passive approach to verifying the accuracy of content on its site. Instead, it has partnered with independent organizations that have become the company's first and main line of defense against misinformation.
However, fact-checking a post or video takes precious time, during which rumors and misinformation can continue to spread at the speed of the internet.
Facebook works only with fact-checkers certified by the International Fact-Checking Network (IFCN), a global coalition of vetted fact-checkers run by the Poynter Institute, a leading Florida-based journalism organization. Other members of the network include The Washington Post's Fact Checker, the Associated Press, and Factcheck.org, IFCN director Baybars Orsek said in an interview.
When one of Facebook's fact-checking partners rates a post or video on the platform as false, Facebook's algorithm automatically changes how it handles that content, Facebook, Politifact, and LeadStories confirmed to CNN on Friday. Downgrading content means it appears less often in users' news feeds. Facebook also notifies users who share, or have already shared, the content that it is false, the company said in a statement.
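The flow the article describes — a partner rating triggers an automatic feed downgrade plus warnings to everyone who shared the post — can be sketched in a few lines. This is a minimal illustration only: the function names, the post record, and the scoring scheme are all assumptions for the sake of the example, not Facebook's actual systems or API.

```python
# Hypothetical sketch of the flagging flow described above. All names and
# the penalty factor are illustrative assumptions, not Facebook's real code.

DOWNGRADE_FACTOR = 0.2  # assumed penalty: flagged posts surface far less often


def apply_fact_check(post, rating, fact_check_url, sharers):
    """Attach a partner fact-check to a post (a dict) and, if the rating is
    'false', downgrade the post's feed score and build warning notifications
    for every user who shared or previously shared it."""
    post["rating"] = rating
    post.setdefault("fact_checks", []).append(fact_check_url)

    notifications = []
    if rating == "false":
        # Downgrade: the post now appears less often in news feeds.
        post["feed_score"] *= DOWNGRADE_FACTOR
        # Warn everyone who shared (or has already shared) the post.
        for user in sharers:
            notifications.append({
                "user": user,
                "message": "A post you shared was rated false.",
                "fact_check": fact_check_url,
            })
    return notifications


post = {"id": "pelosi-video", "feed_score": 100.0}
alerts = apply_fact_check(
    post, "false", "https://example.org/fact-check", ["alice", "bob"]
)
```

Running the sketch downgrades the example post's feed score from 100.0 to 20.0 and produces one warning per sharer, mirroring the two effects the reporting describes.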
"Once we publish [a fact-check], I immediately log into the Facebook tool and associate the fact-check with the offending post," Katie Sanders, Politifact's managing editor, told CNN. "That is how the process is supposed to work."
Still, the altered video of Pelosi remains available on Facebook, the company said, because it does not violate the platform's community standards. There is no Facebook rule requiring that posted content be true or correct.
The video now appears with a notice telling users that it has been flagged, directing them to several fact checks.
On Friday, Facebook defended its handling of the video on CNN's Anderson Cooper 360.
"I think the suggestion is that we took no action, and that's not correct," said Monika Bickert, Facebook's vice president of product policy and counterterrorism. "We have acted ... Anyone watching this video in News Feed, anyone who is going to share it with someone else, anyone who has shared it in the past: they are warned that this video is false."
Bickert added that the company's partnership with fact-checkers strikes a critical balance for Facebook users, who retain the ability to make "informed choices about what to believe."
But some critics argued that Facebook needs to be more proactive in fighting misinformation, and that delegating the work to fact-checkers hardly counts as taking action.
"Bickert is doubling down on AC360 about Facebook taking action," said Jason Kint, CEO of Digital Content Next, a trade association representing digital publishers. "[But] taking action does not mean 'half a day later, via automated technology receiving third-party information.'"
As the fake video spread across social media, the technology platforms also had to contend with imitators who repackaged it and created new, unique uploads of the same video. LeadStories counted as many as 17 separate copies across Facebook, YouTube and Twitter.
Google moved quickly to remove the video from YouTube, while Twitter continued to say it had nothing to share on the subject. Facebook said that when a post is downgraded after being flagged by a fact-checker, the company applies the same treatment to copies of it.
"Speed is essential for this system," the company said Friday, "and we continue to improve our response."
But Facebook's approach is still constrained by its decision to have outside groups manually check content of dubious veracity, even as the volume of fake content is expected to grow.
Policymakers say the rudimentary edits to the original Pelosi clip show how harmful advances in content-manipulation technology could become to democratic discourse.
"It's been clear for some time that the major platforms have not put in place clear policies or procedures to deal with this kind of misinformation," Sen. Mark Warner of Virginia, a frequent critic of Silicon Valley, warned on Friday. "Viral disinformation is pushed today by simple video-editing techniques and Photoshop, but new technologies will make the situation worse."
"We have a serious problem on our hands, with technologists developing and marketing tools that will have profoundly destabilizing effects."