Facebook whistleblower revealed on ’60 Minutes’ says company prioritized profit over public good




The 37-year-old former Facebook product manager, who worked on civic integrity issues at the company, says the documents show Facebook knows its platforms are being used to spread hatred, violence and disinformation, and that the company has tried to hide that evidence.

“What I saw at Facebook over and over again was that there was a conflict of interest between what was good for the public and what was good for Facebook, and Facebook chose over and over again to optimize for its own interests, such as making more money,” Haugen told “60 Minutes.”

“60 Minutes” correspondent Scott Pelley quoted a Facebook (FB) document as saying, “We have evidence from various sources that hate speech, divisive political speech and disinformation on Facebook and the family of apps affect societies around the world.”
About a month ago, Haugen filed at least eight complaints with the Securities and Exchange Commission alleging that the company hides research into its shortcomings from investors and the public. She also shared the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps, including the negative effects of misinformation and the harm done, especially to young girls, by Instagram.
Haugen, who started at Facebook in 2019 after previously working for other tech giants such as Google (GOOG) and Pinterest (PINS), is due to testify Tuesday before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.

“I’ve seen a bunch of social media, and it was a lot worse at Facebook than anything I’d seen before,” Haugen said. “At some point in 2021, I realized that I’m going to have to do this systemically, that I’m going to have to get out enough [documents] that no one can question that it’s real.”

Facebook has aggressively pushed back against the reports, calling many of the claims “misleading” and saying its apps do more good than harm.

“Every day, our teams must balance protecting the ability of billions of people to speak out openly with the need to keep our platform a safe and positive place,” Facebook spokesperson Lena Pietsch said in a statement to CNN Business immediately after the “60 Minutes” interview. “We continue to make significant improvements to combat the spread of disinformation and harmful content. To suggest that we promote bad content and do nothing is just not true.”

Several hours after the interview aired, Pietsch released a statement of more than 700 words describing what she called the segment’s “missing facts” and saying the interview “used selected materials from the company to tell a deceptive story about our research to improve our products.”

A spokesperson for “60 Minutes” did not immediately respond to a CNN Business request for comment on Facebook’s claims.

On Sunday morning, ahead of the “60 Minutes” interview, Facebook vice president of global affairs Nick Clegg told CNN’s Brian Stelter that “there is no perfection on social media, any more than in any other field.”

“We do a huge amount of research, we share it as much as we can with external researchers, but remember there is… a world of difference between doing a peer-reviewed exercise in cooperation with other academics and preparing papers internally to provoke and inform internal discussion,” Clegg said.

Haugen said she believed Facebook founder and CEO Mark Zuckerberg “never sought to create a hateful platform, but he did make choices where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach.”

Frances Haugen, a former Facebook product manager, was revealed on Sunday as the whistleblower who leaked thousands of pages of internal research and documents that have created a storm for the social media company.

The whistleblower revealed

Haugen said she was recruited by Facebook in 2019 and took the job to help fight disinformation. But after the company decided to disband its civic integrity team shortly after the 2020 presidential election, her feelings about the company began to change.

She suggested that the move – and the steps the company took to deactivate other election protection measures, such as tools to prevent disinformation – allowed the platform to be used to help organize the January 6 riot at the US Capitol.

“Basically they said, ‘Oh well, we made it through the election, there weren’t any riots, we can get rid of civic integrity now,’” she said. “Fast forward a few months, and we had the insurrection. When they got rid of civic integrity, that was the moment I was like, ‘I don’t think they’re willing to invest what needs to be invested to keep Facebook from being dangerous.’”

Facebook says the civic integrity team’s work was distributed to other units upon its disbandment. Facebook Vice President of Integrity Guy Rosen said on Twitter Sunday evening that the group was integrated with other teams so that “the pioneering work for the elections could be applied even further.”

The social media company’s algorithm designed to show users what content they’re most likely to engage with is responsible for many of its problems, Haugen said.


“One of the consequences of how Facebook selects this content today is that it optimizes for content that elicits engagement, a reaction, but its own research shows that this content is hateful, divisive, polarizing – it’s easier to inspire people to anger than to other emotions,” she said. She added that the company recognizes that “if they change the algorithm to be safer, people will spend less time on the site, they will click on fewer ads, and they will earn less money.”

Facebook’s Pietsch said in her Sunday night statement that the platform depends “on being used in a way that brings people together” to attract advertisers, adding that “protecting our community is more important than maximizing our profits.”

In an internal memo obtained by the New York Times earlier Sunday, Clegg disputed allegations that Facebook contributed to the January 6 riot.

“Social media has had a big impact on society in recent years, and Facebook is often a place where a lot of this debate takes place,” Clegg said in the memo. “So it’s only natural for people to wonder if this is part of the problem. But the idea that Facebook is the main cause of the polarization is not supported by the facts.”

Haugen said that even though “no one at Facebook is malicious … the incentives are misaligned.”

“Facebook makes more money when you consume more content. People like to engage in things that elicit an emotional response,” she said. “And the more they are exposed to anger, the more they interact and the more they consume.”


