Facebook apologizes for ad category targeting "white supremacists"




Just days after the mass shooting of Jewish worshippers in Pittsburgh, in what has been described as the worst anti-Semitic hate crime ever recorded in the United States, the online news publication The Intercept discovered that it could place an ad on Facebook appealing to white supremacists.

The publication said it was able to select "white genocide conspiracy theory" as a predefined "detailed targeting" criterion on Facebook to promote two articles to a group of 168,000 people who had expressed an interest in, or liked pages linked to, the white genocide conspiracy theory.

The paid promotion was approved by Facebook's advertising service.

The Intercept, which claims to follow the investigative reporting protocols used by ProPublica, a website created to "expose abuses of power," contacted the company for comment.

A day after Facebook confirmed The Intercept's purchase of the "white genocide" ads, the company removed the category and canceled the ads.

Facebook spokesman Joe Osborne said: "This targeting option has been removed and we have removed these ads. It is contrary to our advertising principles, should never have been in our system, and we apologize for that mistake."

Mr. Osborne added that the "white genocide conspiracy theory" category was "generated by both automated analysis and human analysis", but that newly added interests are ultimately subject to approval.

"We are ultimately responsible for the segments we make available in our systems."

He also confirmed that the ad category had been used by marketers, but said the only ad purchases targeting "white genocide" enthusiasts were legitimate ones, such as media coverage of the topic.

However, The Intercept says that a basic search of Facebook groups still turns up "tens of thousands" of users interested in "white genocide", along with hate-based content hosted on the platform.

Ironically, shortly after apologizing for the "white genocide" ads, Facebook had to apologize for blocking an anti-abortion advertisement supporting Republican Senate candidate Marsha Blackburn, after the social media platform was accused of bias and of censoring free political speech.

This comes as Facebook and other social media platforms have been combating online misinformation and hate speech for two years.

A few days before the mid-term elections in the United States, there are signs that they are making progress, even though they are still very far from winning the war against misinformation.

Some even argue that it remains easy to deliberately flood social networks with misinformation, an unintended consequence of their willingness to serve advertisers by categorizing the interests of their users.

The tech giants have thrown millions of dollars, tens of thousands of people and their best technical efforts at the fight against the false information, propaganda and hatred that have proliferated on their digital platforms.

Facebook, in particular, has undergone a major shift since late 2016, when its CEO, Mark Zuckerberg, infamously dismissed as "crazy" the idea that false news on his service could have influenced the election.

But false news remains a huge problem and could be spreading to new audiences. A team led by Philip Howard, principal investigator of Oxford's computational propaganda project, reviewed the stories shared on Twitter in the last 10 days of September 2018 and found that what it called "junk news" accounted for a full quarter of the links shared during that time – more than the number of professionally produced news stories shared over the same period.

Reducing misinformation, of course, is anything but easy. Adversaries keep finding new ways around the restrictions.

It can also be difficult to distinguish misinformation and propaganda from legitimate information, especially when world leaders such as President Donald Trump routinely disseminate false information on social media.

Some critics argue that the advertising-based business model that has enriched Zuckerberg is also perfectly suited to propagandists.

Services like Facebook and Twitter make their money "by finding like-minded groups and selling information about their behavior", wrote Dipayan Ghosh, a former Facebook privacy policy expert, and Ben Scott, senior advisor to New America, in a Time Magazine editorial earlier this year.

"The propagators of misinformation support each other by manipulating the behavior of animated groups of the same ideas."
