YouTube reportedly discouraged employees from flagging toxic content




For years, YouTube ignored calls from its own employees to address and remove toxic videos, prioritizing engagement instead, Bloomberg reports. According to more than 20 former and current YouTube staffers, employees made proposals to limit the spread of videos containing disturbing or extremist content and conspiracy theories, but executives were reportedly more interested in boosting viewership.

One proposal offered a way to keep content that came "close to the line" of violating platform policy on the site while removing it from the recommended tab. YouTube rejected the suggestion in 2016, a former engineer said, and continued recommending videos even when they were controversial. According to employees, the internal goal was to reach 1 billion hours of viewing per day.

"I can say with great confidence that they were deeply mistaken," said the engineer Bloomberg. (In January 2019, YouTube adopted a policy similar to the one it initially suggested.)

Employees outside the moderation team were also reportedly discouraged from searching for toxic videos on YouTube. Lawyers advised that the company's liability would be greater if it could be proven that staff members knew of and acknowledged the existence of those videos.

At least five executives left YouTube over its reluctance to tackle the problem. As another former employee described it, YouTube CEO Susan Wojcicki would "never put her fingers on the scale," seeing her job as simply to "run the company" rather than confront the onslaught of misinformation and dangerous content. A YouTube spokesperson said the company began taking action in late 2016 and started demonetizing channels that promoted harmful content in 2017. Until late that year, however, fewer than 20 people were employed on its "trust and safety" team. You can read the full Bloomberg report here for more anecdotes about the difficulties staff members faced in keeping controversial videos from going viral.

In 2018, YouTube attempted to keep fake news and conspiracy theories from spreading on its platform with information boxes. This year, it started removing ads from potentially harmful content. Still, even if YouTube can curb the spread of controversial videos, it will eventually have to address the fundamental problem of content moderation, as toxic content remains prevalent on the site.
