On Wednesday, YouTube banned videos promoting white supremacy, Nazism and other ideologies that reinforce bigotry, as well as videos denying that violent events such as the Holocaust or the shooting at Sandy Hook Elementary School took place.
The ban expands the platform's existing hate speech rules "by specifically banning videos alleging the superiority of a group in order to justify discrimination, segregation or exclusion based on qualities such as age, sex, race, caste, religion, sexual orientation or veteran status," the Google subsidiary said in a blog post.
The ban will result in the removal of thousands of videos already on the platform, a company spokesman said.
"It is our responsibility to prevent our platform from being used to incite hatred, harassment, discrimination and violence," the blog says. "We are determined to take the necessary steps to assume this responsibility today, tomorrow and in the years to come."
YouTube is also stepping up previously announced changes to its video recommendations, which aim to steer users toward fewer "borderline" videos that spread offensive speech or hoaxes without technically breaking the rules. The company hopes to extend those changes, currently limited to the U.S., to other countries by the end of the year, it added.
Channels that repeatedly post borderline videos will no longer be able to run ads or otherwise monetize their content, YouTube added.
The move comes as YouTube and other tech giants such as Facebook and Twitter face criticism in Washington and across the country for stumbling in their efforts to stem the spread of hate. The companies spent years, for example, deliberating over whether right-wing provocateur and conspiracy theorist Alex Jones was breaking their rules by, among other things, directing his followers to harass the parents of children killed in the Sandy Hook shooting. Facebook finally banned Jones last month as part of a wider crackdown on hate speech, following a series of suspensions and removals of pages affiliated with him.
Yet those efforts have only fed increasingly fraught political fights over the role of internet platforms as modern public squares. Democrats are often outraged by what they see as the companies' inaction on hate speech, while Republicans fault the companies for content moderation decisions they see as chilling free speech.
"I'm not a fan of Jones," Senator Ted Cruz (R-Texas) tweeted following an earlier suspension of Jones last year, "but who has made Facebook the arbiter of political speech?"
Just last week, YouTube was at the center of two high-profile controversies over what critics call its failures to police potentially harmful content.
Vox reporter Carlos Maza went viral last week with a Twitter thread documenting how Steven Crowder, a conservative comedian, publishes YouTube videos mocking Maza with homophobic and anti-Hispanic slurs. YouTube said Tuesday it would not take action against Crowder because his videos, while containing "clearly offensive language," do not openly break its rules.
At the same time, The New York Times reported Monday that despite the company's earlier efforts to keep pedophiles off YouTube, some videos of partially clothed children are drawing hundreds of thousands of views, amplified by YouTube's video recommendation engine. The company said in a blog post that it had begun limiting the visibility of such videos but would not stop recommending them altogether.
A Tuesday tweet from Senator Brian Schatz (D-Hawaii) linking to the Times story, which he called "disgusting," illustrates the political pressure on YouTube and other tech giants to better police what happens on their platforms.
"The algorithms are amoral and the platforms need human supervision," he said. "If they are not worried about commercial or moral consequences, they should at least worry about the anger of the decision makers."
Alexandra S. Levine contributed to this report.