YouTube takes a firmer stance on "hateful and supremacist content"




After another controversy related to the potential exploitation of children on the platform, YouTube has announced that it will now adopt a firmer and more definitive stance against hateful and supremacist content, banning all videos that promote discrimination against, or exclusion of, a specific group.

As explained by YouTube:

"Today, we are taking another step in our hate speech policy specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities such as age, bad, race, caste, religion, Sexual orientation or the status of the veteran. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. "

This is a major decision that will affect thousands of YouTube channels. Inevitably, some channels will find themselves caught in the crossfire, triggering an avalanche of challenges and appeals that could occupy the YouTube team for some time.

Case in point: journalist Ford Fischer, whose channel has already been demonetized, despite his videos documenting activism from a journalistic perspective.

Within minutes of @YouTube's announcement of a new purge, it appears they caught my outlet, which documents activism and extremism, in the crossfire.

I have just been informed that my entire channel has been demonetized. I am a journalist whose work is used in dozens of documentaries. pic.twitter.com/HscG2S4dWh

– Ford Fischer (@FordFischer) June 5, 2019

No doubt there will be many more cases like this – which, as noted above, could consume a lot of YouTube's time.

Indeed, YouTube acknowledges that some valuable content will inevitably be caught up in the enforcement of this tough new stance:

"We recognize that some of this content is valuable to researchers and NGOs seeking to understand hatred in order to combat it, and we are exploring options to make it available in the future."

Also on YouTube's banned list is content that denies well-documented events:

"Finally, we will remove content that denies that well-documented violent events, such as the Holocaust or shooting at Sandy Hook Elementary School, have occurred. "

The announcement responds to various concerns raised about YouTube in recent times.

In January, YouTube announced changes to its referral algorithm after reports revealed that its system was promoting conspiracy theories and fake news content, amplifying such misinformation.

(Image: YouTube's recommendation list surfacing conspiracy theory videos)

As noted, YouTube has also faced investigations into videos of children allegedly being shared among online pedophile groups via the platform.

These are major areas of concern, and a key part of the problem in each case is YouTube's algorithm, which shows users more of what they engage with. But because machine-learning systems have no capacity for judgment, what that content actually is does not matter to them.

If you watch videos of children, YouTube will show you more of them – a logical but potentially disturbing and dangerous process. YouTube could look to reduce the influence of its algorithm, or even remove algorithmic recommendations entirely, leaving users to choose what they want to watch, but that would also reduce the time users spend on the platform. So, if that is not an option, what is the next best approach?

Enforcing stricter content rules will, in theory, also address the core of the problem, while allowing YouTube to keep benefiting from its algorithmic process. And it is worth noting that YouTube's recommendations system currently drives up to 70% of the videos watched on the platform.

The main driver of the amplification of this material is YouTube's algorithm, but given the algorithm's engagement value, YouTube has been forced to take a different angle. And the impacts of the change will be significant.

That is not to say such bans should not happen – anything that can be done to reduce the amount of hate speech and online division is a good thing. But the question will be where YouTube draws the line, and what that means for online freedom.

Of course, banned broadcasters may simply splinter off and take their messages to other platforms. Perhaps Dailymotion or Vimeo will be more accommodating of their views, or perhaps they will move to a private Facebook group, to Instagram's IGTV, or to another venue that lets them continue to monetize their content.

Perhaps the most prominent creators will build their own video hosting solutions on their own websites, or band together with the likes of Alex Jones to create their own network. YouTube probably is not excessively worried about that scenario, but the problem is that these users already have large subscriber bases and can still share their views on other platforms and by other means.

Of course, YouTube can only do so much – it can only eliminate this type of content from the platforms it controls. And while the move will certainly ease concerns about YouTube's more controversial elements, it also opens the door to increased enforcement action, with users now having a bigger window to complain about discrimination and abuse, which may make YouTube's policies harder to enforce consistently.

In essence, YouTube's new position is a good thing. Such views and perspectives should not be amplified or monetized – which, in many cases, is actually exploitative. But the margins around what YouTube considers acceptable have narrowed dramatically, which also makes the gray areas much murkier.

Will it be good for an open internet as a whole? Will it reduce or exacerbate the problem in other ways?

Only time will tell.
