YouTube will ask commenters to reconsider posting comments that seem offensive




YouTube is trying to combat offensive comments that appear below videos by following in the footsteps of other social media companies: before someone posts something potentially offensive, it will ask, "Is this something you really want to share?"

The company is launching a new product feature that will warn people when a comment they are posting "may be offensive to others," to give them "a chance to think before posting," according to a new blog post. The tool will not actually prevent people from posting the comment. The prompts won't appear before every comment, only before those that YouTube's system deems offensive, based on content that has been flagged multiple times. Once the prompt appears, users can post the comment as they originally wrote it or take more time to edit it.

For creators, the company is also rolling out better content filtering systems in YouTube Studio (the backend where creators run their channels). The new filter will catch inappropriate or hurtful comments that have been automatically flagged and held for review, and remove them from the queue so creators don't have to read them. The new feature will roll out first on Android and in English before appearing elsewhere.

There is no doubt that YouTube has a problem with hurtful comments on the site, and one of the biggest issues is hateful comments. Thanks to automatic filtering, the company now removes more than 46 times as many hate speech comments daily as it did at the start of 2019, according to YouTube. Then there are the videos. YouTube says that of the 1.8 million channels terminated last quarter, more than 54,000 were shut down for hate speech. That was the highest number of hate speech terminations YouTube has seen in a single quarter, and three times higher than in early 2019 when its new hate speech policies went into effect.

YouTube is also trying to tackle other issues affecting creators, including monetization, bias, burnout, and channel growth. To better understand the impact on different communities, the company will begin asking YouTubers to voluntarily provide information about their gender, sexual orientation, race, and ethnicity starting in 2021.

The goal is to use the data to identify how different communities are treated, both in terms of discovery on the platform and in terms of monetization. The LGBTQ creator community has consistently said that YouTube's systems automatically demonetize their content or hide their videos, and its members have publicly fought against that treatment. YouTube teams also want to use the data to find "possible patterns of hatred, harassment and discrimination."

One of the biggest questions is how this data will be used and stored once it is collected. YouTube's blog post says the survey itself will describe how the information will be applied to the company's research and what control creators retain over their data, but the post doesn't spell out those details now. Instead, the company states that the information will not be used for advertising purposes, and that users will be able to opt out and delete their information at any time.

"If we find problems in our systems that affect specific communities, we commit to working to resolve them," the blog post reads.

There is no timeline yet for the rollout of the surveys, but more information on the project will be released in early 2021.
