On Monday, YouTube announced a handful of rule changes affecting how minors use the platform. Children under the age of 13 will no longer be able to live-stream without an adult present, in order to "limit the risk of exploitation." The Google-owned company, which already prohibits young children from having their own accounts, will also limit how the platform recommends videos "featuring minors in risky situations," an initiative that, the company says, will rely on new artificial intelligence classifiers to identify this content.
The announcement follows a disturbing report in The New York Times describing how YouTube's recommendation algorithm effectively curated channels of videos of partially clothed children, expanding their reach and making them easily accessible to pedophiles. In one case, a 10-year-old Brazilian girl posted a video of herself playing with a friend in swimsuits; once promoted by YouTube, the video was viewed 400,000 times.
The safety of young users has been a problem on YouTube – and on social media in general – for years, but YouTube's attempts to address the issue may be less familiar than Facebook's and Twitter's efforts to handle objectionable content. YouTube's record illustrates how difficult it is to shield children from inappropriate behavior online, whether through software or human moderation, especially at YouTube's massive scale.
February 2015: YouTube introduces YouTube Kids, meant to be a safe version of the platform for young users. It includes parental controls, educational content, and limits on what can be uploaded and searched. Since then, however, the supposedly child-safe service has repeatedly surfaced conspiracy-themed content, or worse.
November 2017: In what becomes known as the "Elsagate" controversy, videos that appear child-friendly on YouTube Kids turn out to contain disturbing scenes. In one of the most notorious examples, Elsa from the movie Frozen is impregnated by Spider-Man. The controversy leads to stricter rules on uploading and monetizing videos featuring children's characters, and to the removal of channels and videos that violate those rules.
February 2019: Amid a controversy in which pedophiles invade the comment sections of children's videos, several major advertisers, including Coca-Cola, Amazon, and Disney, distance themselves from YouTube.
March 2019: YouTube disables comments on many videos featuring children. Although the company acknowledges that the decision will disappoint some creators whose comment sections hosted productive discussions, "we also know that it's the right thing to do to protect the YouTube community," a spokesperson for the platform tells The Verge.
As other incidents this week show, YouTube is still grappling with hateful, abusive, and otherwise inappropriate content. On Wednesday, YouTube announced that it would remove thousands of hateful and extremist videos while tweaking the platform to raise up "authoritative content," reduce the spread of borderline content, and reward trusted creators. It also demonetized the videos of a prominent right-wing creator who had repeatedly insulted the sexual orientation and Cuban-American heritage of Carlos Maza, a journalist at Vox. Once again, a platform used by 2 billion people is playing a familiar game: whack-a-mole.