Facebook uses AI to detect child nudity and exploitation images




Facebook uses AI to combat the exploitation of children. (Image: SOPA Images)

Child exploitation continues to be a problem on social media.

Facebook uses artificial intelligence and machine learning to proactively detect child nudity and exploitative content when images are uploaded, the company said Wednesday in a post detailing its efforts. This complements the photo-matching technology Facebook has used for years to prevent the sharing of known child exploitation images.

In the last quarter, Facebook removed 8.7 million pieces of content from its platform that violated its rules against child nudity or the sexual exploitation of children, including non-sexual content such as a photo of a child in the bath. The company also said it removes accounts that promote this type of content.

Facebook requires users to be at least 13 years old and limits whom teens can interact with after they sign up. By using AI, the company can identify exploitative content more quickly, report it to the National Center for Missing and Exploited Children, and find accounts that engage in potentially inappropriate interactions with children on Facebook.

Content moderation is not always easy, however. In August, Facebook drew criticism after removing a photo of naked, emaciated children in a Nazi concentration camp. In 2016, the social network stirred controversy when it removed a Pulitzer Prize-winning photo of a naked Vietnamese girl fleeing a napalm attack.

Last month, Facebook announced plans to work with Microsoft and other industry partners to build tools that smaller companies can use to prevent child exploitation.

