Facebook removes nearly 9 million images of child nudity

Facebook announced Wednesday that it had removed 8.7 million user images of child nudity during the last quarter, thanks to software that automatically detects this type of photo.

The machine learning tool, implemented last year, identifies images that contain both nudity and a child, making it easier to enforce Facebook's ban on photos that show children in a sexualized context.
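
Facebook has not published how this detector works; purely as an illustration, a system like the one described could combine two independent classifier signals and flag an image only when both a nudity score and a child score cross their thresholds. Every name and value in the sketch below is hypothetical.

```python
# Hypothetical sketch of the "nudity AND a child" gating the article
# describes. Not Facebook's code; model names and thresholds invented.
from dataclasses import dataclass

@dataclass
class Scores:
    nudity: float  # probability the image contains nudity, 0..1
    minor: float   # probability the image contains a child, 0..1

# Assumed values; a real system would tune these against
# precision/recall targets.
NUDITY_THRESHOLD = 0.9
MINOR_THRESHOLD = 0.8

def should_flag(scores: Scores) -> bool:
    """Flag only when both signals fire, matching the described policy."""
    return scores.nudity >= NUDITY_THRESHOLD and scores.minor >= MINOR_THRESHOLD

if __name__ == "__main__":
    print(should_flag(Scores(nudity=0.95, minor=0.85)))  # True: queue for review
    print(should_flag(Scores(nudity=0.95, minor=0.10)))  # False: no action
```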

A similar system detects users who attempt to establish sexual contact with minors, a practice known as grooming.

Antigone Davis, Facebook's global head of safety, told Reuters in an interview that the "machine helps us to prioritize" and "more effectively organize" problematic content for the company's review team, and that the company is examining the use of the same technology in its Instagram app.
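
Again only as a sketch of the prioritization Davis describes, flagged items could be placed in a priority queue so that human reviewers see the highest-confidence cases first. Nothing here reflects Facebook's actual implementation; the class and values are invented.

```python
# Hypothetical sketch of prioritizing flagged content for human review.
import heapq
from typing import List, Tuple

class ReviewQueue:
    """Max-priority queue: highest classifier confidence is reviewed first."""

    def __init__(self) -> None:
        self._heap: List[Tuple[float, str]] = []

    def push(self, confidence: float, item_id: str) -> None:
        # heapq is a min-heap, so negate confidence to pop the largest first.
        heapq.heappush(self._heap, (-confidence, item_id))

    def pop(self) -> str:
        _, item_id = heapq.heappop(self._heap)
        return item_id

queue = ReviewQueue()
queue.push(0.97, "photo-123")
queue.push(0.62, "photo-456")
print(queue.pop())  # "photo-123": the most confident case goes to reviewers first
```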

Under pressure from regulators and lawmakers, Facebook has promised to speed up the removal of extremist and illegal material. Machine learning programs that sift through the content posted daily by its billions of users are essential to that plan.

Machine learning is imperfect, and news agencies and advertisers are among those who have complained this year about automated Facebook systems mistakenly blocking their posts.

Davis said that the child safety systems make mistakes, but users can appeal.

"We prefer to err on the side of caution with children," she said.

For years, Facebook's rules have banned even family photos of lightly clothed children shared with "good intentions", out of concern over how others might misuse such images.
