New report: grooming, sexual blackmail of children and images like …




Images of child sexual abuse produced through grooming or sextortion are common in police investigations, and their number is growing. Even more common are images that children take of themselves voluntarily but that end up in collections of child sexual abuse material; this type of image has increased sharply in police investigations. These are among the findings of the NetClean 2018 report, a report on documented child sexual abuse published today, November 28, 2018. For the report, police officers from 30 countries answered questions about their child sexual abuse investigations.

In the report, 80% of the police officers state that images produced as a result of grooming or sextortion are common or very common in their investigations. More than 50% say that this type of material is increasing. More than 90% report that self-produced photos and videos that children take of themselves and post or send to someone are common or very common, and as many (90%) say that this type of material is increasing.

"Voluntary images can, for example, be nude images that young people send to a boyfriend or girlfriend, or that they post for fun without thinking about how they may be used," explains Anna Borgström, CEO of NetClean.

The report shows that the children in grooming images are generally between 8 and 16 years old, and those in sextortion cases between 11 and 16, but younger children also appear. One-third of the police officers report having seen children aged 5 to 7 in grooming cases, and 16% say they have seen children under five. In sextortion cases, one in ten police officers reports having seen children aged 5 to 7, and 7% have seen children under five.

"We are somewhat surprised that such young children appear in these cases. Very young children are common in other types of child sexual abuse, but since these contacts are Internet-based, the children need to be able to read and write. It is likely that the younger children appear in cases involving older siblings who are being groomed or subjected to sexual blackmail," says Anna Borgström.

The NetClean 2018 report also investigated several other issues related to documented child sexual abuse. A brief summary follows (the last point comes from a survey of companies, not of the police):

  • Organized offender networks: 85% of the police have encountered organized forums and offender groups in their investigations. These groups can number in the thousands or hundreds of thousands of members.
  • Cryptocurrencies: The majority of police officers have not seen cryptocurrencies used in connection with child sexual abuse material. When they have, it is often linked to other types of crime, such as drug offenses or fraud.
  • Offenders who manipulate and hide images: About 40% of the police say it is common for offenders to manipulate images to conceal, for example, identities or objects, or to try to hide image files in various ways. Encryption in particular is common.
  • Deepfakes: One in five police officers has encountered deepfakes in their investigations. A deepfake is created by superimposing one person's face onto moving footage of another person with the help of artificial intelligence.
  • One in 500 employees handles child sexual abuse material on their work computer.

Read the full report here

Click here to download a summary of the NetClean 2018 report

For more information, please contact: Anna Creutz, NetClean Communications Manager,
0703-08 10 77, [email protected]

NetClean is the world leader in technology solutions that stop the spread of child sexual abuse material. Multinational companies, government agencies, and ISPs around the world use our technology.

We are experts in detecting child sexual abuse material and in how companies can protect themselves against this type of crime. Using our solutions prevents the spread of child sexual abuse material. It also helps identify individuals with a potential sexual interest in children, thereby helping to protect children from abuse.

NetClean is part of the Safer Society Group, which develops technology companies to create a safer and smarter society.
