Child abuse: Apple urged to rapidly deploy image scanning tool




Child protection experts around the world have called on Apple to urgently implement new scanning technologies to detect images of child abuse.

In August, Apple announced plans to use a tool called neuralMatch to scan photos uploaded to iCloud online storage and compare them to a database of known child abuse images.

However, the tech company has since said it is suspending the rollout after heavy lobbying from privacy activists, who raised concerns that governments could abuse neuralMatch to increase surveillance of private citizens.

Ross Anderson, professor of security engineering at the University of Cambridge and the University of Edinburgh, wrote: “Protecting children online is an urgent issue, but this proposal will do little to help prevent these appalling crimes, while opening the floodgates to a significant expansion of the surveillance state.”

This week, child protection agencies, including the NSPCC, the National Center for Missing and Exploited Children (NCMEC) and the United Nations special rapporteur on the sale and sexual exploitation of children, released a joint statement endorsing neuralMatch and saying that “time is of the essence” in using new technology to help protect children from online exploitation and abuse.

“Concerns that such technology is a ‘slippery slope’ to surveillance remain speculative and do not justify dismissing an opportunity for progress that would allow the thousands of victims and survivors of sexual abuse whose images circulate online to be protected from re-victimization and re-traumatization,” the groups said in the statement. “Instead, we should work together to ensure that appropriate safeguards, checks and balances are in place.”

Recirculated images of abuse are one of the major challenges for law enforcement and child welfare agencies around the world. Police figures show the UK database of known child abuse images contains 17 million unique entries and grows by 500,000 images every two months.

Scanning technologies aim to continuously analyze uploaded data and match it against these images – using a technique called hashing, which gives each known image an identifying fingerprint – so that when the images are shared online they can be detected and offenders stopped. Advocates say this could wipe child abuse imagery off the internet for good, ending a never-ending cycle of revictimization.
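As a rough illustration of the hashing idea (a toy “average hash” sketch, not Apple’s proprietary neuralMatch algorithm; the function names, the matching threshold and the Pillow dependency are this sketch’s own assumptions), an image can be reduced to a short binary fingerprint and compared against a database of fingerprints of known images:

```python
# Toy perceptual-hash sketch, for illustration only. Real systems such as
# PhotoDNA or Apple's neuralMatch use far more robust features, but the
# shape of the matching step is similar: hash the upload, then look for
# near-identical hashes in a database built from known abuse images.
from PIL import Image  # assumes the Pillow library is installed

def average_hash(path: str, hash_size: int = 8) -> int:
    """Downscale to hash_size x hash_size grayscale, then set one bit per
    pixel depending on whether it is brighter than the image mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits; a small distance means near-identical images."""
    return bin(a ^ b).count("1")

def matches_known_image(candidate: int, known_hashes: set[int],
                        threshold: int = 5) -> bool:
    """True if the candidate hash is within `threshold` bits of any entry
    in the (hypothetical) database of known-image hashes."""
    return any(hamming_distance(candidate, h) <= threshold for h in known_hashes)
```

A collection like the UK’s 17m-entry database would be held as hashes rather than pictures, so matching never requires redistributing the images themselves.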

Apple has said it will be looking for known images. If a strong enough match is flagged by the scanning technology, staff will manually review the reported images and, if child abuse is confirmed, the user’s account will be deactivated and NCMEC notified.
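To make that reported workflow concrete (a sketch under stated assumptions only; every name here is hypothetical, and Apple has not published its implementation), a strong match would merely queue the image for human review, with account deactivation and the NCMEC report following only after a reviewer confirms:

```python
# Hypothetical sketch of the flag -> human review -> report flow described
# above; not Apple's actual code or API.
from dataclasses import dataclass

@dataclass
class FlaggedUpload:
    user_id: str
    image_id: str

review_queue: list[FlaggedUpload] = []

def on_scan_result(user_id: str, image_id: str, is_strong_match: bool) -> None:
    """A strong hash match only queues the upload for manual review;
    nothing happens to the account automatically."""
    if is_strong_match:
        review_queue.append(FlaggedUpload(user_id, image_id))

def on_review_complete(item: FlaggedUpload, abuse_confirmed: bool) -> None:
    """Only a human-confirmed match deactivates the account and
    notifies NCMEC, per the process the article describes."""
    if abuse_confirmed:
        deactivate_account(item.user_id)
        notify_ncmec(item)

def deactivate_account(user_id: str) -> None:
    print(f"(sketch) account {user_id} deactivated")

def notify_ncmec(item: FlaggedUpload) -> None:
    print(f"(sketch) NCMEC notified about image {item.image_id}")
```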

Apple software chief Craig Federighi told the Wall Street Journal he believes the technology has been misunderstood. He stressed that the tools could only seek to match known images of child abuse – not general images of children.

Iain Drennan, Executive Director of WeProtect Global Alliance, an organization that combats online child sexual abuse and exploitation, said: “Balancing privacy and protecting children is not easy, and it was therefore extremely encouraging to see Apple recognize and take up this challenge.

“We owe it to victims and survivors to identify and remove records of their sexual abuse as quickly as possible,” he said.
