When Apple announced changes it planned to make to iOS devices in an effort to help reduce child abuse by detecting child sexual abuse material (CSAM), parts of its plan generated a backlash.
According to the technology site The Verge, the first change is an update to Search and the Siri voice assistant arriving in iOS 15, watchOS 8, iPadOS 15, and macOS Monterey: when a user searches for topics related to child sexual abuse, Apple will direct them to resources for reporting CSAM or getting help with an attraction to that content.
But Apple’s other two CSAM plans have come under fire. One update will add a parental-control option to Messages that sends an alert to parents if a child 12 or younger views or sends sexually explicit photos, and blocks those photos for all users under the age of 18.
More controversial is Apple’s plan to scan photos on the device to detect CSAM before they are uploaded to iCloud and to report matches to Apple reviewers, who can then refer the material to the National Center for Missing and Exploited Children (NCMEC). The feature is intended to protect users’ privacy while still allowing the company to find illegal content, but many Apple critics and privacy advocates claim the provision is essentially a security backdoor, a clear contradiction of Apple’s long-standing commitment to user privacy.
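To make the flow of on-device matching concrete, here is a minimal, hypothetical sketch in Swift. None of these types or functions are Apple APIs, and a simple SHA-256 digest stands in for the perceptual hash Apple described (NeuralHash), which is designed to survive resizing and re-encoding; the sketch only illustrates the idea of comparing each photo against a local database of known hashes before upload.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device database of known-CSAM hashes.
struct KnownHashDatabase {
    let knownHashes: Set<Data>

    // Returns true if the photo's hash appears in the local database.
    // A real system would use a perceptual hash rather than SHA-256.
    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        return knownHashes.contains(Data(digest))
    }
}

// Conceptual pre-upload check: a match would flag the photo for review
// (in Apple's described design, only after a threshold of matches
// on a given account is reached).
func shouldFlagBeforeUpload(photo: Data, database: KnownHashDatabase) -> Bool {
    return database.matches(photo)
}
```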