Apple is reportedly about to announce new photo identification features that would use hashing algorithms to match photos in users’ libraries against known child abuse material, such as child pornography.
Apple’s system would run on the client – on the user’s device – in the name of privacy: the iPhone would download a set of fingerprints representing illegal content and then check every photo in the user’s camera roll against that list. Presumably, any matches would then be flagged for human review.
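For illustration, here is a minimal sketch of what client-side fingerprint matching could look like, assuming a simple exact-hash comparison (Apple’s actual system is expected to use perceptual hashing that tolerates resizing and re-encoding; all names and structure below are hypothetical):

```swift
import Foundation
import CryptoKit

// Known fingerprints the device would download (hex-encoded digests).
// Placeholder set; real entries would come from a curated database.
let knownFingerprints: Set<String> = []

/// Computes a SHA-256 digest of a photo's raw bytes.
func fingerprint(for photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns the photos whose fingerprints appear in the downloaded list;
/// these would presumably be flagged for human review.
func flaggedPhotos(in library: [Data]) -> [Data] {
    library.filter { knownFingerprints.contains(fingerprint(for: $0)) }
}
```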
Apple has previously said it uses hashing techniques when photos are uploaded to iCloud; this new system would operate on the client side, on the user’s device. Apple has yet to officially announce the initiative, and the details will matter.
At a high level, this kind of system is similar to the machine learning features already in Apple Photos that identify objects and scenes: the scan takes place on the device, and users get better search functionality in return.
However, cryptography and security expert Matthew Green notes that the implications of such a deployment are complicated. Hashing algorithms are not foolproof and can produce false positives. And if Apple allows governments to control the fingerprint database, they could perhaps use the system to detect images of things other than clearly illegal child abuse content, for example to suppress political activism.
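To see why false positives can happen, consider how fuzzy image matching is typically done: perceptual hashes of two images are compared with a distance threshold, and unrelated photos can occasionally land within it. A hypothetical sketch (the threshold and comparison are assumptions, not Apple’s published parameters):

```swift
// Hamming distance between two 64-bit perceptual hashes: the number of
// differing bits. Smaller distance is assumed to mean more visually similar.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Hypothetical tolerance: tighter thresholds miss edited copies of known
// images, looser ones raise the false-positive rate on innocent photos.
let matchThreshold = 10

func isMatch(_ photoHash: UInt64, against known: [UInt64]) -> Bool {
    known.contains { hammingDistance(photoHash, $0) <= matchThreshold }
}
```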
However, note that photos uploaded to iCloud Photos for backup and sync are not stored end-to-end encrypted anyway. The photos are kept in encrypted form on Apple’s server farms, but the decryption keys also belong to Apple. This means law enforcement can subpoena Apple and see all of a user’s uploaded photos. (This isn’t unusual; all third-party photo services work the same way.)
It is possible that Apple will eventually deploy similar systems to analyze content on the client side, with the content then stored on the server in an end-to-end encrypted manner. Many governments have campaigned for such a system from E2E private messaging apps like iMessage and WhatsApp, fearing that the growing shift to encrypted communications will make it harder for law enforcement to find and prosecute cases of child abuse.
Green speculates that Apple wouldn’t have invested in developing this system if applying it to end-to-end encrypted content wasn’t a long-term goal.