
Apple Tries to Calm Blowback Against Intruding on iPhone Users’ Privacy

Recently, Apple announced a new addition to its upcoming iOS 15 and iPadOS 15 operating systems for iPhones and iPads. The new feature will allow Apple to scan user photos uploaded to its iCloud service and determine whether they contain sexually explicit images involving children. Following blowback against the Masters of the Universe scanning their customers' devices, the company is now promising it will not abuse the feature or allow governments to dictate what types of data iPhones are scanned for.

Apple claims that the way it detects CSAM (Child Sexual Abuse Material) is “designed with user privacy in mind”: rather than directly accessing iCloud users’ photos, the system performs a device-local, hash-based lookup, cross-referencing hashes of user photos against hashes of known CSAM.
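As a rough illustration of how such hash-and-match systems work: a perceptual hash reduces an image to a short fingerprint that changes little under small edits, and matching compares fingerprints rather than the images themselves. Apple's actual NeuralHash is a neural-network-based perceptual hash; the "average hash" below is only a minimal stand-in sketch, assuming Pillow is installed and the file names are hypothetical.

```python
# Minimal perceptual hash-and-match sketch (average hash), assuming
# Pillow is installed. This is NOT Apple's NeuralHash; it only
# illustrates the general hash-then-compare idea.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    brighter than the mean. Similar images yield similar bit patterns."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical usage: flag a photo whose fingerprint is within a few
# bits of any entry in a known-hash list.
# known = {average_hash("known_image.jpg")}
# flagged = any(hamming_distance(average_hash("user_photo.jpg"), h) <= 5
#               for h in known)
```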

Webmaster's Commentary: 

So, all the perverts need to do is slightly resize their images, and the hashing won't work.
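That is true of a cryptographic hash, where any resize or re-encode produces a completely different digest, so an exact-match lookup misses the altered copy; Apple, for its part, claims NeuralHash is a perceptual hash designed to survive exactly such edits. A minimal sketch of the cryptographic case, assuming Pillow is installed and "photo.jpg" is a hypothetical local file:

```python
# A resize of even one pixel changes a cryptographic digest entirely.
# Assumes Pillow is installed; "photo.jpg" is a hypothetical file.
import hashlib
from io import BytesIO
from PIL import Image

def sha256_of_image(img: Image.Image) -> str:
    buf = BytesIO()
    img.save(buf, format="PNG")  # re-encode deterministically
    return hashlib.sha256(buf.getvalue()).hexdigest()

original = Image.open("photo.jpg")
resized = original.resize((original.width - 1, original.height - 1))

print(sha256_of_image(original))  # two completely different digests
print(sha256_of_image(resized))   # for two nearly identical images
```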
