'Privacy Company' Apple Plans To Monitor All US iPhones For Evidence Of Child Porn | WHAT REALLY HAPPENED


As the old saying goes: If you aren't doing anything illegal, then you have nothing to fear from surveillance.

Smartphones already act as tracking devices broadcasting their owners' whereabouts, but Apple is about to open the door to far more advanced forms of smartphone-based voluntary surveillance by launching a new program designed to detect and report iPhone users found to possess child pornography, known by the academic acronym CSAM (Child Sexual Abuse Material). The plans were revealed by a handful of academics who were offered a sneak preview and promptly spilled the beans on Twitter and in interviews with the press.

The new system, called "neuralMatch", is expected to be unveiled by Apple later this week and installed on American iPhones via a software update. According to the FT, the automated system can proactively alert a team of human reviewers if it believes CSAM is present on a user's iPhone. If the reviewers verify the material, law enforcement will be contacted.

This is how "neuralMatch" will work, per the FT:

Apple's neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.
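The compare-against-a-database flow the FT describes can be sketched in a few lines. Note the heavy caveats: Apple's system reportedly uses a perceptual hash ("NeuralHash"), which is designed to match visually similar images even after resizing or re-encoding; the sketch below substitutes an exact cryptographic hash (SHA-256) purely to illustrate the mechanics. The `KNOWN_HASHES` database, the sample byte strings, and the function names are all hypothetical, not part of Apple's actual implementation.

```python
import hashlib

# Hypothetical database of hashes of known abuse images. In the real
# system this would be a list supplied by child-safety organizations;
# here it is a placeholder entry for illustration only.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def hash_photo(photo_bytes: bytes) -> str:
    """Convert a photo into a fixed-length string of numbers ("hashing").

    Stand-in: SHA-256 matches only byte-identical files, whereas a
    perceptual hash like NeuralHash matches visually similar images.
    """
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_photo(photo_bytes: bytes) -> bool:
    """Return True if the photo's hash appears in the known database."""
    return hash_photo(photo_bytes) in KNOWN_HASHES

flagged = scan_photo(b"known-flagged-image-bytes")  # hash matches the database
clean = scan_photo(b"ordinary-holiday-photo-bytes")  # no match
```

The key privacy-relevant point the sketch makes concrete: only hashes are compared, so a match can occur without any human seeing the photo until the review stage, but every photo must be hashed and checked on the user's own device for the scheme to work.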