Worrying move by Apple - stop using iCloud Photos
Apple is reportedly gearing up to introduce a client-side tool that would scan iPhones to identify child sexual abuse material (CSAM) on users' phones. The system works from a database of hashes of known CSAM images supplied on Apple's end: photos destined for iCloud Photos are hashed on the device and compared against that database. If the number of matches found on a phone crosses a threshold, the result is flagged to Apple's servers.
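For intuition, here is a minimal sketch in Python of that threshold-based hash matching. It is a deliberate simplification: Apple's announced design uses NeuralHash (a perceptual hash) together with private set intersection and threshold secret sharing, so neither the device nor the server learns anything about matches below the threshold. The SHA-256 hash, the local blocklist, and the `Photos` directory here are illustrative stand-ins, not Apple's implementation.

```python
import hashlib
from pathlib import Path

# Stand-in for the database of hashes of known CSAM images that
# Apple would ship on-device (the real list is opaque to the user).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Apple reportedly set the reporting threshold around 30 matches.
MATCH_THRESHOLD = 30

def photo_hash(path: Path) -> str:
    """Hash a photo's raw bytes. A perceptual hash like NeuralHash
    would also match resized or recompressed copies; SHA-256 will not."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_dir: Path) -> int:
    """Count photos whose hash appears in the known-hash database."""
    return sum(photo_hash(p) in KNOWN_HASHES
               for p in photo_dir.glob("*.jpg"))

matches = count_matches(Path("Photos"))
if matches >= MATCH_THRESHOLD:
    # Only past the threshold would Apple's servers be able to
    # decrypt the match metadata and trigger a human review.
    print("Threshold exceeded: matches would be flagged to the server.")
else:
    print(f"{matches} match(es): below threshold, nothing is revealed.")
```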
While people who keep such pictures are a menace to society and must be put behind bars, scanning every phone with an algorithm is not a good idea. What if people's private pictures get into the hands of law enforcement agencies or blackmailers? It could lead to major privacy breaches.
So the best solution for Apple users who are worried is to not use iCloud Photos.
Shikha, you've disappointed me in the past, but this time you've got up on the right side of the bed. Since iCloud storage is not end-to-end encrypted, no one should be using it in the first place. You cannot expect multi-billion-dollar companies to be the vanguard of your personal privacy. You have to fight for it yourself.
She is probably afraid of getting caught.
Yeah, well, Google will follow suit. Also, they're rolling it out country by country in accordance with local law, so our number will most probably come at the very end.
Does Indian law permit it?
Why have any offensive photo on your phone? In any case, options like OneDrive and Dropbox let phone users keep all their photos, among other files, encrypted and stored in the cloud only. You can download any of them anytime onto your phone, PC, etc.
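For anyone who wants stronger guarantees than the provider's own encryption, a minimal sketch of encrypting a photo client-side before any upload, assuming the third-party `cryptography` package is installed; the filenames are hypothetical, and the cloud then only ever stores ciphertext it cannot read.

```python
from pathlib import Path
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # keep this key off the cloud entirely
fernet = Fernet(key)

# Encrypt the photo locally; upload only the .enc file.
photo = Path("IMG_0001.jpg").read_bytes()
ciphertext = fernet.encrypt(photo)
Path("IMG_0001.jpg.enc").write_bytes(ciphertext)

# Later, after downloading the .enc file, the same key restores it.
restored = fernet.decrypt(ciphertext)
assert restored == photo
```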
Not using iCloud Photos is not a solution. The algorithm will still scan iPhones, so not using an iPhone at all is the only solution.