Recently, Apple announced that it will deploy a new algorithm, NeuralMatch, to monitor iMessages and images on its devices. The ostensible purpose is to scan for photos containing nudity sent to or by children, and for images of nude or seminude children. If a threshold number of suspect images is backed up to an iCloud account, they will be decrypted and inspected, and the user will be reported to law enforcement. Police could then investigate or prosecute the user for possession of child pornography, child sex abuse, or other offenses. But, well-intentioned as the motive may sound, this is a matter of grave concern for privacy. Once such surveillance begins, it opens the gates for other tech firms to follow suit and, worse, for warrantless scans serving nefarious government purposes.
Coming from Apple, this is a curious development in the U.S. Although Apple has bent over backward to satisfy the Chinese government's surveillance and censorship demands, it has vehemently resisted assisting the U.S. government. It has refused to unlock cellphones for criminal investigations and prosecutions, citing the need to protect its customers' data and privacy. It has received, and objected to, at least 10 federal court requests to extract data from locked iPhones. Yet now, in a complete turnabout, if Apple thinks (or its algorithm decides) that certain images are illegal, it will cooperate with the authorities.