Apple has positioned itself as a protector and advocate of its customers’ privacy. That stance has been a good selling point against market competitors who trade freely in their own customers’ data.
But as Cato’s Julian Sanchez writes, that pro-privacy angle took a big hit with Apple’s foray into built-in surveillance. It’s all for the children, of course:
The really dangerous tool is the second one Apple announced—its “CSAM detection” system. This one operates, not by attempting to detect nudity in novel images, but by scanning the user’s Photo Library for matches against a table of “hash values” of known child abuse images maintained by the National Center for Missing and Exploited Children (NCMEC). (A “hash value” is a short string derived by running a larger file through a mathematical algorithm, and routinely used to quickly determine whether two files are identical.) If a certain “threshold” of matches is reached—indicating a collection of child abuse imagery—the device notifies Apple, which in turn reports the user to NCMEC (and, by extension, the authorities). At least initially, this scan will only run on photos that have been designated for backup to Apple’s iCloud service—which is to say, photos that the user had already chosen to “share” with Apple. This is, however, a design choice rather than a technical limitation: The system could easily be altered to scan all images in the library—and, for that matter, to scan for matches to content other than child abuse images. As with the parental control tool for Messages, this system of “Client Side Scanning” circumvents any encryption that may protect files in transit by running the scans on the device itself, where the files are unencrypted while the device is unlocked.
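To make the mechanism Sanchez describes concrete, here is a minimal sketch in Python of hash-list matching with a reporting threshold. It is an illustration only: it uses an exact SHA-256 digest as the hash, whereas Apple’s actual system uses a perceptual hash (called NeuralHash) plus cryptographic safety vouchers, and the hash list, photo directory, and threshold value below are all hypothetical.

```python
import hashlib
from pathlib import Path

# Illustration only: a real system would use a perceptual hash rather than an
# exact SHA-256 digest, and the hash list, photo directory, and threshold
# below are all hypothetical placeholders.

KNOWN_HASHES = {
    "0" * 64,  # placeholder digest; the real list is supplied by a central authority
}
REPORT_THRESHOLD = 3  # how many matches before the device "phones home"


def file_hash(path: Path) -> str:
    """Identical files always yield identical digests, so a set lookup suffices."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path) -> None:
    matches = sum(1 for photo in photo_dir.glob("*.jpg")
                  if file_hash(photo) in KNOWN_HASHES)
    # Only once the threshold is crossed does the device report its owner.
    if matches >= REPORT_THRESHOLD:
        print(f"{matches} matches found; notifying the operator")


if __name__ == "__main__":
    scan_library(Path("~/Pictures").expanduser())
```

The point of the sketch is how little is specific to child abuse imagery: swap in a different hash list and the same loop scans for any content a government cares to prohibit.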
Who – aside from various creepers, weirdos, and lawbreakers – could possibly be against a technology that protects kids? Anyone who knows that this tech will never be limited just to stopping child predators:
Described more abstractly and content neutrally, here’s what Apple is implementing: A surveillance program running on the user’s personal device, outside the user’s control, will scan the user’s data for files on a list of prohibited content, and then report to the authorities when it finds a certain amount of content on the list. Once the architecture is in place, it is utterly inevitable that governments around the world will demand its use to search for other kinds of content—and to exert pressure on other device manufacturers to install similar surveillance systems.
Apple is, of course, already under significant government pressure to weaken encryption for the convenience of law enforcement—and this announcement is doubtless an attempt to relieve that pressure by demonstrating that the company is dedicated to combatting a particularly loathsome misuse of its products.
Apple, then, appears to be offering the state an olive branch – or a diversion – in hopes it will not ask for even more access to users’ devices for a host of reasons unrelated to criminal activity. Apple has decided to play a game it long resisted, and one it will ultimately lose:
The core question is whether we wish to normalize the sale of personal computing devices that come preinstalled with spyware outside the control of the user and owner, however noble the purpose to which that spyware is initially put. The answer free societies have given to that question for the past five decades is the right one: No.