Apple executive Craig Federighi stoutly defended the company’s decision to check images stored in the cloud against child pornography databases.
In an interview with The Wall Street Journal, Federighi said that Apple will resist any pressure from governments to expand this capability by employing “multiple levels of auditability.”
“If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images,” Federighi said. “This isn’t doing some analysis for, did you have a picture of your child in the bathtub?”
He added, “This is literally only matching on the exact fingerprints of specific known child pornographic images.”
It is reassuring that Apple is sensitive to the backlash from civil liberties groups and is matching hashes, the digital fingerprints that identify a precise image, against a discrete database. PPSA, however, remains concerned that even continuous and rigorous audits by Apple will not catch unauthorized uses of this new capability for nefarious purposes. We worry that Apple may not be able to resist demands from U.S. intelligence and law enforcement agencies to expand the targets of these capabilities. And we worry that when an American company creates a potent new surveillance capability, it will be unable to resist entreaties by authoritarian regimes with big markets, like China, to hand over the keys.
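The matching scheme Federighi describes can be sketched in a few lines. This is a hypothetical illustration only: the `fingerprint` function below is a plain cryptographic digest standing in for Apple's actual perceptual-hash system (NeuralHash), and the database, photo inputs, and threshold value are invented for the example.

```python
# Hypothetical sketch of threshold-based fingerprint matching, loosely
# illustrating the scheme Federighi describes. The hash function and
# threshold are illustrative stand-ins, not Apple's actual system.
import hashlib

THRESHOLD = 30  # "a threshold of something on the order of 30 known ... images"

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: here, an exact SHA-256 digest,
    so only bit-identical images match."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(photos: list[bytes], known_hashes: set[str]) -> int:
    """Count how many photos match fingerprints in the known database."""
    return sum(1 for p in photos if fingerprint(p) in known_hashes)

def account_flagged(photos: list[bytes], known_hashes: set[str]) -> bool:
    """Only when the match count meets the threshold does the system
    learn anything about the account."""
    return count_matches(photos, known_hashes) >= THRESHOLD
```

Note that an exact digest like SHA-256 only "matches on the exact fingerprints" of identical files; a perceptual hash additionally tolerates resizing and re-encoding, which is why such systems use one.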
Apple has long been a leader in digital privacy, which makes this move all the more jarring. As the Center for Democracy and Technology put it: "What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in. It seems so out of step from everything that they had previously been saying and doing."