We commend Apple for delaying the rollout of its iPhone update to scan images and compare them to a database of Child Sexual Abuse Material (CSAM). While Apple's motives are widely recognized as good, there are more than a few devils in the details.
Apple pledged to use digital tools that would compare encrypted images stored in the cloud against CSAM databases without giving even Apple access to the images themselves. Apple also pledged to put a rigorous auditing process in place. An excerpt from TechCrunch explains the hidden dangers of this approach:

"[S]ecurity experts and privacy advocates have expressed concern that the system could be abused by highly resourced actors, like governments, to implicate innocent victims or to manipulate the system to detect other materials that authoritarian nation states find objectionable."

TechCrunch also reports that researchers were able to trick the system into treating two entirely different images as identical. Under this scenario, it is easy to imagine a repressive foreign government silencing a critic in the United States or elsewhere by framing him or her as a collector of child pornography.

Over the next few months, Apple, outside experts and civil liberties organizations should join forces to look for ways to weed out illegal images without opening a backdoor into consumer accounts.
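To see why "two entirely different images" can look identical to such a scanner, consider how perceptual image hashing works in general. The sketch below is not Apple's NeuralHash; it is a simple "average hash" written only to illustrate the underlying idea, and the file names are hypothetical placeholders. If an attacker can craft an image whose hash matches that of a flagged one, the matcher cannot tell them apart.

```python
# A minimal sketch of perceptual hashing, assuming a generic "average hash".
# This is NOT Apple's NeuralHash; it only illustrates why two visually
# different images can map to the same fingerprint.

from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink and grayscale the image, then set one bit per pixel
    depending on whether that pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits between two hashes; 0 means the matcher
    considers the images 'the same'."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    h1 = average_hash("harmless_photo.jpg")
    h2 = average_hash("crafted_adversarial_photo.jpg")
    # An attacker who forces this distance to 0 makes a harmless image
    # indistinguishable from a flagged one, as the researchers showed.
    print(hamming_distance(h1, h2))
```

The point of the sketch is only that the scanner never compares pictures, it compares fingerprints, so anyone who can engineer a fingerprint collision can frame the owner of an innocent image.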