Last year, we reported on Apple’s plan to open a digital backdoor by scanning users’ devices for CSAM, or Child Sexual Abuse Material. We warned that such a content-flagging system was not just an invasion of people’s privacy, but could open a backdoor for China to use the technology to persecute dissidents and spy on Americans.
Throughout the privacy discussion, the European Union has generally led the world in pushing for higher standards for digital privacy, often challenging the United States to follow its lead.
Now, in the necessary drive to detect and prosecute those who abuse children, the EU Commission is driving a proposal that could result in the scanning of every private message, photo, and video to detect CSAM. It is also proposing using software to seek out adults engaged in “grooming” children to be victimized.
Every decent person agrees that we need to be aggressive in rooting out and prosecuting adults who exploit children. What could go wrong with the EU proposal?
Joe Mullin of the Electronic Frontier Foundation reports that the Commission “wants to open the intimate data of our digital lives up to review by government-approved scanning software, and then checked against databases that maintain images of child abuse.” Private digital conversations, even those of Americans, will no longer be truly private.
Problem: The detection software produces far more false positives than genuine matches.
Mullin writes: “Once the EU votes to start running the software on billions more messages, it will lead to millions more false accusations. These false accusations get forwarded on to law enforcement agencies. At best, they’re wasteful; they also have potential to produce real-world suffering … That is why we shouldn’t waste efforts on actions that are ineffectual and even harmful.”
We would add that PPSA is concerned that technology developed for an admirable purpose will soon be used for any purpose.