Project for Privacy and Surveillance Accountability (PPSA)

NEWS & UPDATES

Is Apple Opening a Backdoor for Dictators and Overzealous Law Enforcement?

8/13/2021

 
When Apple unveiled new technology tools to combat child sexual abuse, the move was met with almost universal criticism and concern from civil liberties groups, from the Electronic Frontier Foundation to the Center for Democracy and Technology to the ACLU.
 
Apple plans to scan images stored in the cloud by iPhones, iPads, and Macs, enabling a machine-learning algorithm to compare those images against entries in the database of the National Center for Missing & Exploited Children (NCMEC). It also plans to give parents the option to be informed if their child under 13 sends or receives explicit images.
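At its core, the approach is hash-list matching: each image is reduced to a compact fingerprint, and that fingerprint is checked against a list built from known abuse images. The Python sketch below illustrates only that general idea (it is not Apple's actual protocol, which reportedly pairs a perceptual "NeuralHash" with cryptographic private set intersection and a match threshold before human review), and every name and number in it is illustrative.

    import hashlib
    from typing import Iterable, Set

    def image_fingerprint(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash (e.g., pHash or NeuralHash), which
        # maps visually similar images to matching digests. A cryptographic
        # hash is used here only so the sketch runs without image libraries.
        return hashlib.sha256(image_bytes).hexdigest()

    def scan_library(images: Iterable[bytes], known_hashes: Set[str],
                     threshold: int = 30) -> bool:
        # Flag an account once enough images match the opaque hash list.
        # The threshold mimics Apple's stated idea of requiring multiple
        # matches before human review; 30 is illustrative, not Apple's number.
        matches = sum(1 for img in images
                      if image_fingerprint(img) in known_hashes)
        return matches >= threshold

Note what the client never sees: the images behind the hash list. A user's device can confirm that matching happens, but not what the list actually targets.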
 
Apple’s reasons for acting are commendable. Nobody will rush to defend producers or consumers of child pornography or others who abuse children. But Apple’s approach to searching private data extends far beyond such bad actors, despite Apple’s unenforceable promise not to search for other disfavored images or data, or otherwise exploit its new methods for rummaging through your phones, iPads, and other devices.
 
Because devices and the cloud are deeply integrated, Apple will of necessity be reaching into our phones and other devices, as well as our personal data stored in the cloud. Any time a company deploys surveillance tools that governments, institutions, and bad actors would love to exploit, there’s cause for concern about unintended consequences. It remains to be seen whether Apple’s new image-scanning technology can withstand government overreach or exploitation by criminals.
 
Consider the warning of Jennifer Granick, surveillance and cybersecurity counsel for the ACLU’s Speech, Privacy, and Technology Project, who told Gizmodo:
 
“However altruistic its motives, Apple has built an infrastructure that could be subverted for widespread surveillance of the conversations and information we keep on our phones. The CSAM (Child Sexual Abuse Material) scanning capability could be repurposed for censorship or for identification and reporting of content that is not illegal depending on what hashes the company decides to, or is forced to, include in the matching database. For this and other reasons, it is also susceptible to abuse by autocrats abroad, by overzealous government officials at home, or even by the company itself.”
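Her point about the matching database is mechanical, not rhetorical: a hash matcher neither knows nor cares what its hashes depict. Continuing the hypothetical sketch above, repurposing the system for censorship is a two-line change; "protest_flyer.png" stands in for any perfectly legal image a government might demand be added.

    # Continuing the sketch above: the matcher flags whatever the
    # list-maker hashes. The file name is purely hypothetical.
    known_hashes = set()  # in reality, supplied by the vendor, opaque to users
    with open("protest_flyer.png", "rb") as f:
        known_hashes.add(image_fingerprint(f.read()))
    # scan_library() will now report users who possess the flyer,
    # without a single change to the scanning code itself.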
 
The potential for illicit uses by state and non-state actors is staggering: 

  • Governments that outlaw homosexuality might use the technology to restrict apparent LGBTQ content and to persecute people.

  • Bad actors could use the technology to identify and censor art, satirical images, protest flyers, counter-speech, and organized opposition to oppression – or use that information to frame innocent people deemed political or personal enemies, or even just nuisances.

  • Photos of human rights violations and war crimes could escape documentation, or be flagged and deleted forever.

  • The right to free speech and expression could ultimately be undermined if the U.S. government takes this loophole and, as it has done so many times before, widens it to the size of the J. Edgar Hoover Building.
 
Can Apple’s technology truly differentiate between dangerous content and art, memes, or other legal images? The Center for Democracy and Technology (CDT) has found that surveillance technologies are notoriously error-prone and difficult to audit. And even if scanning is supposedly limited to known images, that limit holds only at Apple’s whim: nothing stops the company from adding a broader set of images or more general search criteria.
 
Consider the 2018 case of Tumblr’s faulty algorithm meant to filter adult content, which flagged images of puppies, Garfield, and Big Bird, among others, as explicit. In another case, technology originally built to scan and hash child sex abuse imagery was repurposed to create a database of “terrorist” content. What about fentanyl dealers? Gangs? White-collar criminals? Folks who are pro-abortion? Or anti-abortion? Never Trumpers? Trump supporters? Once the door is open, the temptation to widen it for other supposedly altruistic, or merely fashionable, purposes is almost endless.
 
Apple is on the brink of breaking its own oft-proclaimed promises of encryption and “deep commitment to user privacy.” By all appearances, the company is giving itself more knowledge of customer content and seems poised to use semantics to gloss over this dramatic departure from its stance on privacy. Why? 
 
Facebook last year reported more than 20 million cases to NCMEC; Apple reported 265. It seems Apple is, at least in part, acting out of concern that iCloud not be used as a resource for pedophiles and human traffickers. A phone company might say much the same as an excuse to wiretap and monitor all calls for troubling keywords. The problem is, as the Center for Democracy and Technology put it, the “mechanism that will allow Apple to scan images in iMessages is not an alternative to a backdoor – it is a backdoor.”
 
Tim Cook and his leadership team should reconsider this roll-out, or at least refine their approach. Otherwise, Apple could unleash unintended consequences that hurt its brand, as well as democracy. As Machiavelli said, “Learn the way to hell in order to steer clear of it.”
