When it comes to digital privacy, Americans are like a well-dressed person caught in the rain without an umbrella. At first, you try to wait it out under an eave. Then you accept getting a little wet. Finally, when your clothes are thoroughly soaked, you just give up.
For years, Americans have accepted that we couldn't get any wetter. The social media services and apps we use track and sell our location history, our contacts, our communications, our purchases and (most revealing) our web searches. These data points, like the dots in a pointillist painting, combine into a detailed portrait of each user. These portraits are then sold by data brokers to government agencies and commercial entities.
A recent Apple commercial portrayed this process by putting a young woman's virtual self on an auction block. In the ad, the heroine Ellie turns on Apple's privacy features, vaporizing her would-be auctioneers. But such controls on a smartphone cover only a small portion of the torrents of information that are collected about us and sold wholesale.
So just when many are ready to declare the death of privacy, a bicameral, bipartisan group of legislators has put forward a discussion draft of the American Data Privacy and Protection Act (ADPPA). In a House hearing on Tuesday morning, this bill drew robust discussion from civil rights groups, digital reformers, and industry-allied organizations. This legislation is the first attempt at a comprehensive, national approach to, in the words of House Energy and Commerce Committee Chairman Rep. Frank Pallone, putting "consumers back in control of their data and protecting their privacy."
Under ADPPA, companies would have to obtain consumers' consent before collecting, processing or transferring sensitive personal information. Affirmative consent would be required before the data of children between ages 13 and 17 could be transferred. The Federal Trade Commission (FTC) would form a Youth Privacy and Marketing Division to police the use of children's data.
Best of all, the shadowy world of data brokers would be exposed to sunlight, with a public online registry created by the FTC and third-party audits of how these brokers share information with others.
ADPPA would preempt some state privacy laws, while granting exemptions for the Illinois Biometric Information Privacy Act (recently used to extract a sweeping settlement over the privacy practices of facial recognition provider Clearview AI) and the California Privacy Rights Act. The recent privacy laws of other states would be preempted, a carve-out that Govtech.com writes "reeks of backroom dealing."
The current draft includes a limited private right of action, which would allow individuals to bring suits for privacy violations after giving industry four years to adjust. FTC enforcement would be strengthened, and state attorneys general would be empowered to act against data holders who violate ADPPA. Companies would be given a limited right to cure a problem before individuals could seek injunctive relief.
The discussion that took place in the House Subcommittee on Consumer Protection and Commerce reveals serious legislation with major issues to resolve. Here are a few of them.
How far should preemption of state privacy laws go?
Colorado, Texas, Virginia, Utah, and Connecticut have passed their own privacy laws. Will they eventually be excluded from preemption along with those of California and Illinois? If they are, do we run the risk of balkanizing the internet?
“American consumers and businesses deserve the clarity and certainty of a single federal standard for privacy,” said former FTC Commissioner Maureen Ohlhausen.
Can we protect personal data by degrees of sensitivity without degrading the ability of digital commerce to function?
One goal of the bill is data minimization, which would require companies to use only the data needed for a given transaction. But can a law define the limits of what is needed?
John Miller of the Information Technology Industry Council noted that one provision, covering “information identifying an individual’s online activities over time or across third party websites or online services,” could create restrictions for routine browsing. Or, as Ohlhausen put it, the bill “creates uncertainty for routine operational uses of information that are necessary to serve customers and operate a business.”
How broad should the private right of action be for individuals?
“The current proposal inserts several procedural hurdles that will not reduce litigation costs but will block injured individuals from having their day in court,” said David Brody, managing attorney of the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights Under Law. “The private right of action in the Act is weak and difficult to enforce.”
John Miller countered, “while it is true neither punitive nor statutory damages are permitted” under the bill’s private right of action, “the availability of attorney’s fees could encourage the filing of borderline meritorious cases by specialized attorneys charging exorbitant hourly rates.”
Should government purchases of Americans’ personal data be included in the bill?
One issue that was not addressed on Tuesday is the frequent sale of Americans’ personal data to the government, a problem addressed by the proposed Fourth Amendment Is Not For Sale Act. Any privacy solution should look beyond the private uses of data by businesses to those of law enforcement and intelligence agencies. After all, only the government can use your information to bang down your door at dawn and arrest you.
There were further debates about how the bill might impact the ability of companies to handle cybersecurity threats, and whether small businesses would get tagged with onerous provisions aimed at tech giants. The legislative process in the House and Senate will have to untangle these and many other knotty issues to make this law workable. Yet the hearing room echoed with statements of determination by leaders in both parties to make a national privacy law a reality.
With the pandemic under control and the summer solstice two weeks away, millions of Americans are once again daring to travel to foreign destinations. Many might be concerned about world events intruding on the ability to travel. But few are ready for how intrusive government surveillance of our personal digital devices can be at the U.S. border.
This is a good time, then, to turn to the Electronic Frontier Foundation, and the primer written by Sophia Cope, Amul Kalia, Seth Schoen and Adam Schwartz on the legal, constitutional, and practical aspects of the government’s digital surveillance at the border. This paper, now a few years old, remains a thorough account of what happens at international airports, seaports and entry stations at U.S. land borders with Canada and Mexico.
On the practical side, EFF’s paper advises travelers on how to use encryption and cloud storage to prepare data for the U.S. border. It explains how Customs and Border Protection can worm past encryption and under some circumstances view your data on the cloud. It advises travelers on how to avoid behavior that attracts suspicion and how to calmly deal with requests for passwords into one’s devices.
The border is a privacy disaster because the sum of federal courts’ decisions leaves the Fourth Amendment at the border more of an aspiration than a constitutional stricture on government behavior.
This hash of a doctrine arises out of the Supreme Court’s application of a “border search exception” to protect the integrity of the U.S. border. Courts have parsed this doctrine to make distinctions between “routine” searches that do not require suspicion of a particular individual, and “highly intrusive” searches that impact the “dignity and privacy” of individuals (and yes, that’s exactly what it sounds like). The latter kind of search requires an “individualized suspicion.”
In a grey zone are searches of Americans’ and other travelers’ digital devices. PPSA has reported on the routine sweeping of Americans’ laptops, cellphones, tablets and other digital devices when their owners return to the United States from abroad. Electronic devices are searched at the border tens of thousands of times every year.
In denying police the ability to examine all the contents of a suspect’s cellphone without a warrant in Riley v. California (2014), the Supreme Court made an eloquent defense of digital technology as holding “the privacies of life.” Let us hope the courts take a closer look and find that this is just as true at the border.