The U.S. House of Representatives passed a major transparency measure by voice vote tonight. This amendment to the National Defense Authorization Act, offered by Rep. Sara Jacobs (D-CA) and Rep. Warren Davidson (R-OH), will require the Department of Defense to report the number of times it purchases the internet browsing and phone location data of Americans from private data brokers. The report will also include a general accounting of how the government uses this information.
PPSA commends Reps. Jacobs and Davidson on their steady leadership and articulate advocacy. Tonight’s success should provide momentum for the passage of the Fourth Amendment Is Not for Sale Act.
"Only Congress and the American people can decide whether we will remain a free society or succumb to technological totalitarianism."
A must-read opinion piece in Real Clear World by our President, Erik Jaffe.
You might think that, given the severe restrictions on sharing Americans’ private health information under HIPAA, it would be illegal to sell data concerning your personal health. You might also think that, given the warrant requirement the U.S. Supreme Court imposed on cellphone location data in the Carpenter decision, it is also illegal to sell your location history tracked by your cellphone.
And, of course, you’d be wrong.
The $200 billion private data industry routinely sells not only your location information, but also your health data collected by apps and social media platforms. Not only can a large ecosystem of corporations buy your data, so can – and does – the government. From the FBI and the IRS to the Department of Homeland Security and other law enforcement and intelligence agencies, the government routinely buys this data.
Now Democratic senators are rushing to make this practice illegal under the Health and Location Data Protection Act, sponsored by Sen. Elizabeth Warren (D-MA), and co-sponsored by Sen. Ron Wyden (D-OR), Sen. Patty Murray (D-WA), Sen. Sheldon Whitehouse (D-RI), and Sen. Bernie Sanders (I-VT).
This bill would ban data brokers from selling or transferring location data and health data under rules to be promulgated by the Federal Trade Commission. It would empower the FTC, state attorneys general and individuals to bring suits to enforce the provisions of the law. And it would add $1 billion in funding to the Federal Trade Commission budget.
So will this bill pass in this Congress? Not a chance. Since the Dobbs opinion from the U.S. Supreme Court that overturned Roe v. Wade, the Health and Location Data Protection Act has been spun by Democrats as a means of protecting women seeking abortions. It would protect, in Sen. Murray’s words, women from “extremist Republican lawmakers [who] work around the clock to criminalize essential health services.” From the pro-choice point of view, it is natural to include women’s reproductive freedom in the bill. From the pro-life point of view, supporting a bill that is being touted by others as a protection for reproductive freedom could be seen as supporting abortion.
Before Dobbs, such a bill would have had an excellent chance of securing bipartisan support and passage. Now that it is caught up in abortion politics, it has become a partisan talking point. It seems unlikely that either party will relent, meaning the larger issue of our lack of privacy in health and location data will remain caught up in the tug of war between pro-choice and pro-life forces.
The constructive course of action, one that can be taken by members of both parties, is to pass the Fourth Amendment Is Not for Sale Act, which has strong bipartisan support in both the House and Senate. This bill would at least require the government to obtain a probable cause warrant before examining our private information purchased from data brokers.
Letter Follows Leak About “Backdoor to Access User Data”
TikTok’s growth in the American market is explosive. In the first quarter of 2022 alone, the short-video app was downloaded 19 million times. In all, TikTok has about 80 million U.S. users – about one-fourth of the U.S. population.
TikTok is owned by a Chinese company, ByteDance. This is a matter of worry in Washington, since explicit Chinese law requires Chinese companies to share information with the intelligence agencies of the Chinese Communist Party. Given the massive torrents of data TikTok collects on its users, the Trump Administration proposed forcing ByteDance to sell off TikTok. The company allayed these concerns by promising to seal off any sensitive data from China. With TikTok storing U.S. user data in the United States, with backups in Singapore, the issue seemed resolved.
Then on June 17, Buzzfeed reported that leaked audio from dozens of TikTok internal meetings revealed that employees expressed concern that U.S. user data was being accessed by China.
“I feel like these tools, there’s some backdoor to access user data in almost all of them,” one external auditor said.
“Everything is seen in China,” said a member of TikTok’s “trust and safety” department.
Another insider referred to a Beijing-based engineer as a “Master Admin” who has “access to everything.”
Now Brendan Carr, Republican member of the Federal Communications Commission, has written to CEOs Tim Cook and Sundar Pichai to ask that Apple and Google remove TikTok from their app stores.
Carr wrote that TikTok’s image as “an app for sharing funny videos or memes” is just “the sheep’s clothing.” He added:
“TikTok collects everything from search and browsing histories to keystroke patterns and biometric identifiers, including faceprints – which researchers have said might be used in unrelated facial recognition technology – and voiceprints. It collects location data as well as draft messages and metadata, plus it has collected the text, images, and videos that are stored on a device’s clipboard.”
TikTok has already paid $92 million to settle lawsuits alleging that it clandestinely transferred vast amounts of Americans’ user data to China. India has banned TikTok, as have the U.S. military and many federal agencies. With so much criticism, TikTok is now working furiously on a project, codenamed “Project Texas,” to work out a deal with Oracle and the Committee on Foreign Investment in the United States to store sensitive user information exclusively on U.S. servers.
Commissioner Carr remains skeptical. “TikTok has long claimed that its U.S. user data has been stored on servers in the U.S., and yet those representations provided no protection against the data being accessed from Beijing.”
Some might find Carr’s demands on private businesses to be heavy-handed. Whether or not Apple and Google should remove TikTok from their app stores, at a minimum Americans need to be aware that their personal data might be at risk.
When it comes to digital privacy, the American consumer is like a well-dressed person caught in the rain without an umbrella. At first, you try to wait it out under an eave. Then you accept getting a little bit wet. Finally, when your clothes are thoroughly soaked, you just give up.
Americans have long accepted that we couldn’t get any wetter. The social media services and apps we use track and sell our location history, our contacts, our communications, our purchases and (most revealing) our web searches. These data points, like the dots in a pointillistic painting, create a detailed portrait of each user. These portraits are then sold by data brokers to government agencies and commercial entities.
A recent Apple commercial portrayed this process by putting a young woman’s virtual self on an auction block. In the ad, the heroine Ellie turns on Apple’s privacy devices, vaporizing her would-be auctioneers. But such controls on a smartphone only involve a small portion of the torrents of information that are collected about us and sold wholesale.
So just when many are ready to declare the death of privacy, a bicameral, bipartisan group of legislators has put forward a discussion draft of the American Data Privacy and Protection Act (ADPPA). In a House hearing on Tuesday morning, this bill drew robust discussion from civil rights groups, digital reformers, and industry-allied organizations. This legislation is the first attempt at a comprehensive, national approach that would, in the words of House Energy and Commerce Committee Chairman Rep. Frank Pallone, put “consumers back in control of their data” and protect their privacy.
Under ADPPA, companies would have to obtain consumers’ consent to collect, process, or transfer sensitive personal information. Affirmative consent would be required before the data of children between ages 13 and 17 could be transferred. The Federal Trade Commission (FTC) would form a Youth Privacy and Marketing Division to police the use of children’s data.
Best of all, the shadowy world of data brokers would be exposed to sunlight, with a public online registry created by the FTC and third-party audits of how these brokers share information with others.
ADPPA would preempt some state privacy laws, while granting exemptions for the Illinois Biometric Information Privacy Act (recently used to extract a sweeping settlement over the privacy practices of facial recognition provider Clearview AI) and California’s Privacy Rights Act. Other states with recent privacy laws are preempted, an arrangement Govtech.com writes “reeks of backroom dealing.”
The current draft includes a limited private right of action, which would allow individuals to bring suits for privacy violations after giving industry four years to adjust. Federal Trade Commission enforcement would be strengthened, and state attorneys general would be empowered to act against data holders who violate ADPPA. Companies would be given a limited right to cure a problem before individuals could seek injunctive relief.
The discussion that took place in the House Subcommittee on Consumer Protection and Commerce reveals serious legislation with major issues to resolve. Here are a few of them.
How far should preemption of state privacy laws go?
Colorado, Texas, Virginia, Utah, and Connecticut have passed their own privacy laws. Will they eventually be excluded from preemption along with those of California and Illinois? If they are, do we run the risk of balkanizing the internet?
“American consumers and businesses deserve the clarity and certainty of a single federal standard for privacy,” said former FTC Commissioner Maureen Ohlhausen.
Can we protect personal data by degrees of sensitivity without degrading the ability of digital commerce to function?
One goal of the bill is data minimization, which would limit companies to using only the data needed for a given transaction. But can a law define the limits of what is needed?
John Miller of the Information Technology Industry Council noted that one provision, covering “information identifying an individual’s online activities over time or across third party websites or online services,” could create restrictions for routine browsing. Or, as Ohlhausen put it, the bill “creates uncertainty for routine operational uses of information that are necessary to serve customers and operate a business.”
How broad should the private right of action be for individuals?
“The current proposal inserts several procedural hurdles that will not reduce litigation costs but will block injured individuals from having their day in court,” said David Brody, managing attorney of the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights Under Law. “The private right of action in the Act is weak and difficult to enforce.”
John Miller countered, “while it is true neither punitive nor statutory damages are permitted” under the bill’s private right of action, “the availability of attorney’s fees could encourage the filing of borderline meritorious cases by specialized attorneys charging exorbitant hourly rates.”
Should government purchases of Americans’ personal data be included in the bill?
One issue that was not addressed on Tuesday is the frequent sale of Americans’ personal data to the government, a problem addressed by the proposed Fourth Amendment Is Not for Sale Act. Any privacy solution should look beyond the private uses of data by businesses to those of law enforcement and intelligence agencies. After all, only the government can use your information to bang down your door at dawn and arrest you.
There were further debates about how the bill might impact the ability of companies to handle cybersecurity threats, and whether small businesses would get tagged with onerous provisions aimed at tech giants. The legislative process in the House and Senate will have to untangle these and many other knotty issues to make this law workable. Yet the hearing room echoed with statements of determination by leaders in both parties to make a national privacy law a reality.
Opinion piece by PPSA Senior Policy Advisor, Bob Goodlatte, on The Hill.
The FBI searches through databases of foreign communications in a program that Congress created specifically to catch foreign terrorists and spies. But the FBI uses this same program to glean private information about American citizens and our communications. These so-called “U.S. person queries” are transforming one of the most powerful and invasive surveillance authorities — Section 702 of the Foreign Intelligence Surveillance Act — into a means for FBI agents to spy on Americans without a warrant, gutting the Fourth Amendment of the Constitution.