Forbes reports that federal authorities were granted a court order to require Google to hand over the names, addresses, phone numbers, and user activities of internet surfers who were among the more than 30,000 viewers of a post. The government also obtained access to the IP addresses of people who weren’t logged onto the targeted account but did view its video.
The post in question is suspected of being used to promote the sale of bitcoin for cash, which would be a violation of money-laundering rules. The government likely had good reason to investigate that post. But did it have to track everyone who came into contact with it? This is a prime example of the government’s street-sweeper shotgun approach to surveillance. We saw this when law enforcement in Virginia tracked the location histories of everyone in the vicinity of a robbery. A state judge later found that the search meant that everyone in the area, from restaurant patrons to residents of a retirement home, had “effectively been tailed.” We saw the government shotgun approach when the FBI secured the records of everyone in the Washington, D.C., area who used their debit or credit cards to make Bank of America ATM withdrawals between Jan. 5 and Jan. 7, 2021. We also saw it when the FBI, searching for possible foreign influence in a congressional campaign, used FISA Section 702 data – meant to surveil foreign threats on foreign soil – to pull the data of 19,000 political donors. Surfing the web is not inherently suspicious. What we watch online is highly personal, potentially revealing all manner of social, romantic, political, and religious beliefs and activities. The Founders had such dragnet-style searches precisely in mind when they crafted the Fourth Amendment. Simply watching a publicly posted video is not by itself probable cause for a search. It should not compromise one’s Fourth Amendment rights.

Byron Tau – journalist and author of Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State – discusses the details of his investigative reporting with Liza Goitein, senior director of the Brennan Center for Justice's Liberty & National Security Program, and Gene Schaerr, general counsel of the Project for Privacy and Surveillance Accountability.
Byron explains what he has learned about the shadowy world of government surveillance, including how federal agencies purchase Americans’ most personal and sensitive information from shadowy data brokers. He then asks Liza and Gene about reform proposals now before Congress in the FISA Section 702 debate, and how they would rein in these practices.

End-to-end encryption, in which only the sender and recipient have access to a message, is the saving grace of the online world, the last little bit of privacy most of us can expect to have in this era of near-ubiquitous surveillance. Tens of billions of encrypted messages are sent every day between users of WhatsApp, Signal, Apple’s iMessage and many other apps.
The central importance of encryption to privacy is described in an amicus brief by the American Civil Liberties Union, the Center for Democracy and Technology, the Electronic Frontier Foundation, Mozilla, and several other activist groups and corporations. They stand in opposition to a preliminary injunction request by Nevada Attorney General Aaron Ford in his lawsuit to stop Meta from launching a new encrypted version of its Messenger app, ostensibly because it would pose a new threat to the safety of children. The facts are on Meta’s side. End-to-end encryption has been an optional feature of Messenger for eight years. Attorney General Ford ignores the host of other encrypted services millions of Americans use, singling out Meta as a test case. If he were to succeed in breaking open Messenger’s encryption, the attorney general would in essence be setting a precedent for the nation, maybe even for the world. The clear and passionate language of the civil liberties amicus brief gets to the heart of what is at stake: “Society has long recognized that people thrive when we have the ability to engage in private, unmonitored conversations. Sharing confidences enables people to form friendships and intimate relationships, obtain information about sensitive matters, and construct different identities depending on the audience. We know this from our own lives, whether engaging in pillow talk, meeting a friend for a walk, or forming an invitation-only club. Important, human things happen when we can be confident that no one is listening in.” Nothing about end-to-end encryption prevents law enforcement from accessing the message from either the recipient or the sender. But preventing companies from providing security, as the Nevada AG seeks to do, creates security risks from bad actors, including both criminals and government officials who would abuse their power by illegally accessing messages. 
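The core idea behind end-to-end encryption can be sketched in a few lines. What follows is a toy, deliberately insecure Diffie-Hellman exchange with tiny parameters – real messengers use vetted protocols such as the Signal protocol, never code like this – but it shows why a relay server that sees everything in transit still cannot derive the key the two endpoints share.

```python
# Toy sketch of the end-to-end idea: each party derives a shared secret
# that the relaying server never sees. Insecure classroom parameters --
# real apps use vetted protocols with far larger values.
import hashlib
import secrets

P = 0xFFFFFFFB  # toy prime modulus (far too small for real use)
G = 5           # toy generator

def keypair():
    priv = secrets.randbelow(P - 2) + 1  # private value, kept on-device
    pub = pow(G, priv, P)                # public value, safe to transmit
    return priv, pub

def shared_key(my_priv, their_pub):
    secret = pow(their_pub, my_priv, P)  # same value on both ends
    return hashlib.sha256(str(secret).encode()).digest()

# Alice and Bob exchange only public values; the relay sees those too,
# but cannot derive the shared key without a private value.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_key(a_priv, b_pub) == shared_key(b_priv, a_pub)
```

The relay sees `a_pub` and `b_pub`, but recovering the key from them requires solving a discrete logarithm – the problem real deployments make computationally infeasible with much larger parameters.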
The brief quotes respected child protection organizations on how encrypted channels protect children from violent family members, stalkers, and predators, including in parts of the world torn by armed conflict. In 2015, hackers stole the details of five million customers from a children’s technology and toy firm, including chat logs between parents and children. Because those chats were unencrypted, the leak gave criminals the names, ages, and addresses of millions of children. The amici also write of how onslaughts against privacy like the Nevada attorney general’s break with American tradition. They write: “In any other era, a claim that government may obligate us to record and preserve our conversations, just in case investigators wanted to review them later, would be laughably ridiculous. It would simply have been beyond the pale to suggest that people could be required to record their conversations in a language that law enforcement could readily understand and access. Basic conversational privacy was assumed, and rightly so.”

Legal precedent is also on the side of civil liberties. The Ninth Circuit recognized encryption’s importance “to reclaim some portion of the privacy we have lost” 25 years ago in rejecting the U.S. government’s export restrictions on strong cryptography. (See EFF on Bernstein v. Department of Justice.)

While advocates of privacy have a solid chance of prevailing in state court in Las Vegas, encryption is endangered across the pond by Section 122 of the United Kingdom’s Online Safety Act, passed in late 2023. The law requires companies to use technology that would scan users’ messages to make sure they are not transmitting illegal content, like Child Sexual Abuse Material. Doing this without breaking end-to-end encryption is currently impossible. The UK’s internet regulator, Ofcom, has relented in requiring content monitoring, for now, for the simple reason that such technology does not yet exist.
With developments in AI, however, it might come sooner than we think. Matthew Hodgson, CEO of Element, told WIRED that such scanning tech would undermine encryption and provide “a mechanism where bad actors of any kind could compromise the scanning system in order to steal the data flying around the place.” Anti-encryption regulators need only win in one jurisdiction to threaten the viability of encryption globally – for everyone from women and children hiding from abusive situations to dissidents living under dictatorships. From London to Las Vegas, encryption – and privacy – are at risk.

A federal court has given the go-ahead for a lawsuit filed by Just Futures Law and Edelson PC against Western Union for its involvement in a dragnet surveillance program called the Transaction Record Analysis Center (TRAC).
Since 2022, PPSA has followed revelations about a unit of the Department of Homeland Security that accesses bulk data on Americans’ money wire transfers above $500. TRAC is the central clearinghouse for this warrantless information, recording wire transfers sent or received in Arizona, California, New Mexico, Texas, and Mexico. These personal, financial transactions are then made available to more than 600 law enforcement agencies – almost 150 million records – all without a warrant. Much of what we know about TRAC was unearthed by a joint investigation between the ACLU and Sen. Ron Wyden (D-OR). In 2023, Gene Schaerr, PPSA general counsel, said: “This purely illegal program treats the Fourth Amendment as a dish rag.” Now a federal judge in Northern California has determined that the plaintiffs in Just Futures’ case allege plausible violations of California laws protecting the privacy of sensitive financial records. This is the first time a court has weighed in on the lawfulness of the TRAC program. We eagerly await revelations and a spirited challenge to this secretive program.

The TRAC intrusion into Americans’ personal finances is by no means the only way the government spies on the financial activities of millions of innocent Americans. In February, a House investigation revealed that the U.S. Treasury’s Financial Crimes Enforcement Network (FinCEN) has worked with some of the largest banks and private financial institutions to spy on citizens’ personal transactions. Law enforcement and private financial institutions shared customers’ confidential information through a web portal that connects the federal government to 650 companies that account for two-thirds of U.S. gross domestic product and 35 million employees. TRAC is justified as being ostensibly about the border and the activities of cartels, but it sweeps in the transactions of millions of Americans sending payments from one U.S. state to another.
FinCEN set out to track the financial activities of political extremists, but it pulls in the personal information of millions of Americans who have done nothing remotely suspicious. Groups on the left tend to be more concerned about TRAC, while groups on the right, led by House Judiciary Chairman Jim Jordan, are more concerned about the mass extraction of personal bank account information. The great thing about civil liberties groups today is their ability to look beyond ideological silos and work together as a coalition to protect the rights of all. For that reason, PPSA looks forward to reporting and blasting out what is revealed about TRAC as this case proceeds in open court. Any revelations from this case should sink in on both sides of the aisle in Congress, informing the debate over America’s growing surveillance state.

The reform coalition on Capitol Hill remains determined to add strong amendments to Section 702 of the Foreign Intelligence Surveillance Act (FISA). But will they get the chance before an April 19th deadline for FISA Section 702’s reauthorization?
There are several possible scenarios as this deadline approaches. One of them might be a vote on the newly introduced “Reforming Intelligence and Securing America” (RISA) Act. This bill is a good-faith effort to represent the narrow band of changes that the pro-reform House Judiciary Committee and the status quo-minded House Permanent Select Committee on Intelligence could agree upon. But is it enough? RISA is deeply lacking because it leaves out two key reforms.
The bill does include a role for amici curiae, specialists in civil liberties who would act as advisors to the secret FISA court. RISA, however, would limit the issues these advisors could address, falling well short of the intent of the Senate when it voted 77-19 in 2020 to approve the robust amici provisions of the Lee-Leahy amendment. For all these reasons, reformers should see RISA as a floor, not as a ceiling, as the Section 702 showdown approaches. The best solution to the current impasse is to stop denying Members of Congress the opportunity for a straight up-or-down vote on reform amendments.

How to Tell if You are Being Tracked

Car companies are collecting massive amounts of data about your driving – how fast you accelerate, how hard you brake, and any time you speed. These data are then passed to LexisNexis or another data broker to be parsed and sold to insurance companies. As a result, many drivers with clean records are surprised by sudden, large increases in their car insurance payments.
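The kind of scoring a telematics pipeline might run on connected-car data can be sketched in a few lines – here, counting “hard brake” events from a stream of speed samples. The function name, data, and threshold are invented for illustration; actual brokers’ scoring models are proprietary.

```python
# Hypothetical illustration of telematics scoring: flag moments where
# speed drops faster than a deceleration threshold. All names and
# numbers here are invented for the sketch.

def hard_brake_events(speeds_mph, interval_s=1.0, threshold_mph_per_s=8.0):
    """Count samples where speed fell faster than the threshold."""
    events = 0
    for prev, curr in zip(speeds_mph, speeds_mph[1:]):
        decel = (prev - curr) / interval_s  # positive when slowing down
        if decel > threshold_mph_per_s:
            events += 1
    return events

trip = [35, 36, 35, 22, 10, 9, 9]  # one-second speed samples with an abrupt stop
print(hard_brake_events(trip))     # counts the two steep drops in this trip
```

Even a signal this crude, logged continuously and tied to a vehicle identification number, becomes a behavioral dossier once it is aggregated across months of driving.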
Kashmir Hill of The New York Times reports the case of a Seattle man whose insurance rates skyrocketed, only to discover that this was the result of LexisNexis compiling hundreds of pages on his driving habits. This is yet another feature of the dark side of the internet of things, the always-on, connected world we live in. For drivers, internet-enabled services like navigation, roadside assistance, and car apps are also 24-7 spies on our driving habits. We consent to this, Hill reports, “in fine print and murky privacy policies that few read.” One researcher at Mozilla told Hill that it is “impossible for consumers to try and understand” policies chock-full of legalese. The good news is that technology can make this data gathering as transparent to us as we are to car and insurance companies. Hill advises readers on how to find out what these data brokers know about their driving.
What you cannot do, however, is file a report with the FBI, IRS, the Department of Homeland Security, or the Pentagon to see if government agencies are also purchasing your private driving data. Given that these federal agencies purchase nearly every electron of our personal data, scraped from apps and sold by data brokers, they may well have at their fingertips the ability to know what kind of driver you are. Unlike the private snoops, these federal agencies are also collecting your location histories, where you go, and by inference, who you meet for personal, religious, political, or other reasons. All this information about us can be accessed and reviewed at will by our government, no warrant needed. That is all the more reason to support the inclusion of the principles of the Fourth Amendment Is Not for Sale Act in the reauthorization of the FISA Section 702 surveillance policy.

While Congress debates adding reforms to FISA Section 702 that would curtail the sale of Americans’ private, sensitive digital information to federal agencies, the Federal Trade Commission is already cracking down on companies that sell data, including their sales of “location data to government contractors for national security purposes.”
The FTC’s words follow serious action. In January, the FTC announced proposed settlements with two data aggregators, X-Mode Social and InMarket, for collecting consumers’ precise location data scraped from mobile apps. X-Mode, which can assimilate 10 billion location data points and link them to timestamps and unique persistent identifiers, was targeted by the FTC for selling location data to private government contractors without consumers’ consent. In February, the FTC announced a proposed settlement with Avast, a security software company that sold “consumers’ granular and re-identifiable browsing information” collected through Avast’s antivirus software and browser extensions. What is the legal basis for the FTC’s action? The agency seems to be relying on Section 5 of the Federal Trade Commission Act, which grants the FTC power to investigate and prevent deceptive trade practices. In the case of X-Mode, the FTC’s proposed complaint highlights X-Mode’s statement that consumers’ location data would be used solely for “ad personalization and location-based analytics.” The FTC alleges X-Mode failed to inform consumers that it “also sold their location data to government contractors for national security purposes.” The FTC’s evolving doctrine seems even more expansive, weighing the stated purpose of data collection and handling against its actual use. In a recent blog, the FTC declares: “Helping people prepare their taxes does not mean tax preparation services can use a person’s information to advertise, sell, or promote products or services. Similarly, offering people a flashlight app does not mean app developers can collect, use, store, and share people’s precise geolocation information.
The law and the FTC have long recognized that a need to handle a person’s information to provide them a requested product or service does not mean that companies are free to collect, keep, use, or share that person’s information for any other purpose – like marketing, profiling, or background screening.” What is at stake for consumers? “Browsing and location data paint an intimate picture of a person’s life, including their religious affiliations, health and medical conditions, financial status, and sexual orientation.” If these cases go to court, the tech industry will argue that consumers sign away rights to their private information when they accept the terms and conditions of their apps and favorite social media platforms – something we all do routinely, usually without reading. The FTC’s logic points to the common understanding that our data is collected for the purpose of selling us an ad, not handing over our private information to the FBI, IRS, and other federal agencies. The FTC is edging into the arena of the Fourth Amendment Is Not for Sale Act, which targets government purchases and warrantless inspection of Americans’ personal data. The FTC’s complaints are, for the moment, based on legal theory untested by courts. If Congress attaches similar reforms to the reauthorization of FISA Section 702, it would be a clear and hard-to-reverse protection of Americans’ privacy and constitutional rights.

Ken Blackwell, former ambassador and mayor of Cincinnati, has a conservative resume second to none. He is now a senior fellow of the Family Research Council and chairman of the Conservative Action Project, which organizes elected conservative leaders to act in unison on common goals.
So when Blackwell writes an open letter in Breitbart to Speaker Mike Johnson warning him not to try to reauthorize FISA Section 702 in a spending bill – which would terminate all debate about reforms to this surveillance authority – you can be sure that Blackwell was heard.
“The number of FISA searches has skyrocketed with literally hundreds of thousands of warrantless searches per year – many of which involve Americans,” Blackwell wrote. “Even one abuse of a citizen’s constitutional rights must not be tolerated. When that number climbs into the thousands, Congress must step in.” What makes Blackwell’s appeal to Speaker Johnson unique is that he went beyond citing the reform efforts of conservative stalwarts such as House Judiciary Committee Chairman Jim Jordan and Rep. Andy Biggs of the Freedom Caucus. Blackwell also cited the support of the committee’s Ranking Member, Rep. Jerry Nadler, and Rep. Pramila Jayapal, who heads the House Progressive Caucus. Blackwell wrote: “Liberal groups like the ACLU support reforming FISA, joining forces with conservative civil rights groups. This reflects a consensus almost unseen on so many other important issues of our day. Speaker Johnson needs to take note of that as he faces pressure from some in the intelligence community and their overseers in Congress, who are calling for reauthorizing this controversial law without major reforms and putting that reauthorization in one of the spending bills that will work its way through Congress this month.” That is sound advice for all congressional leaders on Section 702, whichever side of the aisle they are on. In December, members of this left-right coalition joined together to pass reform measures out of the House Judiciary Committee by an overwhelming margin of 35 to 2. This reform coalition is wide-ranging, its commitment is deep, and it is not going to allow a legislative maneuver to deny Members their right to a debate.

U.S. Treasury and FBI Targeted Americans for Political Beliefs

The House Judiciary Committee and its Select Subcommittee on the Weaponization of the Federal Government issued a report on Wednesday revealing secretive efforts between federal agencies and U.S.
private financial institutions that “show a pattern of financial surveillance aimed at millions of Americans who hold conservative viewpoints or simply express their Second Amendment rights.”
At the heart of this conspiracy are the U.S. Treasury Department’s Financial Crimes Enforcement Network (FinCEN) and the FBI, which oversaw secret investigations with the help of the largest U.S. banks and financial institutions. They did not lack for resources. Law enforcement and private financial institutions shared customers’ confidential information through a web portal that connects the federal government to 650 companies that account for two-thirds of U.S. gross domestic product and 35 million employees. This dragnet investigation grew out of the aftermath of the Jan. 6 riot in the U.S. Capitol, but it quickly widened to target the financial transactions of anyone deemed suspiciously MAGA or conservative. Last year we reported on how Bank of America volunteered the personal information of any customer who used an ATM card in the Washington, D.C., area around the time of the riot. In this newly revealed effort, the FBI asked financial services companies to sweep their databases for digital transactions with keywords like “MAGA” and “Trump.” FinCEN also advised companies on how to use Merchant Category Codes (MCC) to search through transactions to detect potential “extremists.” Keywords attached to suspicious transactions included the names of recreational stores Cabela’s, Bass Pro Shops, and Dick’s Sporting Goods.
The committee observed: “Americans doing nothing other than shopping or exercising their Second Amendment rights were being tracked by financial institutions and federal law enforcement.” FinCEN also targeted conservative organizations like Alliance Defending Freedom and the Eagle Forum after they were demonized as “hate groups” by a left-leaning London organization, the Institute for Strategic Dialogue. The committee report added: “FinCEN’s incursion into the crowdfunding space represents a trend in the wrong direction and a threat to American civil liberties.” One doesn’t have to condone the breaching of the Capitol and attacks on Capitol police to see the threat of a dragnet approach that lacked even a nod to the concept of individualized probable cause. What was done by the federal government to millions of ordinary American conservatives could also be done to millions of liberals for using terms like “racial justice” in the aftermath of the riots that followed the murder of George Floyd. These dragnets are general warrants, exactly the kind of sweeping, indiscriminate violations of privacy that prompted this nation’s founders to enact the Fourth Amendment. If government agencies cannot satisfy the low hurdle of probable cause in an application for a warrant, they are apt to be making things up or employing scare tactics. If left uncorrected, financial dragnets like these will support a default rule in which every citizen is automatically a suspect, especially if the government doesn’t like your politics.

The growth of the surveillance state in Washington, D.C., is coinciding with a renewed determination by federal agencies to expose journalists’ notes and sources. Recent events show how our Fourth Amendment right against unreasonable searches and seizures and our First Amendment right of a free press are inextricable and mutually reinforcing – that if you degrade one of these rights, you threaten both of them.
In May, the FBI raided the home of journalist Tim Burke, seizing his computer, hard drives, and cellphone, after he reported on embarrassing outtakes of a Fox News interview. It turns out these outtakes had already been posted online. Warrants were obtained, but on what credible allegation of probable cause? Or consider CBS News senior correspondent Catherine Herridge, who was laid off, then days later was ordered by a federal judge to reveal the identity of a confidential source she used for a series of 2017 stories published while she worked at Fox News. Shortly afterwards, Herridge was held in contempt for refusing to divulge that source. This raises a question: when CBS terminated Herridge and seized her files, would network executives have been willing to put their own freedom on the line as Herridge has done? In response to public outcry, CBS relented and handed Herridge’s notes back to her. But local journalists cannot count on generating the national attention and sympathy that a celebrity journalist can. Now add to this vulnerability the reality that every American who is online – whether a national correspondent or a college student – has his or her sensitive and personal information sold to more than a dozen federal agencies by data brokers, a $250 billion industry that markets our data in the shadows. The sellers of our privacy compile nearly limitless data dossiers that “reveal the most intimate details of our lives, our movements, habits, associations, health conditions, and ideologies.” Data brokers have established a sophisticated system to aggregate data from nearly every platform and device that records personal information to develop detailed profiles on individuals. To fill in the blanks, they also sweep up information from public records. So if you have a smartphone, use apps, or search online, your life is already an open book to the government.
In this way, state and federal intelligence and law enforcement agencies can use the data broker loophole to obtain information about Americans that they would otherwise need a warrant, court order, or subpoena to obtain. Now imagine what might happen as these two trends converge – a government hungry to expose journalists’ sources, but one that also has access to a journalist’s location history, as well as everyone they have called, texted, and emailed. It is hardly paranoid, then, to worry that when a prosecutor tries to compel a journalist to give up a source through legal means, purchased data may have already given the government a road map on what to seek. The combined threat to privacy from pervasive surveillance and prosecutors seeking journalists’ notes is serious and growing. This is why PPSA supports legislation to protect journalistic privacy and close the data broker loophole. The Protect Reporters from Exploitative State Spying, or PRESS, Act would grant a privilege to protect confidential news sources in federal legal proceedings, while offering reasonable exceptions for extreme situations. Such “shield laws” have been put into place in 49 states. The PRESS Act, which passed the House in January with unanimous, bipartisan support, would bring the federal government in line with the states. Likewise, the Fourth Amendment Is Not For Sale Act would close the data broker loophole and require the government to obtain a warrant before it can seize our personal information, as required by the Fourth Amendment of the U.S. Constitution. The House Judiciary Committee voted to advance the Fourth Amendment Is Not For Sale Act out of committee with strong bipartisan support in July. The Judiciary Committee also reported out a strong data broker loophole closure as part of the Protect Liberty Act in December. Now, it’s up to Congress to include these protections and reform measures in the reauthorization of Section 702.
PPSA urges lawmakers to pass measures to protect privacy and a free press. They will rise or fall together.

The Biden Administration has placed the people, the industry, and the national security of the United States on the edge of a cyber cliff and is threatening to push us all off.
Does that sound alarmist? Consider: Wikipedia brings together thousands of volunteers to curate a free, online encyclopedia about – well, everything – including the policies and personalities of repressive, homicidal regimes from Russia to China to North Korea. In the last decade, the Wikimedia Foundation, the non-profit that hosts Wikipedia, has received increasing requests to provide user data to governments and wealthy individuals. These foreign appeals not only seek to bowdlerize accurate information and censor editorial content, they also ask for personal data to enable retaliation against the volunteers who edit Wikipedia. On one level, this is actually kind of funny. Dictators and cartel bosses who rule by terror at home are reduced to making polite requests to the Wikimedia Foundation because the current system denies them local access to Wikipedia data. The architecture of an open internet, which forbids forced data localization, thus throws up roadblocks for malevolent foreign interests that would access Americans’ online, personal information. Now Americans’ privacy and the security of U.S. data are completely at risk because of U.S. Trade Representative Katherine Tai’s astonishing withdrawal of support for the underpinnings of a global internet before the World Trade Organization. Tai’s move leaves the Biden Administration moving in opposite directions at once. With one hand, the Biden Administration recently issued an executive order cracking down on the sale of Americans’ personal data by data brokers to foreign “countries of concern.” With the other hand – the president’s trade representative – the U.S. offered to drop its long-standing opposition to forced data localization and to forced transfers of American tech companies’ algorithms to governments around the world. Tai would hand the keys to America’s digital kingdom to more than 80 countries, including China.
It is not only Americans who will be at risk, but political dissidents and religious minorities around the world. “Growing requirements for data localization are happening alongside a global crackdown on free expression,” wrote the American Civil Liberties Union, the Center for Democracy & Technology, Freedom House, the Information Technology and Innovation Foundation, the Internet Society, PEN America, and the Wikimedia Foundation. “And people’s personal data – which can reveal who they voted for, who they worship, and who they love – can help facilitate this … 78 percent of the world’s internet users live in countries where simply expressing political, social, and religious viewpoints leads to legal repercussions.” The Biden Administration’s forced disclosure of source code will undermine the national and personal security of our country. Why? And for what? We are not sure, but it is clear that it would put all Americans’ privacy and personal security at risk.

David Pierce has an insightful piece in The Verge demonstrating the latest example of why every improvement in online technology leads to yet another privacy disaster.
He writes about an experiment by OpenAI to make ChatGPT “feel a little more personal and a little smarter.” The company is now allowing some users to add memory to personalize this AI chatbot. The result? Pierce writes that “the idea of ChatGPT ‘knowing’ users is both cool and creepy.” OpenAI says it will allow users to remain in control of ChatGPT’s memory and be able to tell it to remove something it knows about you. It won’t remember sensitive topics like your health issues. And it has a temporary chat mode without memory. Credit goes to OpenAI for anticipating the privacy implications of a new technology, rather than blundering ahead like so many other technologists to see what breaks. OpenAI’s personal memory experiment is just another sign of how intimate technology is becoming. The ultimate example of online AI intimacy is, of course, the so-called “AI girlfriend or boyfriend” – the artificial romantic partner. Jen Caltrider of Mozilla’s Privacy Not Included team told Wired that romantic chatbots, some owned by companies that can’t be located, “push you toward role-playing, a lot of sex, a lot of intimacy, a lot of sharing.” When the researchers tested one such app, they found it “sent out 24,354 ad trackers within one minute of use.” We would add that data from these ads could be sold to the FBI, the IRS, or perhaps a foreign government. The first wave of people whose lives will be ruined by AI chatbots will be the lonely and the vulnerable. It is only a matter of time before sophisticated chatbots become ubiquitous sidekicks, as portrayed in so much near-term science fiction. It will soon become all too easy to trust a friendly and helpful voice, without realizing the many eyes and ears behind it.

Just in time for the Section 702 debate, Emile Ayoub and Elizabeth Goitein of the Brennan Center for Justice have written a concise and easy-to-understand primer on what the data broker loophole is about, why it is so important, and what Congress can do about it.
These authors note that in this age of “surveillance capitalism” – with a $250 billion market for commercial online data – brokers are compiling “exhaustive dossiers” that “reveal the most intimate details of our lives, our movements, habits, associations, health conditions, and ideologies.” This happens because data brokers “pay app developers to install code that siphons users’ data, including location information. They use cookies or other web trackers to capture online activity. They scrape information from public-facing sites, including social media platforms, often in violation of those platforms’ terms of service. They also collect information from public records and purchase data from a wide range of companies that collect and maintain personal information, including app developers, internet service providers, car manufacturers, advertisers, utility companies, supermarkets, and other data brokers.” Armed with all this information, data brokers can easily “reidentify” individuals from supposedly “anonymized” data. This information is then sold to the FBI, IRS, the Drug Enforcement Administration, the Department of Defense, the Department of Homeland Security, and state and local law enforcement. Ayoub and Goitein examine how government lawyers employ legal sophistry to evade a U.S. Supreme Court ruling against the collection of location data, as well as the plain meaning of the U.S. Constitution, to access Americans’ most personal and sensitive information without a warrant. They describe the merits of the Fourth Amendment Is Not For Sale Act, and how it would shut down “illegitimately obtained information” from companies that scrape photos and data from social media platforms. The latter point is most important. Reformers in the House are working hard to amend FISA Section 702 with provisions from the Fourth Amendment Is Not For Sale Act, to require the government to obtain warrants before inspecting our commercially acquired data.
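The re-identification the authors warn about requires no sophistication. Below is a minimal, purely illustrative sketch (the names, coordinates, and records are all hypothetical): a device’s most common overnight and working-hours locations approximate home and workplace, and that pair, checked against public records, typically resolves to a single person.

```python
from collections import Counter

# Hypothetical pseudonymous trace: (hour_of_day, rounded_location) pairs.
trace = [
    (2, "42.351,-71.046"),   # overnight -> likely home
    (3, "42.351,-71.046"),
    (10, "42.360,-71.058"),  # working hours -> likely workplace
    (14, "42.360,-71.058"),
    (23, "42.351,-71.046"),
]

def home_and_work(trace):
    """Infer home (most common overnight point) and work (most common daytime point)."""
    night = Counter(loc for h, loc in trace if h < 6 or h >= 22)
    day = Counter(loc for h, loc in trace if 9 <= h < 17)
    return night.most_common(1)[0][0], day.most_common(1)[0][0]

# Hypothetical public-records lookup: (home, work) -> resident name.
records = {
    ("42.351,-71.046", "42.360,-71.058"): "Jane Doe",
    ("42.351,-71.046", "42.390,-71.100"): "John Roe",
}

home, work = home_and_work(trace)
print(records.get((home, work)))  # the "anonymized" trace resolves to one person
```

No identifier was ever attached to the trace; the pattern of movement alone is the identifier.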
While the push is on to require warrants for Americans’ data picked up along with international surveillance, the job will be decidedly incomplete if the government can get around the warrant requirement by simply buying our data. Ayoub and Goitein conclude that Congress must “prohibit government agencies from sidestepping the Fourth Amendment.” Read this paper and go here to call your House Member and let them know that you demand warrants before the government can access our sensitive, personal information. The word from Capitol Hill is that Speaker Mike Johnson is scheduling a likely House vote on the reauthorization of FISA’s Section 702 this week. We are told that proponents and opponents of surveillance reform will each have an opportunity to vote on amendments to this statute.
It is hard to overstate how important this upcoming vote is for our privacy and the protection of a free society under the law. The outcome may embed warrant requirements in this authority, or it may greatly expand the surveillance powers of the government over the American people. Section 702 enables the U.S. intelligence community to continue to keep a watchful eye on spies, terrorists, and other foreign threats to the American homeland. Every reasonable person wants that, which is why Congress enacted this authority to allow the government to surveil foreign threats in foreign lands. Section 702 authority was never intended to become what it has become: a way to conduct massive domestic surveillance of the American people. Government agencies – with the FBI in the lead – have used this powerful, invasive authority to exploit a backdoor search loophole for millions of warrantless searches of Americans’ data in recent years. In 2021, the secret Foreign Intelligence Surveillance Court revealed that such backdoor searches are used by the FBI to pursue purely domestic crimes. Since then, declassified court opinions and compliance reports reveal that the FBI used Section 702 to examine the data of a House Member, a U.S. Senator, a state judge, journalists, political commentators, 19,000 donors to a political campaign, and to conduct baseless searches of protesters on both the left and the right. NSA agents have used it to investigate prospective romantic partners on dating apps. Any reauthorization of Section 702 must include warrants – with reasonable exceptions for emergency circumstances – before the data of Americans collected under Section 702 or any other search can be queried, as required by the U.S. Constitution. This warrant requirement must include the searching of commercially acquired information, as well as data from Americans’ communications incidentally caught up in the global communications net of Section 702.
The FBI, IRS, Department of Homeland Security, the Pentagon, and other agencies routinely buy Americans’ most personal, sensitive information, scraped from our apps and sold to the government by data brokers. This practice is not authorized by any statute, or subject to any judicial review. Including a warrant requirement for commercially acquired information as well as Section 702 data is critical, otherwise the closing of the backdoor search loophole will merely be replaced by the data broker loophole. If the House declines to impose warrants for domestic surveillance, expect many politically targeted groups to have their privacy and constitutional rights compromised. We cannot miss the best chance we’ll have in a generation to protect the Constitution and what remains of Americans’ privacy. Copy and paste the message below and click here to find your U.S. Representative and deliver it: “Please stand up for my privacy and the Fourth Amendment to the U.S. Constitution: Vote to reform FISA’s Section 702 with warrant requirements, both for Section 702 data and for our sensitive, personal information sold to government agencies by data brokers.”

Government Agencies Pose as Ad Bidders

We’ve long reported on the government’s purchase of Americans’ sensitive and personal information scraped from our apps and sold to federal agencies by third-party data brokers. Closure of this data broker loophole is included in the House Judiciary Committee bill – the Protect Liberty and End Warrantless Surveillance Act – legislation that requires probable cause warrants before the federal government can inspect Americans’ data caught up in foreign intelligence under Section 702 of the Foreign Intelligence Surveillance Act. Of no less importance, the bipartisan Protect Liberty Act also requires warrants for inspection of the huge mass of Americans’ data sold to the government.
Thanks to Ben Lovejoy of 9to5Mac, we now know the magnitude of the need for a legislative solution to this privacy vulnerability. Apple’s 2020 move to require app makers to notify you that you’re being tracked on your iPhone has been thoroughly undermined by a workaround through the technology of device fingerprinting. Add to that Patternz, a commercial spyware that extracts personal information from ads and push notifications so it can be sold. Patternz tracks up to 5 billion users a day, utterly defeating phone-makers’ attempts to protect consumer privacy. How does it work? 404 Media demonstrated that Patternz has deals with myriad small ad agencies to extract information from around 600,000 apps. In a now-deleted video, an affiliate of the company boasted that with this capability, it could track consumers’ locations and movements in real time. After this article was posted, Google acted against one such market participant, while Apple promises a response. But given the robustness of these tools, it is hard to believe that new corporate policies will be effective. That is because technology allows government agencies to pose as ad buyers to turn adware into a global tracking tool that federal agencies – and presumably the intelligence services of other governments – can access at will. Patternz can even install malware for more thorough and deeper penetration of customers’ phones and their sensitive information. It is almost as insidious as the zero-day malware Pegasus, transforming phones into 24/7 spy devices. Enter Patrick Eddington, senior fellow of the Cato Institute.
He writes: “If you’re a prospective or current gun owner and you use your smartphone to go to OpticsPlanet to look for a new red dot sight, then go to Magpul for rail and sling adapters for the modern sporting rifle you’re thinking of buying, then mosey on over to LWRC to look at their latest gas piston AR-15 offerings, and finally end up at Ammunition Depot to check out their latest sale on 5.56mm NATO standard rounds, unless those retailers expressly offer you the option ‘Do not sell my personal data’ … all of your online browsing and ordering activity could end up being for sale to a federal law enforcement agency. “Or maybe even the National Security Agency.” The government’s commercial acquisition of Americans’ personal information from data sales contains troubling implications for both left and right – from abortion-rights activists concerned about women being tracked to clinics, to conservatives who care about the implications of this practice for the Second Amendment or free religious expression, to Americans of all stripes who don’t want our personal and political activities monitored in minute detail by the government. In January, the NSA admitted that it buys our personal information without a warrant. The investigative work performed by 404 Media and 9to5Mac should give Members of Congress all the more reason to support the Protect Liberty Act. Wired reports that police in northern California asked Parabon NanoLabs to run a DNA sample from a cold case murder scene to identify the culprit. Police have often run DNA against the vast database of genealogical tests, cracking cold cases like the Golden State Killer, who murdered at least 13 people.
But what Parabon NanoLabs did for the police in this case was something entirely different. The company produced a 3D rendering of a “predicted face” based on the genetic instructions encoded in the sample’s DNA. The police then ran it against facial recognition software to look for a match. Scientists are skeptical that this is an effective tool given that Parabon’s methods have not been peer-reviewed. Even the company’s director of bioinformatics, Ellen Greytak, told Wired that such face predictions are closer in accuracy to a witness description than to an exact replica of a face. With the DNA being merely suggestive – Greytak jokes that “my phenotyping can tell you if your suspect has blue eyes, but my genealogist can tell you the guy’s address” – the potential for false positives is enormous. Police multiply that risk when they run a predicted face through facial recognition technology (FRT) algorithms and their vast image databases, technology that is itself far from perfect. Despite cautionary language from technology producers and instructions from police departments, many detectives persist in mistakenly believing that FRT returns matches. Instead, it produces possible candidate matches arranged in the order of a “similarity score.” FRT is also better with some types of faces than others. It is up to 100 times more likely to misidentify Asian and Black people than white men. The American Civil Liberties Union, in a thorough 35-page comment to the federal government on FRT, biometric technologies, and predictive algorithms, noted that defects in FRT are likely to multiply when police take a low-quality image and try to brighten it, or reduce pixelation, or otherwise enhance the image. We can only imagine the Frankenstein effect of mating a predicted face with FRT. As PPSA previously reported, rights are violated when police take a facial match not as a clue, but as evidence.
This is what happened when Porcha Woodruff, a 32-year-old Black woman and nursing student in Detroit, was arrested on her doorstep while her children cried. Eight months pregnant, she was told by police that she had committed recent carjackings and robberies – even though the woman committing the crimes in the images was not visibly pregnant. Woodruff went into contractions while still in jail. In another case, local police executed a warrant by arresting a Georgia man at his home for a crime committed in Louisiana, even though the arrestee had never set foot in Louisiana. The only explanation for such arrests is sheer laziness, stupidity, or both on the part of the police. As ACLU documents, facial recognition forms warn detectives that a match “should only be considered an investigative lead. Further investigation is needed to confirm a match through other investigative corroborated information and/or evidence. INVESTIGATIVE LEAD, NOT PROBABLE CAUSE TO MAKE AN ARREST.” In the arrests made in Detroit and Georgia, police had not performed any of the rudimentary investigative steps that would have immediately revealed that the person they were investigating was innocent. Carjacking and violent robberies are not typically undertaken by women on the verge of giving birth. The potential for replicating error in the courtroom would be multiplied by showing a predicted face to an eyewitness. If a witness is shown a predicted face, that could easily influence the witness’s memory when presented with a line-up. We understand that an investigation might benefit from knowing that DNA reveals that a perp has blue eyes, allowing investigators to rule out all brown- and green-eyed suspects. But a predicted face should not be enough to search through a database of innocent people. In fact, any searches of facial recognition databases should require a warrant. 
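To see why a facial recognition result is a lead rather than a match, consider how such systems actually score candidates: a probe image is reduced to an embedding vector and compared against a gallery, producing a ranked list of similarity scores. The sketch below is purely illustrative (the embeddings and candidate names are invented); real systems use learned embeddings with hundreds of dimensions, but the ranking logic is the same.

```python
import math

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Hypothetical embeddings: a probe image vs. a gallery of enrolled faces.
probe = (0.9, 0.1, 0.4)
gallery = {
    "candidate_A": (0.8, 0.2, 0.5),
    "candidate_B": (0.1, 0.9, 0.2),
    "candidate_C": (0.7, 0.3, 0.3),
}

# The system returns candidates ordered by similarity score -- never a "match."
ranked = sorted(gallery.items(), key=lambda kv: cosine(probe, kv[1]), reverse=True)
for name, emb in ranked:
    print(f"{name}: similarity {cosine(probe, emb):.3f}")
```

Note that the top-ranked candidate always exists, even when the true face is not in the gallery at all. That is the structural reason a high score is an investigative lead, not probable cause.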
As technology continues to push the boundaries, states need to develop clear procedural guidelines and warrant requirements that protect constituents’ constitutional rights. While Congress is locked in spirited debate over the limits of surveillance in America, large technology companies are responding to growing consumer concerns about privacy by reducing government’s warrantless access to data.
For years, police had a free hand in requesting from Google the location histories of groups of people in a given vicinity recorded on Google Maps. Last month, Google altered the Location History feature on Google Maps. For users who enable this feature to track where they’ve been, their location histories will now be saved on their smartphone or other devices, not on Google servers. As a result of this change, Google will be unable to respond to geofence warrants. “Your location information is personal,” Google announced. “We’re committed to keeping it safe, private and in your control.” This week, Amazon followed Google’s lead by disabling its Request for Access tool, a feature that let law enforcement ask Ring camera owners to hand over video of goings-on in the neighborhood. We reported three years ago that Amazon had cooperative agreements with more than 2,000 police and fire departments to solicit Ring videos for neighborhood surveillance from customers. By clicking off Request for Access, Amazon is now closing the channel for law enforcement to ask Ring customers to volunteer footage about their neighbors. PPSA commends Google and Amazon for taking these steps. But they wouldn’t have made these changes if consumers weren’t clamoring for a restoration of the expectation of privacy. These changes are a sure sign that the mounting complaints of civil liberties advocates are moving the needle of public opinion. Corporations are exquisitely attuned to consumer attitudes, and so they are listening and acting. In the wake of Thursday’s revelation that the National Security Agency is buying Americans’ location data, we urge Congress to show similar sensitivity. With polls showing that nearly four out of five Americans support strong surveillance reform, Congress should respond to public opinion by passing The Protect Liberty Act, which imposes a warrant requirement on all personal information purchased by government agencies.
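A geofence warrant is, at bottom, a bulk query over a provider’s server-side location log: return every device seen inside a box during a time window. This illustrative sketch (the device IDs and coordinates are hypothetical) shows why such a demand inherently sweeps in bystanders, and why, once histories live only on users’ phones, there is no server-side log left to query.

```python
from datetime import datetime

# Hypothetical server-side location log: (device_id, lat, lon, timestamp).
log = [
    ("device_1", 38.8895, -77.0353, datetime(2021, 1, 6, 13, 5)),
    ("device_2", 38.8900, -77.0340, datetime(2021, 1, 6, 14, 30)),
    ("device_3", 40.7128, -74.0060, datetime(2021, 1, 6, 13, 0)),  # outside the fence
]

def geofence(log, lat_min, lat_max, lon_min, lon_max, start, end):
    """Return every device seen inside the box during the window --
    suspects and innocent bystanders alike."""
    return sorted({dev for dev, lat, lon, ts in log
                   if lat_min <= lat <= lat_max
                   and lon_min <= lon <= lon_max
                   and start <= ts <= end})

hits = geofence(log, 38.88, 38.90, -77.04, -77.03,
                datetime(2021, 1, 6, 12, 0), datetime(2021, 1, 6, 18, 0))
print(hits)  # ['device_1', 'device_2']
```

The query has no concept of suspicion; everyone whose coordinates fall inside the fence is returned. Google’s move to on-device storage removes the `log` itself from the company’s servers, so there is nothing for this query to run against.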
Late last year, Sen. Ron Wyden (D-OR) put a hold on the appointment of Lt. Gen. Timothy Haugh to replace outgoing National Security Agency director Gen. Paul Nakasone. Late Thursday, Sen. Wyden’s pressure campaign yielded a stark result – a frank admission from Gen. Nakasone that, as long suspected, the NSA purchases Americans’ sensitive, personal online activities from commercial data brokers.
The NSA admitted it buys netflow data, which records connections between computers and servers. Even without the revelation of messages’ contents, such tracking can be extremely personal. A Stanford University study of telephone metadata showed that a person’s calls and texts can reveal connections to sensitive life issues, from Alcoholics Anonymous to abortion clinics, gun stores, mental and physical health issues including sexually transmitted disease clinics, and connections to faith organizations. Gen. Nakasone’s letter to Sen. Wyden states that NSA works to minimize the collection of such information. He writes that NSA does not buy location information from phones inside the United States, or purchase the voluminous information collected by our increasingly data-hungry automobiles. It would be a mistake, however, to interpret NSA’s internal restrictions too broadly. While NSA is generally the source for signals intelligence for the other agencies, the FBI, IRS, and the Department of Homeland Security are known to make their own data purchases. In 2020, PPSA reported on the Pentagon purchasing data from Muslim dating and prayer apps. In 2021, Sen. Wyden revealed that the Defense Intelligence Agency was purchasing Americans’ location data from our smartphones without a warrant. How much data, and what kinds of data, are purchased by the FBI is not clear. Sen. Wyden did succeed in a hearing last March in prompting FBI Director Christopher Wray to admit that the FBI had, in some period in the recent past, purchased location data from Americans’ smartphones without a warrant. Despite a U.S. Supreme Court opinion, Carpenter (2018), which held that the U.S. Constitution requires a warrant for the government to compel telecom companies to turn over Americans’ location data, federal agencies maintain that the Carpenter standard does not curb their ability to purchase commercially available digital information. In a press statement, Sen.
Wyden hammers home the point that a recent Federal Trade Commission order bans X-Mode Social, a data broker, and its successor company, from selling Americans’ location data to government contractors. Another data broker, InMarket Media, must notify customers before it can sell their precise location data to the government. We now have to ask: was Wednesday’s revelation that the Biden Administration is drafting rules to prevent the sale of Americans’ data to hostile foreign governments an attempt by the administration to partly get ahead of a breaking story? For Americans concerned about privacy, the stakes are high. “Geolocation data can reveal not just where a person lives and whom they spend time with but also, for example, which medical treatments they seek and where they worship,” FTC Chair Lina Khan said in a statement. “The FTC’s action against X-Mode makes clear that businesses do not have free license to market and sell Americans’ sensitive location data. By securing a first-ever ban on the use and sale of sensitive location data, the FTC is continuing its critical work to protect Americans from intrusive data brokers and unchecked corporate surveillance.” As Sen. Wyden’s persistent digging reveals more details about government data purchases, Members of Congress are finding all the more reason to pass the Protect Liberty Act, which enforces the Constitution’s Fourth Amendment warrant requirement when the government inspects Americans’ purchased data. This should also put Members of the Senate and House Intelligence Committees on the spot. They should explain to their colleagues and constituents why they’ve done nothing about government purchases of Americans’ data – and why their bills include exactly nothing to protect Americans’ privacy under the Fourth Amendment. More to come … Well, better late than never. Bloomberg reports that the Biden Administration is preparing new rules to direct the U.S. 
Attorney General and Department of Homeland Security to restrict data transactions that sell our personal information – and even our DNA – to “countries of concern.”
Consider that much of the U.S. healthcare system relies on Chinese companies to sequence patients’ genomes. Under Chinese law, such companies are required to share their data with the government. The Office of the Director of National Intelligence warns that “Losing your DNA is not like losing a credit card. You can order a new credit card, but you cannot replace your DNA. The loss of your DNA not only affects you, but your relatives and, potentially, generations to come.” The order is also expected to crack down on data broker sales that could facilitate espionage or blackmail of key individuals serving in the federal government, that could be used to panic or distract key personnel in the event of a crisis, or that could deepen the impact of influence campaigns across the country through the collection of data on politicians, journalists, academics, and activists. PPSA welcomes the development of this Biden rule. We note, however, that just like China, our own government routinely purchases Americans’ most sensitive and personal information from data brokers. These two issues – foreign access to commercially acquired data, and the access to this same information by the FBI, IRS, Department of Homeland Security, and other agencies – are related but need to be addressed separately, the latter in the legislative process. The administration’s position on data purchases is contradictory. The administration also opposes closing the data-broker loophole in the United States. In the Section 702 debate, Biden officials say we would be at a disadvantage against China and other hostile countries that could still purchase Americans’ data. This new Biden Administration effort undercuts its argument. We should not emulate China’s surveillance practices any more than we practice their crackdowns against freedom of speech, religion, and other liberties.
Still, this proposed rule against foreign data purchases is a step in the right direction, in itself and for highlighting the dire need for legislation to restrict the U.S. government’s purchase of its own citizens’ data. The Protect Liberty Act, which the House Judiciary Committee passed by an overwhelming 35-2 vote to reauthorize Section 702, closes this loophole at home just as the Biden Administration seeks to close it abroad. So when the new Biden rule is promulgated, it should serve as a reminder to Congress that we have a problem with privacy at home as well. The Federal Reserve Board is publicly weighing whether to ask Congress to allow it to establish a Central Bank Digital Currency (CBDC), replacing paper dollars with government-issued electrons.
Given the growth of computing, a digital national currency may seem inevitable. But it would be a risky proposition from the standpoint of cybersecurity, national security, and unintended consequences for the economy. A CBDC would certainly pose a significant threat to Americans’ privacy. A factsheet on the Federal Reserve website says, “Any CBDC would need to strike an appropriate balance between safeguarding the privacy rights of consumers and affording the transparency necessary to deter criminal activity.” The Fed imagines that such a scheme would rely on private-sector intermediaries to create digital wallets and protect consumers’ privacy. Given the hunger that officialdom in Washington, D.C., has shown for pulling in all our financial information – including a serious proposal to record transactions from bank accounts, digital wallets, and apps – the Fed’s balancing of our privacy against surveillance of the currency is troubling. With digital money, government would have in its hands the ability to surveil all transactions, tracing every dollar from recipient to spender. Armed with such power, the government could debank any number of disfavored groups or individuals. If this sounds implausible, consider that debanking was exactly the strategy the Canadian government used against the trucker protestors two years ago. Enter H.R. 1122 – the CBDC Anti-Surveillance State Act – which sets down requirements for a digital currency. This bill would prohibit the Federal Reserve from using CBDC to implement monetary policy. It would require the Fed to report the results of a study or pilot program to Congress on a quarterly basis and consult with the brain trust of the Fed’s regional banks. Though this bill prevents the Fed from issuing CBDC accounts to individuals directly, there is a potential loophole in this bill – the Fed might still maintain CBDC accounts for corporations (the “intermediaries” the Fed refers to).
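The surveillance concern with a CBDC is structural: a central ledger records every transfer with identities attached, so tracing a citizen’s payments or freezing an account becomes a one-line operation. The toy sketch below is purely hypothetical and deliberately oversimplified – a real CBDC design would route through intermediaries – but it captures the capability at issue.

```python
class CentralLedger:
    """Toy CBDC ledger: every transfer is recorded centrally, identities attached."""

    def __init__(self, balances):
        self.balances = dict(balances)
        self.history = []       # (sender, recipient, amount) -- all visible to the issuer
        self.frozen = set()

    def transfer(self, sender, recipient, amount):
        if sender in self.frozen or self.balances.get(sender, 0) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
        self.history.append((sender, recipient, amount))
        return True

    def trace(self, person):
        """Every payment a person ever made or received, in one query."""
        return [t for t in self.history if person in t[:2]]

ledger = CentralLedger({"alice": 100, "bob": 50})
ledger.transfer("alice", "bob", 30)
ledger.frozen.add("alice")                  # "debanking" is a single line
print(ledger.transfer("alice", "bob", 10))  # False -- a frozen account cannot spend
print(ledger.trace("alice"))                # [('alice', 'bob', 30)]
```

With paper cash, neither the `history` list nor the `frozen` set exists anywhere; with a central digital ledger, both come built in.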
The sponsors may want to close any loopholes there. That’s a quibble, however. This bill, sponsored by Rep. Tom Emmer (R-MN), Majority Whip of the House, with almost 80 co-sponsors, is a needed warning to the Fed and to surveillance hawks that a financial surveillance state is unacceptable. The American Civil Liberties Union, its Northern California chapter, and the Brennan Center are calling on the Federal Trade Commission to investigate whether Meta and X have broken commitments they made to protect customers from data brokers and government surveillance.
This concern goes back to 2016 when it came to light that Facebook and Twitter helped police target Black Lives Matter activists. As a result of protests by the ACLU of Northern California and other advocacy groups, both companies promised to strengthen their anti-surveillance policies and cut off access to social media surveillance companies. Their privacy promises even became points of pride in these companies’ advertising. Now ACLU and Brennan say they have uncovered commercial documents from data brokers that seem to contradict these promises. They point to a host of data companies that publicly claim they have access to data from Meta and/or X, selling customers’ information to police and other government agencies. ACLU writes: “These materials suggest that law enforcement agencies are getting deep access to social media companies’ stores of data about people as they go about their daily lives.” While this case emerged from left-leaning organizations and concerns, organizations and people on the right have just as much reason for concern. The posts we make, what we say, who our friends are, can be very sensitive and personal information. “Something’s not right,” ACLU writes. “If these companies can really do all that they advertise, the FTC needs to figure out how.” At this point, we simply don’t know with certainty which, if any, social media platforms are permitting data brokers to obtain personal information from their platforms – information that can then be sold to the government. Regardless of the answer to that question, PPSA suggests that a thorough way to short-circuit any extraction of Americans’ most sensitive and personal information from data sales (at least at the federal level) would be to pass the strongly bipartisan Protect Liberty and End Warrantless Surveillance Act. This measure would force federal government agencies to obtain a warrant – as they should anyway under the Fourth Amendment – to access the data of an American citizen. 
PPSA has long warned that most drivers don’t realize that a modern car is a digital recording device. It tracks our travels, call logs, private text messages, even the impression our weight makes on our seat. Our car knows if we’re driving alone or with someone else. In all, a contemporary car accumulates vast amounts of data every day, much of it about us, where we’re going, and sometimes with whom.
Kashmir Hill, in a recent New York Times piece, described how a car can be turned into a digital weapon by a stalker or abusive partner. In one instance, a woman in divorce proceedings realized that her husband was tracking her through the location-based service in her Mercedes. When the woman visited a male friend, her husband sent the man a message with a thumbs-up emoji. Another woman, also estranged from her spouse, found that he was remotely causing her parked Tesla to turn on with heat blasting on hot days, and cold air streaming on cold days. Hill memorably wrote: “A car, to its driver, can feel like a sanctuary. A place to sing favorite songs off key, to cry, to vent or to drive somewhere no one knows you’re going.” That sanctuary, of course, is an illusion. Hill’s piece pointed not just to stalkers, but to the sharing of drivers’ consumer data with insurance companies and car companies. PPSA has long warned of yet another sinister use of car-generated data. About a dozen federal law enforcement and intelligence agencies make free use of the data broker loophole to purchase consumer data scraped from our apps. There is no law or rule that forbids them from purchasing car-generated data as well. This vulnerability will only get worse if a Congressional mandate for a built-in drunk driver detection system leads to cameras and microphones allowing AI to passively monitor drivers’ movements and speech for signs of impairment. Sens. Ron Wyden (D-OR) and Cynthia Lummis (R-WY) and Rep. Ro Khanna (D-CA) have addressed what government can do with car data in proposed legislation, the “Closing the Warrantless Digital Car Search Loophole Act.” This bill would require law enforcement to obtain a warrant based on probable cause before searching data from any vehicle that does not require a commercial license.
Another similar solution for all purchased commercial data is contained in the Protect Liberty and End Warrantless Surveillance Act, which passed the House Judiciary Committee with overwhelming bipartisan support. The most maddening thing about all this car-generated data is that much of it is off-limits to the drivers themselves, especially if someone else (like an ex-spouse) owns the car’s title. Cars are driving the expectation of privacy off the road. It is time for Congress to act. Less consumer tracking leads to less fraud. That’s the key takeaway from a new study conducted by the National Bureau of Economic Research in its working paper, “Consumer Surveillance and Financial Fraud.”
Using data obtained from the Federal Trade Commission and the Consumer Financial Protection Bureau, as well as the geospatial data firm SafeGraph, the authors looked at the correlation between Apple’s App Tracking Transparency framework and consumer fraud reports. Apple’s ATT policy requires user authorization before other apps can track and share customer data. In April 2021, Apple made this the default setting on all iPhones, ensuring that users would no longer be automatically tracked when they visit websites or use apps. This in turn dealt a hefty financial blow to companies like Snap, Facebook, Twitter, and YouTube, which collectively lost about $10 billion after implementation. The authors of the paper obtained fraud complaint figures from the FTC and the CFPB, then employed machine learning and targeted keyword searches to isolate complaints stemming from data privacy issues. They then cross-referenced those complaints with data acquired by SafeGraph showing the number of iPhone users in a given ZIP code. According to the paper, a 10% increase in Apple users within a given ZIP code is associated with a 3.21% reduction in financial fraud complaints. As the Electronic Frontier Foundation points out in a recent article about the study: “While the scope of the data is small, this is the first significant research we’ve seen that connects increased privacy with decreased fraud. This should matter to all of us. It reinforces that when companies take steps to protect our privacy, they also help protect us from financial fraud.” Obviously, more companies should follow Apple’s lead in implementing ATT-like policies. More than that, however, we need better and more robust laws on the books protecting consumer privacy. California has passed a number of related bills in recent years, most recently creating a one-stop opt-out mechanism for data collection. Colorado did the same.
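Taking the paper’s headline coefficient at face value – a 10% increase in ATT-protected users corresponding to a 3.21% drop in fraud complaints – the implied relationship reduces to simple arithmetic. In the sketch below, the 15% figure is a hypothetical example, and linear scaling is our assumption, not the paper’s:

```python
# From the study as summarized above: a 10% increase in iPhone (ATT-protected)
# users within a ZIP code corresponds to a 3.21% reduction in fraud complaints.
REDUCTION_PER_10PCT = 3.21

def expected_reduction(pct_increase_apple_users: float) -> float:
    """Predicted % drop in fraud complaints, assuming the effect scales linearly."""
    return REDUCTION_PER_10PCT * (pct_increase_apple_users / 10.0)

# Hypothetical ZIP code where the share of ATT-protected users rises 15%:
print(f"{expected_reduction(15):.3f}% fewer fraud complaints")  # 4.815%
```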
As other states and nations (and even CIA agents) wake up to the dangers of data tracking, this new study can serve as compelling, direct evidence showing why more restrictive settings – and consumer privacy – should always be the default. PPSA has often covered abuses of the geolocation tracking common to cellphones – from local governments in California spying on church-goers, to “warrant factories” in Virginia in which police obtain hundreds of warrants for thousands of surveillance days, often for minor infractions.
Geolocation tracking can be among the most pernicious compromises of personal privacy. In Carpenter v. United States (2018), the U.S. Supreme Court held that warrants are needed to inspect cellphone records extracted from cell-site towers, recognizing just how personal a target’s movements can be. Writing for the majority, Chief Justice John Roberts observed: “Unlike the nosy neighbor who keeps an eye on comings and goings, they [new technologies] are ever alert, and their memory is nearly infallible.” The narrowness of Carpenter’s holding has not, however, prevented the FBI and other federal agencies from tracking people’s movements without a warrant by merely buying their data from third-party data brokers.

The FBI may soon, however, have much less to buy. Orin Kerr, writing in the Volokh Conspiracy at Reason, informs us that “Google will no longer keep location history even for the users who opted to have it turned on. Instead, the location history will only be kept on the user’s phones.” Kerr adds: “If Google doesn’t keep the records, Google will have no records to turn over.” A corporate decision in Silicon Valley has thus removed a major pillar of government surveillance. It says something about the current state of this country when a Big Tech giant is more responsive to consumers than government is to its citizens. But don’t be surprised if the feds start to pressure Google to reverse its decision.

WSJ Graphical Roadmap: How Your Personal Information Migrates from App, to Broker, to the Government (12/5/2023)
A report in The Wall Street Journal does a masterful job of combining graphics and text to illustrate how technology embedded in our phones and computers to serve up ads also enables government surveillance of the American citizenry.
The WSJ has identified and mapped out a network of brokers and advertising exchanges whose data flows from apps to the Defense Department, intelligence agencies, and the FBI. The WSJ has compiled this information into several illustrative animated graphics that bring the whole scheme to life. Here’s how it works: As soon as you open an ad-supported app on your phone, data from your device is recorded and transmitted to buyers. In the moment before an app serves you an ad, every advertiser in the bidding process is given access to information about your device. The first information up for bid is your location, IP address, device type, and browser type. Ad services also record information about your interests and develop intricate assumptions about you. Many data brokers regularly sell Americans’ information to the government, where it may be used for cybersecurity, counterterrorism, counterintelligence, and public safety – or whatever a federal agency deems as such.

Polls show that Americans are increasingly concerned about their digital privacy but are also fatalistic about, and often unaware of, their privacy options as consumers. According to a Pew poll published last month, 81 percent of U.S. adults are concerned about how companies use the data they collect. Seventy-one percent are concerned about how the government uses their data, up from 64 percent in 2019. There is also an increasing feeling of helplessness: 73 percent of adults say they have little to no control over what companies do with their data, while 79 percent feel the same about the government. The share of concerned Americans rises to 89 percent when the issue of children’s online privacy is polled. Crucially, 72 percent of Americans believe there should be more regulation governing the use of digital data. Despite these high levels of concern, nearly 60 percent of Americans do not read the privacy policies of the apps and social media services they use.
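The ad-bidding data flow described above can be sketched as a simplified bid request. This is a hypothetical illustration loosely modeled on OpenRTB-style payloads; the field names, values, and the `build_bid_request` helper are invented for the example and do not reflect any exchange's actual schema.

```python
# Hypothetical sketch of the data broadcast during in-app ad bidding.
# Loosely modeled on OpenRTB-style bid requests; all names/values here
# are illustrative, not a real exchange's schema.
import json

def build_bid_request(device, user):
    """Assemble the payload sent to every bidder before an ad is shown."""
    return {
        "id": "bid-request-001",            # per-auction identifier
        "device": {
            "ip": device["ip"],             # IP address
            "os": device["os"],             # device/OS type
            "ua": device["browser"],        # browser / user agent
            "geo": device["geo"],           # location reported by the phone
        },
        "user": {
            "interests": user["interests"], # inferred interest segments
        },
    }

request = build_bid_request(
    device={"ip": "203.0.113.7", "os": "iOS 17", "browser": "Mobile Safari",
            "geo": {"lat": 38.8977, "lon": -77.0365}},
    user={"interests": ["fitness", "travel"]},
)

# Every participating bidder receives this payload, win or lose.
print(json.dumps(request, indent=2))
```

The key point the WSJ graphics make is visible in the structure: every bidder in the auction receives the location, IP, and interest data whether or not it wins, which is how brokers can accumulate this information at scale and later resell it, including to government agencies.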
Most Americans do not have the time or legal expertise to carefully study every privacy policy they encounter. Given that one must accept these terms or stay offline, it is simply impractical to expect Americans to do so. Yet government agencies assert that it is acceptable to collect and review Americans’ most personal data without a warrant because we have knowingly signed away our rights. There is good news. In the struggle for government surveillance reform currently taking place on Capitol Hill – and the introduction of the Protect Liberty and End Warrantless Surveillance Act – Americans are getting a better understanding of the costs of being treated as digital chattel by data brokers and government.