From your browsing history to your physical location, every aspect of your digital footprint can be tracked and used to build a comprehensive profile of your private life – including your political, religious, and family activities, as well as the most intimate details of your personal life. This information is invaluable not only to advertisers – which want to place ads in your social media feeds – but also to governments, which often have malevolent intentions.
Hostile governments might weaponize your personal digital trail for blackmail or embarrassment. Imagine a CEO or inventor being blackmailed into revealing trade secrets. Or, if you work in the military, or for an agency or contractor involved in national security, your personal data might be used to disrupt your life at the beginning of an international crisis. Imagine a CIA officer receiving what appears to be an urgent message of distress from her daughter, or an Air Force officer being told in the voice of his commanding officer not to go to the base but to shelter in place. And then multiply that effect by the millions of Americans in the crosshairs of a cyberattack.

Congress and the Biden Administration acted against these possibilities this spring by including in the Israel/Ukraine weapons appropriation measure a provision banning data brokers from exporting Americans' personal data to China, Russia, North Korea, and Iran. However, this ban had notable loopholes. Adversary countries could still purchase data indirectly through middlemen data brokers in third countries or establish front companies to circumvent the ban.

To attempt to close these loopholes, Sens. Ron Wyden (D-OR) and Cynthia Lummis (R-WY) have offered an amendment to the National Defense Authorization Act to tighten the law further by restricting data exports to countries, identified by the Secretary of Commerce, that lack robust privacy laws to keep Americans' data from being sold and exported to adversaries. This measure will help reduce the flow of Americans’ personal data through third parties and middlemen to regimes that have nothing but the worst of intentions.

PPSA applauds Sens. Wyden and Lummis for working to tighten the pipeline of Americans’ data flowing out into the world. Their proposal is a needed one and deserves the vocal support of every American who cares about privacy.
The Quick Unlocking of Would-Be Trump Assassin’s Phone Reveals Power of Commercial Surveillance (7/18/2024)
Since 2015, Apple’s refusal to grant the FBI a backdoor to its encrypted software on the iPhone has been a matter of heated debate. When William Barr was the U.S. Attorney General, he accused Apple of failing to provide “substantive assistance” in the aftermath of mass shootings by helping the FBI break into the criminals’ phones.
Then in 2020, the FBI announced it had broken into an Apple phone in just such a case. Barr said: “Thanks to the great work of the FBI – and no thanks to Apple …” Clearly, the FBI had found a workaround, though it took the bureau months to achieve it.

Gaby Del Valle in The Verge offers a gripping account of the back-and-forth between law enforcement and technologists, resulting, she writes, in the widespread adoption of mobile device extraction tools that now allow police to easily break open mobile phones. It was known that this technology, often using Israeli-made Cellebrite software, was becoming ever more prolific. Still, observers did a double-take when the FBI announced that its lab in Quantico, Virginia, was able to break into the phone of Thomas Matthew Crooks, who tried to assassinate former President Trump on Saturday, in just two days.

More than 2,000 law enforcement agencies in every state had access to such mobile device extraction tools as of 2020. The most effective of these tools cost between $15,000 and $30,000. It is likely, as with cell-site simulators that can spoof cellphones into giving up their data, that these phone-breaking tools are purchased by state and local law enforcement with federal grants. We noticed recently that Techdirt reported that for $100,000 you could have purchased a cell-site simulator of your very own on eBay. The model was old, vintage 2004, and was not likely to work well against contemporary phones. No telling what one could buy in a more sophisticated market.

The takeaway is that the free market created encryption for customer safety, privacy, and convenience. The ingenuity of technologists responding to market demand from government agencies is now being used to tear down consumer encryption, one of their greatest achievements.
We reported earlier this month that Los Angeles police are alarmed at the proliferation of wireless cameras installed in bushes that allow criminals to remotely surveil homes targeted for burglaries.
Now police in Braintree, Massachusetts, have arrested two men and a woman in connection with a series of burglaries enabled by these remote, wireless cameras. One of the suspects, a Colombian man wearing all black and a mask, was arrested and charged with resisting arrest and assault and battery on a police officer after attempting to flee when he was allegedly caught retrieving a wireless camera in front of a home that had been burgled. The three people arrested are, according to Braintree police, connected to a group known as the South American Theft Group, which uses extensive surveillance, GPS tracking technology, and counter-surveillance measures to analyze the comings and goings of their victims. The commoditization of spyware and the popularization of sophisticated surveillance tactics are driving this revolution in neighborhood crime.

What can we do? In addition to the customary precautions – locks, alarms, outdoor lights, and security cameras – you should avoid posting advance notice of family vacations. Criminals are watching your social media posts as well.

The recent passage of the Fourth Amendment Is Not For Sale Act by the House marks a bold and momentous step toward protecting Americans' privacy from unwarranted government intrusion. This legislation mandates that federal law enforcement and intelligence agencies, such as the FBI and CIA, must obtain a probable cause warrant before purchasing Americans’ personal data from brokers. This requirement closes a loophole that allows agencies to compromise the privacy of Americans and bypass constitutional safeguards.
While this act primarily targets law enforcement and intelligence agencies, it is crucial to extend these protections to all federal agencies. Non-law enforcement entities like the Treasury Department, IRS, and Department of Health and Human Services are equally involved in the purchase of Americans' personal data. The growing appetite among these agencies to track citizens' financial data, sensitive medical issues, and personal lives highlights the need for a comprehensive warrant requirement across the federal government. How strong is that appetite? The Financial Crimes Enforcement Network (FinCEN), operating under the Treasury Department, exemplifies the ambitious scope of federal surveillance. Through initiatives like the Corporate Transparency Act, FinCEN now requires small businesses to disclose information about their owners. This data collection is ostensibly for combating money laundering, though it seems unlikely that the cut-outs and money launderers for cocaine dealers and human traffickers will hesitate to lie on an official form. This data collection does pose significant privacy risks by giving multiple federal agencies warrantless access to a vast database of personal information of Americans who have done nothing wrong. The potential consequences of such data collection are severe. The National Small Business Association reports that the Corporate Transparency Act could criminalize small business owners for simple mistakes in reporting, with penalties including fines and up to two years in prison. This overreach underscores the broader issue of federal agencies wielding excessive surveillance powers without adequate checks and balances. Another alarming example is the dragnet financial surveillance revealed by the House Judiciary Committee and its Select Subcommittee on the Weaponization of the Federal Government. 
The FBI, in collaboration with major financial institutions, conducted sweeping investigations into individuals' financial transactions based on perceptions of their political leanings. This surveillance was conducted without probable cause or warrants, targeting ordinary Americans for exercising their constitutional rights. Without statutory guardrails, such surveillance could be picked up by non-law enforcement agencies like FinCEN, using purchased digital data.

These examples demonstrate the appetite of all government agencies for our personal information. Allowing them to also buy our most sensitive and personal information from data brokers, which is happening now, is about as absolute a violation of Americans’ privacy as one can imagine. Only listening devices in every home could be more intrusive. Such practices are reminiscent of the general warrants of the colonial era, the very abuses the Fourth Amendment was designed to prevent. The indiscriminate collection and scrutiny of personal data without individualized suspicion erode the foundational principles of privacy and due process. The Fourth Amendment Is Not For Sale Act is a powerful and necessary step to end these abuses. Congress should also consider broadening the scope to ensure all federal agencies are held to the same standard.

We’ve long recounted the bad news on law enforcement’s use of facial recognition software – how it misidentifies people and labels them as criminals, particularly people of color. But there is good news on this subject for once: the Detroit Police Department has reached a settlement with a man falsely arrested on the basis of a bad match from facial recognition technology (FRT) that includes what many civil libertarians are hailing as a new national standard for police.
The list of injustices from false positives from FRT has grown in recent years. We told the story of Randall Reid, a Black man in Georgia, arrested for the theft of luxury goods in Louisiana. Even though Reid had never been to Louisiana, he was held in jail for a week. We told the story of Porcha Woodruff, a Detroit woman eight months pregnant, who was arrested in her driveway while her children cried. Her purported crime was – get this – a recent carjacking. Woodruff had to be rushed to the hospital after suffering contractions in her holding cell.

Detroit had a particularly bad run of such misuses of facial recognition in criminal investigations. One of them was the arrest of Robert Williams in 2020 for the 2018 theft of five watches from a boutique store in which the thief was caught on a surveillance camera. Williams spent 30 hours in jail. Backed by the American Civil Liberties Union, the ACLU of Michigan, and the University of Michigan Civil Rights Litigation Initiative, Williams sued the police for wrongful arrest. In an agreement blessed by a federal court in Michigan, Williams received a generous settlement from the Detroit police. What is most important about this settlement agreement are the new rules Detroit has embraced. From now on:
Another series of reforms imposes discipline on the way lineups of suspects or their images unfold. When witnesses perform lineup identifications, they may not be told that FRT was used as an investigative lead. Witnesses must report how confident they are in any identification. Officers showing images to a witness must themselves not know who the real suspect is, so they don’t mislead the witness with subtle, non-verbal clues. And photos of suspects must be shown one at a time, instead of all at once – a practice that can lead a witness to select the one image that merely bears the closest resemblance to the suspect. Perhaps most importantly, Detroit police officers will be trained on the proper uses of facial recognition and eyewitness identification.

“The pipeline of ‘get a picture, slap it in a lineup’ will end,” Phil Mayor, a lawyer for the ACLU of Michigan, told The New York Times. “This settlement moves the Detroit Police Department from being the best-documented misuser of facial recognition technology into a national leader in having guardrails in its use.” PPSA applauds the Detroit Police Department and ACLU for crafting standards that deserve to be adopted by police departments across the United States.

A new surveillance danger to human rights, free expression, and liberty is emerging online. This particular threat is not coming from Moscow or Beijing, but inexplicably from America’s own trade representative, Katherine Tai.
Until now, the open architecture of the internet has made it difficult for illiberal governments, ranging from Uganda to Venezuela, to access what is posted and shared by dissidents, members of vulnerable minorities, and disgruntled citizens. Dictators have many surveillance workarounds at their disposal, including increasingly robust spyware spreading around the globe like wildfire. But at least the open architecture of the internet makes it difficult for dictators and persecutors to confidently track or trace every text, email, and online search within their borders. It is thus out of a commitment to democracy and human rights that U.S. administrations and trade representatives have long strived to defend U.S. tech companies from being required to turn over data to be stored on local servers, which would Balkanize the internet. The United States also rejected requests from regimes that would compel U.S. tech companies to turn over their proprietary source codes so foreign governments could access the algorithms of messaging apps and digital platforms, potentially giving hostile actors access to the guts of their operations. Late last year, Trade Representative Tai withdrew support for these longstanding U.S. digital trade principles before the World Trade Organization, a vastly underreported story with consequences that are just now beginning to sink in. This will subject leading U.S. tech companies to strict regulation in virtually every market in the world. That’s an odd position for America’s trade representative – who is usually expected to safeguard the competitiveness of American companies. The consequences for privacy and human rights promise to be catastrophic. The Center for Democracy & Technology penned a coalition letter in February that itemized these negative consequences of Tai’s about-face. 
People’s personal data, the letter states, “can reveal who they voted for, who they worship, and who they love.” Data localization would upend a globally interoperable internet, placing this personal data firmly within reach of governments, “creating unique risks for people’s privacy, free expression, access to information, and other fundamental freedoms.” Restrictions on cross-border flows of information will limit the ability of people to access information from around the world. And the forced disclosure of products’ source code has the potential to undermine privacy and security here in the United States.

Why is Tai doing this? Her actions appear to be the result of lobbying from Federal Trade Commission Chair Lina Khan and DOJ Antitrust Chief Jonathan Kanter, who are actively encouraging global antitrust actions against large U.S. tech companies. Even if you are critical of Google, Apple, Amazon, and Meta, inviting Myanmar and Uzbekistan to regulate U.S. industries is astonishing, to say the least. Clearly, this proposal wasn’t widely vetted. Nathaniel Fick, the State Department’s Ambassador for Cyberspace and Digital Policy, testified in a hearing late last year that he learned of this sea change in U.S. policy from press reports. There are signs that Tai’s surprise reversal kicked off a fierce debate within the administration.

To be fair, the internet is far from perfectly open as it is. Some countries, like India, already require a degree of data localization. The Biden Administration’s effort to protect Americans’ personal data from hostile “countries of concern” like Russia and China will be portrayed by some as a step toward Balkanization. But Tai’s policy reversal kicks this trend into overdrive. It will enable foreign governments to surveil democracy activists and dissidents around the world, while heightening threats to Americans at home. This is a monumental shift in American policy.
It must be more widely discussed, debated, and investigated by journalists and Members of Congress.

State financial officials in 23 states have fired off a letter to House Speaker Mike Johnson expressing strong opposition to a new Securities and Exchange Commission program that grants 3,000 government employees real-time access to every equity trade, option trade, and quote from every account of every broker by every investor.
“Traditionally, Americans’ financial holdings are kept between them and their broker, not them, their broker, and a massive government database,” the state auditors and treasurers wrote. “The only exception has been legal investigations with a warrant." The state financial officers contend that the SEC's move undermines the principles of federalism by imposing a one-size-fits-all solution without considering the unique regulatory environments of individual states. They asked Speaker Johnson to support a bill sponsored by Rep. Barry Loudermilk (R-GA), the Protecting Investors' Personally Identifiable Information Act. This proposed legislation would restrict the SEC's ability to collect and centralize such vast amounts of personal financial data. As is so common with recent efforts at financial surveillance, the SEC justifies this data collection to combat insider trading, market manipulation, and to identify suspicious activities. Similar excuses are offered for the new “beneficial ownership” requirement that is forcing millions of Americans who own small businesses to send the ownership details of their businesses to the Financial Crimes Enforcement Network (FinCEN) of the U.S. Treasury. But such increased vigilance comes at the expense of the privacy of millions of Americans. The sheer volume of data accessible to government employees raises concerns about potential misuse and unauthorized access. “The Securities and Exchange Commission has been barreling forward with a new system – the Consolidated Audit Trail (CAT) – which tracks every trade an individual investor makes and links it to their identity through a centralized system,” Rep. Loudermilk said. 
“Not only is collecting all this information unnecessary, regulators already have similar systems that don’t easily match identities with transactions, but it also creates another security vulnerability and a target for hackers.” While the SEC assures lawmakers that strict safeguards are in place, recent high-profile hacks give little reason for confidence. All the more reason for Speaker Johnson to give Rep. Loudermilk’s bill a big push on the House floor.

The doxing of donors is a danger to our democracy.
When donors give to a controversial cause, they count on anonymity to protect them from public backlash. This is a principle enshrined in law since 1958, when the U.S. Supreme Court protected donors to the NAACP from forcible disclosure by the State of Alabama. Undeterred by this precedent, California tried to enforce a measure to capture the identities of donors and hold them in the office of that state’s attorney general, despite the fact that the California AG’s office has a history of leaks and data breaches. Surprisingly, the federal Ninth Circuit upheld that plan. The Project for Privacy and Surveillance Accountability filed a brief before the U.S. Supreme Court arguing that this policy is dangerous, not just to the robust practice of democracy, but to human lives. Citizens have lost their jobs, had their businesses threatened, and even been targeted for physical violence, all because they donated to a political or cultural cause. In 2021, the Supreme Court agreed with PPSA, reversing a Ninth Circuit opinion in Americans for Prosperity v. Bonta. Still, the drive to expose donors – whether progressives going after gun rights organizations or conservatives going after protest organizations – remains a hot-button issue in state politics across the country. Politicians and groups are eager to know: Is George Soros or the Koch Foundation or name-your-favorite-nemesis giving money to a cause you oppose? Thanks to the work of the People United For Privacy (PUFP) foundation, that push to expose is now stopped cold in 20 states. With help from PUFP, bipartisan coalitions in 20 states have adopted the Personal Privacy Protection Act (PPPA) to provide a shield for donor privacy by protecting their anonymity. This movement is spreading across the country, with Alabama, Colorado, and Nebraska having passed some version of this law just this year. 
“Every American has the right to support causes they believe in without fear of harassment or abuse of their personal information,” says Heather Lauer, who heads People United for Privacy. “The PPPA is a commonsense measure embraced by lawmakers in both parties across the ideological spectrum.” Supporters have ranged from state chapters of the ACLU, NAACP, and Planned Parenthood to pro-life groups, gun rights groups, and free market think tanks.

Thanks to this campaign, 40 percent of states now protect donors. For the remaining 60 percent, the power of the internet can expose donors’ home addresses, places of work, family members, and other private information to harassers. The need to enact this law in the remaining 30 states is urgent. Still, securing donor protection in 20 states is a remarkable record given that People United for Privacy was only founded in 2018. We look forward to supporting their efforts and seeing more wins for privacy in the next few years.

George Orwell wrote that in a time of deceit, telling the truth is a revolutionary act.
Revolutionary acts of truth-telling are becoming progressively more dangerous around the world. This is especially true as autocratic countries and weak democracies purchase AI software from China to weave together surveillance technology to comprehensively track individuals, following them as they meet acquaintances and share information.

A piece by Abi Olvera posted by the Bulletin of the Atomic Scientists describes this growing use of AI to surveil populations. Olvera reports that by 2019, 56 out of 176 countries were already using artificial intelligence to weave together surveillance data streams. These systems are increasingly being used to analyze the actions of crowds, track individuals across camera views, and pierce the use of masks or scramblers intended to disguise faces. The only impediment to effective use of this technology is the frequent Brazil-like incompetence of domestic intelligence agencies. Olvera writes: “Among other things, frail non-democratic governments can use AI-enabled monitoring to detect and track individuals and deter civil disobedience before it begins, thereby bolstering their authority. These systems offer cash-strapped autocracies and weak democracies the deterrent power of a police or military patrol without needing to pay for, or manage, a patrol force …” Olvera quotes AI surveillance expert Martin Beraja, who says AI can enable autocracies to “end up looking less violent because they have better technology for chilling unrest before it happens.”

Olivia Solon of Bloomberg reports on the uses of biometric identifiers in Africa, which are regarded by the United Nations and World Bank as a quick and easy way to establish identities where licenses, passports, and other ID cards are hard to come by. But in Uganda, Solon reports, President Yoweri Museveni – in power for nearly four decades – is using this system to track critics and political opponents of his rule.
Used to catch criminals, biometrics is also being used to criminalize Ugandan dissidents and rival politicians for “misuse of social media” and sharing “malicious information.” The United States needs to lead by example. As facial recognition and other such systems grow in ubiquity, Congress and the states need to demonstrate our ability to impose limits on public surveillance, and legal guardrails for the uses of the sensitive information these systems generate.

Every moral person agrees we must fight the sexual abuse of children online. But a renewed push by the Belgian Presidency of the Council of the European Union would force all consumers to accept software that would annihilate any semblance of communications privacy. This would be done with government technology that would break end-to-end encryption. (Hat tip to Joe Mullin of EFF.)
In the name of catching those who traffic in Child Sexual Abuse Materials (CSAM), the EU is poised to degrade the ability of anyone to privately communicate. Worse, it could enable illicit and dangerous surveillance by bad actors.

The EU had previously proposed scanning the full content of encrypted messages. In what is being sold as a new approach, it is now offering a tweaked but still problematic scheme called “upload moderation.” This proposal would mandate the scanning of hyperlinks and images within encrypted messages. In theory, consumers could refuse to consent to this snooping, but they would then be blocked from sharing any further photos or videos. Such coerced consent is, of course, no consent at all.

What is lost in this debate is that encryption is a major protector of personal security, human rights, and liberty. In an open letter to the EU, leading civil liberties organizations – including the Center for Democracy & Technology, Mozilla, and the Electronic Frontier Foundation – warn policymakers that such technology would be dangerous “bugs in our pockets.” Such “client-side scanning” pushes surveillance beyond what is shared on the cloud directly onto the user’s device.

Some trolls already threaten journalists by sending them unwanted CSAM. Dictatorships could use Europe’s system to send innocuous images to dissidents that contain the right parameters to trigger a CSAM alarm – and then use the results of that alarm to locate those people. Cartels and other criminal gangs could use it to locate witnesses. Experts have demonstrated that malevolent agents can manipulate the hash database of such a system to transform it into a tool for physically locating and surveilling individuals. Victims around the world could ironically include women and children hiding in safe houses from abusers and stalkers.

CSAM users are despicable criminals who deserve to be ferreted out and punished.
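To make the mechanism concrete: client-side scanning, in its simplest form, checks an attachment against a database of flagged hashes on the user's own device before the message is ever encrypted. The sketch below is a minimal illustration, not the actual EU proposal – real systems use perceptual hashes that match visually similar images, and the blocklist here is an invented stand-in.

```python
import hashlib

# Hypothetical on-device blocklist of flagged hashes. Real deployments
# would use perceptual hashing (matching similar images, not exact bytes);
# plain SHA-256 stands in here purely for illustration.
FLAGGED_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a flagged image
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_send(attachment: bytes) -> bool:
    """Return True if the attachment may be sent, False if it is blocked.

    Crucially, this check runs *before* end-to-end encryption is applied,
    which is why critics describe it as a hole in the encryption itself.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest not in FLAGGED_HASHES

print(scan_before_send(b"vacation photo"))  # True: not on the blocklist
print(scan_before_send(b"test"))            # False: hash matches the blocklist
```

The abuse scenario described above falls out directly from this design: whoever controls the contents of the hash database controls what triggers an alarm, so seeding it with the hash of an innocuous image turns the scanner into a tracking tool.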
But creating a system that eradicates all privacy in electronic communications is not the solution.

Now that the House has passed the Fourth Amendment Is Not for Sale Act, senators would do well to review new concessions from the intelligence community on how it treats Americans’ purchased data. This is progress, but it points to how much more needs to be done to protect privacy.
Avril Haines, Director of National Intelligence (DNI), released a “Policy Framework for Commercially Available Information,” or CAI. In plain English, CAI is all the digital data scraped from our apps and sold to federal agencies, ranging from the FBI to the IRS, Department of Homeland Security, and Department of Defense. From purchased digital data, federal agents can instantly access almost every detail of our personal lives, from our relationships to our location histories, to data about our health, financial stability, religious practices, and politics. Federal purchases of Americans’ data don’t merely violate Americans’ privacy, they kick down any semblance of it. There are signs that the intelligence community itself is coming to realize just how extreme its practices are. Last summer, Director Haines released an unusually frank report from an internal panel about the dangers of CAI. We wrote at the time: “Unlike most government documents, this report is remarkably self-aware and willing to explore the dangers” of data purchases. The panel admitted that this data can be used to “facilitate blackmail, stalking, harassment, and public shaming.” Director Haines’ new policy orders all 18 intelligence agencies to devise safeguards “tailored to the sensitivity of the information” and produce an annual report on how each agency uses such data. The policy also requires agencies:
Details for how each of the intelligence agencies will fulfill these aspirations – and actually handle “sensitive CAI” – are left up to them. Sen. Ron Wyden (D-OR) acknowledged that this new policy marks “an important step forward in starting to bring the intelligence community under a set of principles and policies, and in documenting all the various programs so that they can be overseen.” Journalist and author Byron Tau told Reason that the new policy is a notable change in the government’s stance. Earlier, “government lawyers were saying basically it’s anonymized, so no privacy problem here.” Critics were quick to point out that any of this data could be deanonymized with a few keystrokes. Now, Tau says, the new policy is “sort of a recognition that this data is actually sensitive, which is a bit of change.”

Tau has it right – this is a bit of a change, but one with potentially big consequences. One of those consequences is that the public and Congress will have metrics that are at least suggestive of what data the intelligence community is purchasing and how it uses it. In the meantime, Sen. Wyden says, the framework of the new policy has an “absence of clear rules about what commercially available information can and cannot be purchased by the intelligence community.” Sen. Wyden adds that this absence “reinforces the need for Congress to pass legislation protecting the rights of Americans.” In other words, the Senate must pass the Fourth Amendment Is Not For Sale Act, which would subject purchased data to the same standard as any other personal information – a probable cause warrant. That alone would clarify the rules for the entire intelligence community.

The federal government’s hunger for financial surveillance is boundless. A central bank digital currency (CBDC) would completely satisfy it. Under a CBDC, all transactions would be recorded, giving federal agencies the means to review any American’s income and expenditures at a glance.
Financial privacy would not be compromised: it would be dead.
Federal Reserve Chairman Jerome Powell says this country is “nowhere near” establishing a digital currency. To be sure, such an undertaking would take years. But Nigeria, Jamaica, and the Bahamas already have digital currencies. China is well along in a pilot program for a digital yuan. The U.S. government is actively exploring this as an option. It is not too early to consider the consequences of a digital dollar.

Such a digital currency would likely rely on “blocks” of transaction records linked together by cryptographic algorithms across a network of computers, creating a digital ledger – presumed unbreakable – that records every transaction. Some risks of a CBDC are obvious – from the breaking of “unbreakable” codes by criminals and hostile foreign governments, to the temptation for Washington, D.C., to expand the money supply with a few clicks, making it all the easier to inflate the currency.

House Majority Whip Tom Emmer (R-MN) is especially concerned about the privacy implications of a digital currency. “If not designed to be open, permissionless, and private – emulating cash – a government-issued CBDC is nothing more than a CCP-style (Chinese Communist Party) surveillance tool that would be used to undermine the American way of life,” Rep. Emmer said. He is expected to soon reintroduce a bill that would require authorizing legislation from Congress before any central bank digital currency could be issued.

Emmer’s stand is prescient, not premature. From the new requirement for “beneficial ownership” filings by small businesses, to the revelation from House hearings of warrantless, dragnet surveillance of credit card and ATM transactions, the federal government is inventing new ways to track our every financial move. Rep. Emmer is right to head this one off at the pass. PPSA endorses this bill and urges Emmer’s colleagues to pass it into law. A new fiat currency should have the permission of Congress and the American people.
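The hash-linked ledger structure behind such a currency can be sketched in a few lines – a toy illustration of the general blockchain idea, not any actual CBDC design; the field names and sample transactions are invented.

```python
import hashlib
import json

def make_block(transactions: list, prev_hash: str) -> dict:
    """Create a block whose hash depends on its contents AND the previous block."""
    body = {"transactions": transactions, "prev_hash": prev_hash}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any tampered transaction breaks the chain."""
    prev = "0" * 64  # genesis marker
    for block in chain:
        body = {"transactions": block["transactions"], "prev_hash": block["prev_hash"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or block["hash"] != recomputed:
            return False
        prev = block["hash"]
    return True

chain = [make_block(["alice pays bob $5"], "0" * 64)]
chain.append(make_block(["bob pays carol $2"], chain[-1]["hash"]))
print(verify_chain(chain))                          # True
chain[0]["transactions"][0] = "alice pays bob $500"
print(verify_chain(chain))                          # False: tampering detected
```

The same linkage that makes tampering detectable is the privacy problem the post describes: every transaction sits permanently in the ledger, visible at a glance to whoever operates it.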
The risks and benefits of reverse searches are revealed in the capital murder case of Aaron Rayshan Wells. Although a security camera recorded a number of armed men entering a home in Texas where a murder took place, the lower portions of the men’s faces were covered. Wells was identified in this murder investigation by a reverse search enabled by geofencing.
A lower court upheld the geofence in this case as sufficiently narrow. It was near the location of a homicide and was within a precise timeframe on the day of the crime, 2:45-3:10 a.m. But the ACLU, in a recent amicus brief, identifies dangers with this reverse search, even within such strict limits. What are the principles at stake in this practice?

Let’s start with the Fourth Amendment, which places hurdles government agents must clear before obtaining a warrant for a search – “no warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” The founders’ tight language was formed by experience. In colonial times, the King’s agents could act on a suspicion of smuggling by ransacking the homes of all the shippers in Boston. Forcing the government to name a place, and a person or thing to be seized and searched, was the founders’ neat solution to outlawing such general warrants altogether.

It was an ingenious system, and it worked well until Michael Dimino came along. In 1995, this inventor received a patent for using GPS to locate cellphones. Within a few years, geofencing technology could instantly locate all the people with cellphones within a designated boundary at a specified time. This was a jackpot for law enforcement. If a bank robber was believed to have blended into a crowd, detectives could geofence that area and collect the phone numbers of everyone in that vicinity. Make a request to a telecom service provider, run computer checks on criminals with priors, and voilà, you have your suspect. Thus the technology-enabled practice of conducting a “reverse search” kicked into high gear.

Multiple technologies assist in geofenced investigations. One is a “tower dump,” giving law enforcement access to records of all the devices connected to a specified cell tower during a period of time. Wi-Fi is also useful for geofencing.
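A reverse search of this kind inverts the usual investigative query: rather than asking where a known suspect was, it asks who was inside a boundary during a time window. A minimal sketch of the concept, using an invented record layout (the device IDs, coordinates, and timestamps below are hypothetical):

```python
# Conceptual sketch of a geofence query over location records (invented data).
import math
from datetime import datetime

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in meters.
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def geofence(records, center, radius_m, start, end):
    # Return every device seen inside the circle during the time window.
    lat0, lon0 = center
    return sorted({
        r["device"] for r in records
        if start <= r["time"] <= end
        and haversine_m(lat0, lon0, r["lat"], r["lon"]) <= radius_m
    })

records = [
    {"device": "A", "lat": 29.7604, "lon": -95.3698, "time": datetime(2022, 1, 9, 2, 50)},
    {"device": "B", "lat": 29.7610, "lon": -95.3690, "time": datetime(2022, 1, 9, 3, 5)},
    {"device": "C", "lat": 29.9000, "lon": -95.0000, "time": datetime(2022, 1, 9, 2, 55)},  # outside the fence
    {"device": "A", "lat": 29.7604, "lon": -95.3698, "time": datetime(2022, 1, 9, 14, 0)},  # outside the window
]
hits = geofence(records, (29.7604, -95.3698), 150,
                datetime(2022, 1, 9, 2, 45), datetime(2022, 1, 9, 3, 10))
# → ["A", "B"]: everyone inside the fence during the window, suspect or not
```

The sketch makes the civil-liberties concern concrete: the query returns every device in the area, with no notion of individualized suspicion – the filtering of innocents from suspects happens only afterward, at investigators’ discretion.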
When people connect their smartphones to Wi-Fi networks, they leave an exact log of their physical movements. Our Wi-Fi data also record our online searches, which can detail our health, mental health, and financial issues, as well as intimate relationships, and political and religious activities and beliefs.

A new avenue for geofencing was created on Monday by President Biden when he signed into law a new measure that will give the government the ability to tap into data centers. The government can now enlist the secret cooperation of the provider of “any” service with access to communications equipment. This gives the FBI, U.S. intelligence agencies, and potentially local law enforcement a wide, new field in which to conduct reverse searches based on location data. In these ways, modern technology imparts an instant, all-around understanding of hundreds of people in a targeted area, at a level of intimacy that Colonel John André could not have imagined. The only mystery is why criminals persist in carrying their phones with them when they commit crimes.

Google was law enforcement’s ultimate go-to in geofencing. Warrants from magistrates authorizing geofence searches allowed the police to obtain personal location data from Google about large numbers of mobile-device users in a given area. Without any further judicial oversight, the breadth of the original warrant was routinely expanded or narrowed in private negotiations between the police and Google. In 2023, Google ended its storage of the data that made geofencing possible by shifting location data from its servers to users’ phones. For good measure, Google encrypted this data.

But many avenues remain for a reverse search. On one hand, it is amazing that technology can so rapidly identify suspects and potentially solve a crime.
On the other, technology also enables dragnet searches that pull in scores of innocent people, and potentially makes their personal lives an open book to investigators. ACLU writes: “As a category, reverse searches are ripe for abuse both because our movements, curiosity, reading, and viewing are central to our autonomy and because the process through which these searches are generally done is flawed … Merely being proximate to criminal activity could make a person the target of a law enforcement investigation – including an intrusive search of their private data – and bring a police officer knocking on their door.”

U.S. District Judge Mary Hannah Lauck of Virginia recognized this danger in 2022 when she ruled that a geofence in Richmond violated the Fourth Amendment rights of hundreds of people in their apartments, in a senior center, driving by, and in nearby stores and restaurants. Judge Lauck wrote that “it is difficult to overstate the breadth of this warrant” and that an “innocent individual would seemingly have no realistic method to assert his or her privacy rights tangled within the warrant. Geofence warrants thus present the marked potential to implicate a ‘right without a remedy.’”

ACLU is correct that reverse searches are obvious violations of the plain meaning of the Fourth Amendment. If courts continue to uphold this practice, however, strict limits need to be placed on the kinds of information collected, especially from the many innocent bystanders routinely caught up in geofencing and reverse searches. And any change in the breadth of a warrant should be determined by a judge, not in a secret deal with a tech company.

Forbes reports that federal authorities were granted a court order requiring Google to hand over the names, addresses, phone numbers, and user activities of internet surfers who were among the more than 30,000 viewers of a post.
The government also obtained access to the IP addresses of people who weren’t logged onto the targeted account but did view its video.
The post in question is suspected of being used to promote the sale of bitcoin for cash, which would be a violation of money-laundering rules. The government likely had good reason to investigate that post. But did it have to track everyone who came into contact with it? This is a prime example of the government’s street-sweeper approach to surveillance.

We saw this when law enforcement in Virginia tracked the location histories of everyone in the vicinity of a robbery. A state judge later found that search meant that everyone in the area, from restaurant patrons to residents of a retirement home, had “effectively been tailed.” We saw the government’s shotgun approach when the FBI secured the records of everyone in the Washington, D.C., area who used their debit or credit cards to make Bank of America ATM withdrawals between Jan. 5 and Jan. 7, 2021. We also saw it when the FBI, searching for possible foreign influence in a congressional campaign, used FISA Section 702 data – meant to surveil foreign threats on foreign soil – to pull the data of 19,000 political donors.

Surfing the web is not inherently suspicious. What we watch online is highly personal, potentially revealing all manner of social, romantic, political, and religious beliefs and activities. The Founders had such dragnet-style searches precisely in mind when they crafted the Fourth Amendment. Simply watching a publicly posted video is not by itself probable cause for a search. It should not compromise one’s Fourth Amendment rights.

Byron Tau – journalist and author of Means of Control: How the Hidden Alliance of Tech and Government Is Creating a New American Surveillance State – discusses the details of his investigative reporting with Liza Goitein, senior director of the Brennan Center for Justice's Liberty & National Security Program, and Gene Schaerr, general counsel of the Project for Privacy and Surveillance Accountability.
Byron explains what he has learned about the shadowy world of government surveillance, including how federal agencies purchase Americans’ most personal and sensitive information from data brokers. He then asks Liza and Gene about reform proposals now before Congress in the FISA Section 702 debate, and how they would rein in these practices.

End-to-end encryption, in which only the sender and recipient have access to a message, is the saving grace of the online world – the last little bit of privacy most of us can expect to have in this era of near-ubiquitous surveillance. Tens of billions of encrypted messages are sent every day between users of WhatsApp, Signal, Apple’s iMessage, and many other apps.
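The defining property of end-to-end encryption is that only the two endpoints ever hold the key; the relaying service sees nothing but ciphertext. A toy sketch of that property using a Diffie-Hellman key exchange – NOT real cryptography: the prime here is far too small and the XOR keystream is purely pedagogical, whereas real messengers use vetted protocols such as the Signal Protocol:

```python
# Toy end-to-end sketch: the endpoints derive a shared key; the relay never holds it.
import hashlib
import secrets

P = 2**127 - 1   # a small Mersenne prime (real systems use far larger groups)
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)       # (secret, value safe to send in the open)

def shared_key(my_priv, their_pub):
    # Both sides compute G^(a*b) mod P, then hash it into a symmetric key.
    s = pow(their_pub, my_priv, P)
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def xor_cipher(key, data):
    # Pedagogical keystream cipher: encrypting and decrypting are the same step.
    stream = hashlib.shake_256(key).digest(len(data))
    return bytes(a ^ b for a, b in zip(data, stream))

a_priv, a_pub = keypair()   # sender
b_priv, b_pub = keypair()   # recipient
key = shared_key(a_priv, b_pub)
assert key == shared_key(b_priv, a_pub)   # both endpoints derive the same key

ciphertext = xor_cipher(key, b"meet at noon")      # all the relay ever sees
assert xor_cipher(key, ciphertext) == b"meet at noon"
```

Only the public values travel over the wire; the private keys, and therefore the shared key, never leave the endpoints. That is why a mandate to scan message content cannot be satisfied without breaking this design.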
The central importance of encryption to privacy is described in an amicus brief by the American Civil Liberties Union, the Center for Democracy and Technology, the Electronic Frontier Foundation, Mozilla, and several other activist groups and corporations. They stand in opposition to a preliminary injunction request by Nevada Attorney General Aaron Ford in his lawsuit to stop Meta from launching a new encrypted version of its Messenger app, ostensibly because it would pose a new threat to the safety of children. The facts are on Meta’s side. End-to-end encryption has been an optional feature of Messenger for eight years. Attorney General Ford ignores the host of other encrypted services millions of Americans use, singling out Meta as a test case. If he were to succeed in breaking open Messenger’s encryption, the attorney general would in essence be setting a precedent for the nation, maybe even for the world. The clear and passionate language of the civil liberties amicus brief gets to the heart of what is at stake: “Society has long recognized that people thrive when we have the ability to engage in private, unmonitored conversations. Sharing confidences enables people to form friendships and intimate relationships, obtain information about sensitive matters, and construct different identities depending on the audience. We know this from our own lives, whether engaging in pillow talk, meeting a friend for a walk, or forming an invitation-only club. Important, human things happen when we can be confident that no one is listening in.” Nothing about end-to-end encryption prevents law enforcement from accessing the message from either the recipient or the sender. But preventing companies from providing security, as the Nevada AG seeks to do, creates security risks from bad actors, including both criminals and government officials who would abuse their power by illegally accessing messages. 
The brief quotes respected child-protection organizations on how encrypted channels protect children from violent family members, stalkers, and predators, as well as children in parts of the world torn by armed conflict. Hackers in 2015 stole five million customer records from a children’s technology and toy firm, including sensitive information. Because the firm’s parent-child chats were unencrypted, the leak gave criminals the names, ages, and addresses of millions of children.

The amici also write of how onslaughts against privacy, like that of the Nevada Attorney General, break with American tradition. They write: “In any other era, a claim that government may obligate us to record and preserve our conversations, just in case investigators wanted to review them later, would be laughably ridiculous. It would simply have been beyond the pale to suggest that people could be required to record their conversations in a language that law enforcement could readily understand and access. Basic conversational privacy was assumed, and rightly so.”

Legal precedent is also on the side of civil liberties. The Ninth Circuit recognized encryption’s importance “to reclaim some portion of the privacy we have lost” 25 years ago in rejecting the U.S. government’s export restrictions on strong cryptography. (See EFF on Bernstein v. Department of Justice.)

While advocates of privacy have a solid chance of prevailing in state court in Las Vegas, encryption is endangered across the pond by Section 122 of the United Kingdom’s Online Safety Act, passed in late 2023. The law requires companies to use technology that would scan users’ messages to make sure they are not transmitting illegal content, like Child Sexual Abuse Material. Doing this without breaking end-to-end encryption is currently impossible. The UK’s internet regulator, Ofcom, has relented in requiring content monitoring, for now, for the simple reason that such technology does not yet exist.
With developments in AI, however, it might come sooner than we think. Matthew Hodgson, CEO of Element, told WIRED that such scanning tech would undermine encryption and provide “a mechanism where bad actors of any kind could compromise the scanning system in order to steal the data flying around the place.” Anti-encryption regulators need only win in one jurisdiction to threaten the viability of encryption globally – endangering everyone from women and children hiding from abusive situations to dissidents living under dictatorships. From London to Las Vegas, encryption – and privacy – are at risk.

A federal court has given the go-ahead for a lawsuit filed by Just Futures Law and Edelson PC against Western Union for its involvement in a dragnet surveillance program called the Transaction Record Analysis Center (TRAC).
Since 2022, PPSA has followed revelations about a unit of the Department of Homeland Security that accesses bulk data on Americans’ money wire transfers above $500. TRAC is the central clearinghouse for this warrantless information, recording wire transfers sent or received in Arizona, California, New Mexico, Texas, and Mexico. These personal financial transactions – almost 150 million records – are then made available to more than 600 law enforcement agencies, all without a warrant. Much of what we know about TRAC was unearthed by a joint investigation between the ACLU and Sen. Ron Wyden (D-OR). In 2023, Gene Schaerr, PPSA general counsel, said: “This purely illegal program treats the Fourth Amendment as a dish rag.”

Now a federal judge in Northern California has determined that the plaintiffs in Just Futures Law’s case allege plausible violations of California laws protecting the privacy of sensitive financial records. This is the first time a court has weighed in on the lawfulness of the TRAC program. We eagerly await revelations and a spirited challenge to this secretive program.

The TRAC intrusion into Americans’ personal finances is by no means the only way the government spies on the financial activities of millions of innocent Americans. In February, a House investigation revealed that the U.S. Treasury’s Financial Crimes Enforcement Network (FinCEN) has worked with some of the largest banks and private financial institutions to spy on citizens’ personal transactions. Law enforcement and private financial institutions shared customers’ confidential information through a web portal that connects the federal government to 650 companies that together account for two-thirds of U.S. gross domestic product and employ 35 million people. TRAC is justified as being about the border and the activities of cartels, but it sweeps in the transactions of millions of Americans sending payments from one U.S. state to another.
FinCEN set out to track the financial activities of political extremists, but it pulls in the personal information of millions of Americans who have done nothing remotely suspicious. Groups on the left tend to be more concerned about TRAC, while groups on the right, led by House Judiciary Chairman Jim Jordan, are more concerned about the mass extraction of personal bank account information. The great thing about civil liberties groups today is their ability to look beyond ideological silos and work together as a coalition to protect the rights of all. For that reason, PPSA looks forward to reporting and blasting out what is revealed about TRAC as this case unfolds in open court. Any revelations from this case should sink in on both sides of the aisle in Congress, informing the debate over America’s growing surveillance state.

The reform coalition on Capitol Hill remains determined to add strong amendments to Section 702 of the Foreign Intelligence Surveillance Act (FISA). But will they get the chance before the April 19 deadline for FISA Section 702’s reauthorization?
There are several possible scenarios as this deadline approaches. One of them might be a vote on the newly introduced “Reforming Intelligence and Securing America” (RISA) Act. This bill is a good-faith effort to represent the narrow band of changes that the pro-reform House Judiciary Committee and the status quo-minded House Permanent Select Committee on Intelligence could agree upon. But is it enough? RISA is deeply lacking because it leaves out two key reforms.
The bill does include a role for amici curiae, specialists in civil liberties who would act as advisors to the secret FISA court. RISA, however, would limit the issues these advisors could address, falling well short of the intent of the Senate when it voted 77-19 in 2020 to approve the robust amici provisions of the Lee-Leahy amendment. For all these reasons, reformers should see RISA as a floor, not as a ceiling, as the Section 702 showdown approaches. The best solution to the current impasse is to stop denying Members of Congress the opportunity for a straight up-or-down vote on reform amendments.

How to Tell if You are Being Tracked

Car companies are collecting massive amounts of data about your driving – how fast you accelerate, how hard you brake, and any time you speed. These data are then analyzed by LexisNexis or another data broker, to be parsed and sold to insurance companies. As a result, many drivers with clean records are surprised by sudden, large increases in their car insurance payments.
Kashmir Hill of The New York Times reports the case of a Seattle man whose insurance rates skyrocketed, only to discover that this was the result of LexisNexis compiling hundreds of pages on his driving habits. This is yet another feature of the dark side of the internet of things, the always-on, connected world we live in. For drivers, internet-enabled services like navigation, roadside assistance, and car apps are also 24-7 spies on our driving habits. We consent to this, Hill reports, “in fine print and murky privacy policies that few read.” One researcher at Mozilla told Hill that it is “impossible for consumers to try and understand” policies chock-full of legalese. The good news is that technology can make the data gathered on our driving habits as transparent to us as we are to car and insurance companies. Hill advises:
What you cannot do, however, is file a report with the FBI, IRS, the Department of Homeland Security, or the Pentagon to see if government agencies are also purchasing your private driving data. Given that these federal agencies purchase nearly every electron of our personal data, scraped from apps and sold by data brokers, they may well have at their fingertips the ability to know what kind of driver you are. Unlike the private snoops, these federal agencies are also collecting your location histories, where you go, and by inference, who you meet for personal, religious, political, or other reasons. All this information about us can be accessed and reviewed at will by our government, no warrant needed. That is all the more reason to support the inclusion of the principles of the Fourth Amendment Is Not for Sale Act in the reauthorization of the FISA Section 702 surveillance policy. While Congress debates adding reforms to FISA Section 702 that would curtail the sale of Americans’ private, sensitive digital information to federal agencies, the Federal Trade Commission is already cracking down on companies that sell data, including their sales of “location data to government contractors for national security purposes.”
The FTC’s words are backed by serious action. In January, the FTC announced proposed settlements with two data aggregators, X-Mode Social and InMarket, for collecting consumers’ precise location data scraped from mobile apps. X-Mode, which can assimilate 10 billion location data points and link them to timestamps and unique persistent identifiers, was targeted by the FTC for selling location data to private government contractors without consumers’ consent. In February, the FTC announced a proposed settlement with Avast, a security software company that sold “consumers’ granular and re-identifiable browsing information” harvested through Avast’s antivirus software and browser extensions.

What is the legal basis for the FTC’s action? The agency seems to be relying on Section 5 of the Federal Trade Commission Act, which grants the FTC power to investigate and prevent deceptive trade practices. In the case of X-Mode, the FTC’s proposed complaint highlights X-Mode’s statement that its location data would be used solely for “ad personalization and location-based analytics.” The FTC alleges X-Mode failed to inform consumers that it “also sold their location data to government contractors for national security purposes.” The FTC’s evolving doctrine seems even more expansive, weighing the stated purpose of data collection and handling against its actual use. In a recent blog, the FTC declares: “Helping people prepare their taxes does not mean tax preparation services can use a person’s information to advertise, sell, or promote products or services. Similarly, offering people a flashlight app does not mean app developers can collect, use, store, and share people’s precise geolocation information. 
The law and the FTC have long recognized that a need to handle a person’s information to provide them a requested product or service does not mean that companies are free to collect, keep, use, or share that person’s information for any other purpose – like marketing, profiling, or background screening.” What is at stake for consumers? “Browsing and location data paint an intimate picture of a person’s life, including their religious affiliations, health and medical conditions, financial status, and sexual orientation.” If these cases go to court, the tech industry will argue that consumers don’t sign away rights to their private information when they sign up for tax preparation – but we all do that routinely when we accept the terms and conditions of our apps and favorite social media platforms. The FTC’s logic points to the common understanding that our data is collected for the purpose of selling us an ad, not handing over our private information to the FBI, IRS, and other federal agencies. The FTC is edging into the arena of the Fourth Amendment Is Not for Sale Act, which targets government purchases and warrantless inspection of Americans’ personal data. The FTC’s complaints are, for the moment, based on legal theory untested by the courts. If Congress attaches similar reforms to the reauthorization of FISA Section 702, it would be a clear and hard-to-reverse protection of Americans’ privacy and constitutional rights.

Ken Blackwell, former ambassador and mayor of Cincinnati, has a conservative resume second to none. He is now a senior fellow of the Family Research Council and chairman of the Conservative Action Project, which organizes elected conservative leaders to act in unison on common goals.
So when Blackwell writes an open letter in Breitbart to Speaker Mike Johnson warning him not to try to reauthorize FISA Section 702 in a spending bill – which would terminate all debate about reforms to this surveillance authority – you can be sure that Blackwell was heard.
“The number of FISA searches has skyrocketed with literally hundreds of thousands of warrantless searches per year – many of which involve Americans,” Blackwell wrote. “Even one abuse of a citizen’s constitutional rights must not be tolerated. When that number climbs into the thousands, Congress must step in.”

What makes Blackwell’s appeal to Speaker Johnson unique is that he went beyond citing the reform efforts of conservative stalwarts such as House Judiciary Committee Chairman Jim Jordan and Rep. Andy Biggs of the Freedom Caucus. Blackwell also cited the support of the committee’s Ranking Member, Rep. Jerry Nadler, and Rep. Pramila Jayapal, who heads the House Progressive Caucus. Blackwell wrote: “Liberal groups like the ACLU support reforming FISA, joining forces with conservative civil rights groups. This reflects a consensus almost unseen on so many other important issues of our day. Speaker Johnson needs to take note of that as he faces pressure from some in the intelligence community and their overseers in Congress, who are calling for reauthorizing this controversial law without major reforms and putting that reauthorization in one of the spending bills that will work its way through Congress this month.”

That is sound advice for all Congressional leaders on Section 702, whichever side of the aisle they are on. In December, members of this left-right coalition joined together to pass reform measures out of the House Judiciary Committee by an overwhelming margin of 35 to 2. This reform coalition is wide-ranging, its commitment is deep, and it is not going to allow a legislative maneuver to deny Members their right to a debate.

U.S. Treasury and FBI Targeted Americans for Political Beliefs

The House Judiciary Committee and its Select Subcommittee on the Weaponization of the Federal Government issued a report on Wednesday revealing secretive efforts between federal agencies and U.S. 
private financial institutions that “show a pattern of financial surveillance aimed at millions of Americans who hold conservative viewpoints or simply express their Second Amendment rights.”
At the heart of this conspiracy are the U.S. Treasury Department’s Financial Crimes Enforcement Network (FinCEN) and the FBI, which oversaw secret investigations with the help of the largest U.S. banks and financial institutions. They did not lack for resources. Law enforcement and private financial institutions shared customers’ confidential information through a web portal that connects the federal government to 650 companies that together account for two-thirds of U.S. gross domestic product and employ 35 million people.

This dragnet investigation grew out of the aftermath of the Jan. 6 riot in the U.S. Capitol, but it quickly widened to target the financial transactions of anyone suspiciously MAGA or conservative. Last year we reported on how Bank of America volunteered the personal information of any customer who used an ATM card in the Washington, D.C., area around the time of the riot. In this newly revealed effort, the FBI asked financial services companies to sweep their databases for digital transactions with keywords like “MAGA” and “Trump.” FinCEN also advised companies on how to use Merchant Category Codes (MCCs) to search through transactions to detect potential “extremists.” Keywords attached to suspicious transactions included the names of sporting-goods retailers Cabela’s, Bass Pro Shops, and Dick’s Sporting Goods.
The committee observed: “Americans doing nothing other than shopping or exercising their Second Amendment rights were being tracked by financial institutions and federal law enforcement.” FinCEN also targeted conservative organizations like the Alliance Defending Freedom and the Eagle Forum after they were branded “hate groups” by a left-leaning London organization, the Institute for Strategic Dialogue. The committee report added: “FinCEN’s incursion into the crowdfunding space represents a trend in the wrong direction and a threat to American civil liberties.”

One doesn’t have to condone the breaching of the Capitol and the attacks on Capitol police to see the threat of a dragnet approach that lacked even a nod to the concept of individualized probable cause. What was done by the federal government to millions of ordinary American conservatives could also be done to millions of liberals for using terms like “racial justice” in the aftermath of the riots that followed the murder of George Floyd. These dragnets are general warrants, exactly the kind of sweeping, indiscriminate violations of privacy that prompted this nation’s founders to enact the Fourth Amendment. If government agencies cannot satisfy even the low hurdle of probable cause in an application for a warrant, they are apt to be making things up or employing scare tactics. If left uncorrected, financial dragnets like these will support a default rule in which every citizen is automatically a suspect, especially if the government doesn’t like your politics.

The growth of the surveillance state in Washington, D.C., is coinciding with a renewed determination by federal agencies to expose journalists’ notes and sources. Recent events show how our Fourth Amendment right against unreasonable searches and seizures and our First Amendment right of a free press are inextricable and mutually reinforcing – that if you degrade one of these rights, you threaten both of them.
In May, the FBI raided the home of journalist Tim Burke, seizing his computer, hard drives, and cellphone, after he reported on embarrassing outtakes of a Fox News interview. It turns out these outtakes had already been posted online. Warrants were obtained, but on what credible allegation of probable cause?

Or consider CBS News senior correspondent Catherine Herridge, who was laid off, then days later ordered by a federal judge to reveal the identity of a confidential source she used for a series of 2017 stories published while she worked at Fox News. Shortly afterwards, Herridge was held in contempt for refusing to divulge that source. This raises a question: when CBS terminated Herridge and seized her files, would network executives have been willing to put their freedom on the line as Herridge has done? In response to public outcry, CBS relented and handed Herridge’s notes back to her. But local journalists cannot count on generating the national attention and sympathy that a celebrity journalist can.

Now add to this vulnerability the reality that every American who is online – whether a national correspondent or a college student – has his or her sensitive and personal information sold to more than a dozen federal agencies by data brokers, a $250 billion industry that markets our data in the shadows. The sellers of our privacy compile nearly limitless data dossiers that “reveal the most intimate details of our lives, our movements, habits, associations, health conditions, and ideologies.” Data brokers have established a sophisticated system to aggregate data from nearly every platform and device that records personal information to develop detailed profiles on individuals. To fill in the blanks, they also sweep up information from public records. So if you have a smartphone, use apps, or search online, your life is already an open book to the government.
In this way, state and federal intelligence and law enforcement agencies can use the data broker loophole to obtain information about Americans that would otherwise require a warrant, court order, or subpoena. Now imagine what might happen as these two trends converge – a government hungry to expose journalists’ sources, but one that also has access to a journalist’s location history, as well as everyone they have called, texted, and emailed. It is hardly paranoid, then, to worry that when a prosecutor tries to compel a journalist to give up a source through legal means, purchased data may have already given the government a road map on what to seek.

The combined threat to privacy from pervasive surveillance and prosecutors seeking journalists’ notes is serious and growing. This is why PPSA supports legislation to protect journalistic privacy and close the data broker loophole. The Protect Reporters from Exploitative State Spying, or PRESS, Act would grant a privilege to protect confidential news sources in federal legal proceedings, while offering reasonable exceptions for extreme situations. Such “shield laws” have been put into place in 49 states. The PRESS Act, which passed the House in January with unanimous, bipartisan support, would bring the federal government in line with the states. Likewise, the Fourth Amendment Is Not For Sale Act would close the data broker loophole and require the government to obtain a warrant before it can seize our personal information, as required by the Fourth Amendment of the U.S. Constitution. The House Judiciary Committee voted to advance the Fourth Amendment Is Not For Sale Act out of committee with strong bipartisan support in July. The Judiciary Committee also reported out a strong data broker loophole closure as part of the Protect Liberty Act in December. Now, it’s up to Congress to include these protection and reform measures in the reauthorization of Section 702.
PPSA urges lawmakers to pass measures to protect privacy and a free press. They will rise or fall together.

The Biden Administration has placed the people, the industry, and the national security of the United States on the edge of a cyber cliff – and is threatening to push us all off.
Does that sound alarmist? Consider: Wikipedia brings together thousands of volunteers to curate a free, online encyclopedia about – well, everything – including the policies and personalities of repressive, homicidal regimes from Russia, to China, to North Korea. Over the last decade, the Wikimedia Foundation, the non-profit that hosts Wikipedia, has received increasing requests to hand over user data to governments and wealthy individuals. These foreign appeals not only seek to bowdlerize accurate information and censor editorial content; they also ask for personal data that would enable retaliation against the volunteers who edit Wikipedia.

On one level, this is almost funny. Dictators and cartel bosses who rule by terror at home are reduced to making polite requests to the Wikimedia Foundation because the current system denies them local access to Wikipedia data. The architecture of an open internet, which forbids forced data localization, thus throws up roadblocks for malevolent foreign interests that would access Americans’ personal information online.

Now Americans’ privacy and the security of U.S. data are completely at risk because of U.S. Trade Representative Katherine Tai’s astonishing withdrawal of support for the underpinnings of a global internet before the World Trade Organization. Tai’s move leaves the Biden Administration moving in opposite directions at once. With one hand, the administration recently issued an executive order cracking down on the sale of Americans’ personal data by data brokers to foreign “countries of concern.” With the other hand – the president’s trade representative – the U.S. offered to drop its long-standing opposition to forced data localization and to forced transfers of American tech companies’ algorithms to governments around the world. Tai would hand the keys to America’s digital kingdom to more than 80 countries, including China.
It is not only Americans who will be at risk, but political dissidents and religious minorities around the world. “Growing requirements for data localization are happening alongside a global crackdown on free expression,” wrote the American Civil Liberties Union, the Center for Democracy & Technology, Freedom House, the Information Technology and Innovation Foundation, the Internet Society, PEN America, and the Wikimedia Foundation. “And people’s personal data – which can reveal who they voted for, who they worship, and who they love – can help facilitate this … 78 percent of the world’s internet users live in countries where simply expressing political, social, and religious viewpoints leads to legal repercussions.”

The Biden Administration’s forced disclosure of source code will undermine the national and personal security of our country. Why? And for what? We are not sure, but it is clear that this move would put all Americans’ privacy and personal security at risk.

David Pierce has an insightful piece in The Verge demonstrating the latest example of why every improvement in online technology leads to yet another privacy disaster.
He writes about an experiment by OpenAI to make ChatGPT “feel a little more personal and a little smarter.” The company is now allowing some users to add memory to personalize the AI chatbot. The result? Pierce writes that “the idea of ChatGPT ‘knowing’ users is both cool and creepy.” OpenAI says users will remain in control of ChatGPT’s memory and will be able to tell it to remove something it knows about them. It won’t remember sensitive topics like your health issues. And it has a temporary chat mode without memory. Credit goes to OpenAI for anticipating the privacy implications of a new technology, rather than blundering ahead, like so many other technologists, to see what breaks.

OpenAI’s memory experiment is just another sign of how intimate technology is becoming. The ultimate example of online AI intimacy is, of course, the so-called “AI girlfriend” or “AI boyfriend” – the artificial romantic partner. Jen Caltrider of Mozilla’s Privacy Not Included team told Wired that romantic chatbots, some owned by companies that can’t even be located, “push you toward role-playing, a lot of sex, a lot of intimacy, a lot of sharing.” When the researchers tested one such app, they found it “sent out 24,354 ad trackers within one minute of use.” We would add that data from those ads could be sold to the FBI, the IRS, or perhaps a foreign government.

The first wave of people whose lives will be ruined by AI chatbots will be the lonely and the vulnerable. It is only a matter of time before sophisticated chatbots become ubiquitous sidekicks, as portrayed in so much near-term science fiction. It will soon become all too easy to trust a friendly and helpful voice without realizing the many eyes and ears behind it.