The reauthorization of FISA Section 702, which allows federal agencies to conduct international surveillance for national security purposes, has languished in Congress like an old Spanish galleon caught in the doldrums. This happened after opponents of reform pulled Section 702 reauthorization from the House floor rather than risk losing votes on popular measures, such as requiring government agencies to obtain warrants before surveilling Americans’ communications.
But the winds are no longer becalmed. They are picking up – and coming from the direction of reform. Sen. Dick Durbin (D-IL), Chairman of the Senate Judiciary Committee, and fellow committee member Sen. Mike Lee (R-UT), today introduced the Security and Freedom Enhancement (SAFE) Act. This bill requires the government to obtain warrants or court orders before federal agencies can access Americans’ personal information, whether collected under Section 702-authorized programs or purchased from data brokers. Enacted by Congress to enable surveillance of foreign targets in foreign lands, Section 702 is used by the FBI and other federal agencies to justify domestic spying. According to the Foreign Intelligence Surveillance Act (FISA) Court, government “batch” searches under Section 702 have included a sitting U.S. Congressman, a U.S. Senator, journalists, political commentators, a state senator, and a state judge who reported civil rights violations by a local police chief to the FBI. It has even been used by government agents to stalk online romantic prospects. Millions of Americans in recent years have had their communications compromised by programs under Section 702. The reforms of the SAFE Act promise to reverse this trend, protecting Americans’ privacy and constitutional rights from the government. The SAFE Act requires:
Durbin-Lee is a pragmatic bill. It lifts warrant and other requirements in emergency circumstances. The SAFE Act allows the government to obtain consent for surveillance if the subject of the search is a potential victim or target of a foreign plot. It allows queries designed to identify targets of cyberattacks, where the only content accessed and reviewed is malicious software or cybersecurity threat signatures. The SAFE Act is a good-faith effort to strike a balance between national security and Americans’ privacy. It should break the current stalemate, renewing the push for debate and votes on amendments to the reauthorization of Section 702.

How to Tell if You are Being Tracked

Car companies are collecting massive amounts of data about your driving – how fast you accelerate, how hard you brake, and any time you speed. These data are then compiled by LexisNexis or another data broker, parsed, and sold to insurance companies. As a result, many drivers with clean records are surprised with sudden, large increases in their car insurance payments.
Kashmir Hill of The New York Times reports the case of a Seattle man whose insurance rates skyrocketed, only to discover that this was the result of LexisNexis compiling hundreds of pages on his driving habits. This is yet another feature of the dark side of the internet of things, the always-on, connected world we live in. For drivers, internet-enabled services like navigation, roadside assistance, and car apps are also 24-7 spies on our driving habits. We consent to this, Hill reports, “in fine print and murky privacy policies that few read.” One researcher at Mozilla told Hill that it is “impossible for consumers to try and understand” policies chock-full of legalese. The good news is that technology can make this data gathering as transparent to us as we are to car and insurance companies. Hill advises:
What you cannot do, however, is file a report with the FBI, IRS, the Department of Homeland Security, or the Pentagon to see if government agencies are also purchasing your private driving data. Given that these federal agencies purchase nearly every electron of our personal data, scraped from apps and sold by data brokers, they may well have at their fingertips the ability to know what kind of driver you are. Unlike the private snoops, these federal agencies are also collecting your location histories, where you go, and by inference, who you meet for personal, religious, political, or other reasons. All this information about us can be accessed and reviewed at will by our government, no warrant needed. That is all the more reason to support the inclusion of the principles of the Fourth Amendment Is Not for Sale Act in the reauthorization of the FISA Section 702 surveillance policy. While Congress debates adding reforms to FISA Section 702 that would curtail the sale of Americans’ private, sensitive digital information to federal agencies, the Federal Trade Commission is already cracking down on companies that sell data, including their sales of “location data to government contractors for national security purposes.”
The FTC’s words follow serious action. In January, the FTC announced proposed settlements with two data aggregators, X-Mode Social and InMarket, for collecting consumers’ precise location data scraped from mobile apps. X-Mode, which can assimilate 10 billion location data points and link them to timestamps and unique persistent identifiers, was targeted by the FTC for selling location data to private government contractors without consumers’ consent. In February, the FTC announced a proposed settlement with Avast, a security software company that sold “consumers’ granular and re-identifiable browsing information” gathered through Avast’s antivirus software and browser extensions. What is the legal basis for the FTC’s action? The agency seems to be relying on Section 5 of the Federal Trade Commission Act, which grants the FTC power to investigate and prevent deceptive trade practices. In the case of X-Mode, the FTC’s proposed complaint highlights X-Mode’s statement that their location data would be used solely for “ad personalization and location-based analytics.” The FTC alleges X-Mode failed to inform consumers that X-Mode “also sold their location data to government contractors for national security purposes.” The FTC’s evolving doctrine seems even more expansive, weighing the stated purpose of data collection and handling against its actual use. In a recent blog, the FTC declares: “Helping people prepare their taxes does not mean tax preparation services can use a person’s information to advertise, sell, or promote products or services. Similarly, offering people a flashlight app does not mean app developers can collect, use, store, and share people’s precise geolocation information.
The law and the FTC have long recognized that a need to handle a person’s information to provide them a requested product or service does not mean that companies are free to collect, keep, use, or share that person’s information for any other purpose – like marketing, profiling, or background screening.” What is at stake for consumers? “Browsing and location data paint an intimate picture of a person’s life, including their religious affiliations, health and medical conditions, financial status, and sexual orientation.” If these cases go to court, the tech industry will argue that consumers don’t sign away rights to their private information when they sign up for tax preparation – but we all do that routinely when we accept the terms and conditions of our apps and favorite social media platforms. The FTC’s logic points to the common understanding that our data is collected for the purpose of selling us an ad, not handing over our private information to the FBI, IRS, and other federal agencies. The FTC is edging into the arena of the Fourth Amendment Is Not for Sale Act, which targets government purchases and warrantless inspection of Americans’ personal data. The FTC’s complaints are, for the moment, based on legal theory untested by courts. If Congress attaches similar reforms to the reauthorization of FISA Section 702, it would be a clear and hard-to-reverse protection of Americans’ privacy and constitutional rights.

Ken Blackwell, former ambassador and mayor of Cincinnati, has a conservative resume second to none. He is now a senior fellow of the Family Research Council and chairman of the Conservative Action Project, which organizes elected conservative leaders to act in unison on common goals.
So when Blackwell writes an open letter in Breitbart to Speaker Mike Johnson warning him not to try to reauthorize FISA Section 702 in a spending bill – which would terminate all debate about reforms to this surveillance authority – you can be sure that Blackwell was heard.
“The number of FISA searches has skyrocketed with literally hundreds of thousands of warrantless searches per year – many of which involve Americans,” Blackwell wrote. “Even one abuse of a citizen’s constitutional rights must not be tolerated. When that number climbs into the thousands, Congress must step in.” What makes Blackwell’s appeal to Speaker Johnson unique is that he went beyond citing the reform efforts of conservative stalwarts such as House Judiciary Committee Chairman Jim Jordan and Rep. Andy Biggs of the Freedom Caucus. Blackwell also cited the support of the committee’s Ranking Member, Rep. Jerry Nadler, and Rep. Pramila Jayapal, who heads the Congressional Progressive Caucus. Blackwell wrote: “Liberal groups like the ACLU support reforming FISA, joining forces with conservative civil rights groups. This reflects a consensus almost unseen on so many other important issues of our day. Speaker Johnson needs to take note of that as he faces pressure from some in the intelligence community and their overseers in Congress, who are calling for reauthorizing this controversial law without major reforms and putting that reauthorization in one of the spending bills that will work its way through Congress this month.” That is sound advice for all Congressional leaders on Section 702, whichever side of the aisle they are on. In December, members of this left-right coalition joined together to pass reform measures out of the House Judiciary Committee by an overwhelming margin of 35 to 2. This reform coalition is wide-ranging, its commitment is deep, and it is not going to allow a legislative maneuver to deny Members their right to a debate.

U.S. Treasury and FBI Targeted Americans for Political Beliefs

The House Judiciary Committee and its Select Subcommittee on the Weaponization of the Federal Government issued a report on Wednesday revealing secretive efforts between federal agencies and U.S.
private financial institutions that “show a pattern of financial surveillance aimed at millions of Americans who hold conservative viewpoints or simply express their Second Amendment rights.”
At the heart of this conspiracy is the U.S. Treasury Department’s Financial Crimes Enforcement Network (FinCEN) and the FBI, which oversaw secret investigations with the help of the largest U.S. banks and financial institutions. They did not lack for resources. Law enforcement and private financial institutions shared customers’ confidential information through a web portal that connects the federal government to 650 companies that account for two-thirds of U.S. gross domestic product and employ 35 million people. This dragnet investigation grew out of the aftermath of the Jan. 6 riot in the U.S. Capitol, but it quickly widened to target the financial transactions of anyone suspiciously MAGA or conservative. Last year we reported on how the Bank of America volunteered the personal information of any customer who used an ATM card in the Washington, D.C., area around the time of the riot. In this newly revealed effort, the FBI asked financial services companies to sweep their databases for digital transactions with keywords like “MAGA” and “Trump.” FinCEN also advised companies how to use Merchant Category Codes (MCC) to search through transactions to detect potential “extremists.” Flagged terms attached to suspicious transactions included the names of outdoor and sporting-goods retailers Cabela’s, Bass Pro Shops, and Dick’s Sporting Goods.
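The kind of keyword and MCC matching described above amounts to a crude filter run over millions of transaction records. A minimal sketch illustrates how indiscriminate it is – the transactions and the flagging rule here are invented for illustration, though MCC 5941 is in fact the standard code for sporting goods stores:

```python
# Hypothetical sketch of keyword/MCC transaction filtering.
# The data and rule are invented; no actual FinCEN criteria are known
# beyond the keywords and category codes described in the report.

FLAGGED_MCCS = {"5941"}               # 5941 = sporting goods stores
FLAGGED_KEYWORDS = {"maga", "trump"}  # search terms reported to the committee

def flag_transaction(tx: dict) -> bool:
    """Return True if a transaction matches the crude dragnet criteria."""
    if tx.get("mcc") in FLAGGED_MCCS:
        return True
    memo = tx.get("memo", "").lower()
    return any(kw in memo for kw in FLAGGED_KEYWORDS)

transactions = [
    {"memo": "Cabela's order", "mcc": "5941"},    # flagged: sporting goods MCC
    {"memo": "Weekly groceries", "mcc": "5411"},  # not flagged
    {"memo": "MAGA hat", "mcc": "5699"},          # flagged: keyword match
]

flagged = [tx for tx in transactions if flag_transaction(tx)]
```

Two of the three ordinary purchases are swept up, with no individualized suspicion anywhere in the process.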
The committee observed: “Americans doing nothing other than shopping or exercising their Second Amendment rights were being tracked by financial institutions and federal law enforcement.” FinCEN also targeted conservative organizations like the Alliance Defending Freedom and the Eagle Forum after they were branded “hate groups” by a left-leaning London organization, the Institute for Strategic Dialogue. The committee report added: “FinCEN’s incursion into the crowdfunding space represents a trend in the wrong direction and a threat to American civil liberties.” One doesn’t have to condone the breaching of the Capitol and attacks on Capitol police to see the threat of a dragnet approach that lacked even a nod to the concept of individualized probable cause. What was done by the federal government to millions of ordinary American conservatives could also be done to millions of liberals for using terms like “racial justice” in the aftermath of the riots that occurred after the murder of George Floyd. These dragnets are general warrants, exactly the kind of sweeping, indiscriminate violations of privacy that prompted this nation’s founders to enact the Fourth Amendment. If government agencies cannot satisfy the low hurdle of probable cause in an application for a warrant, they are apt to be making things up or employing scare tactics. If left uncorrected, financial dragnets like these will support a default rule in which every citizen is automatically a suspect, especially if the government doesn’t like your politics.

When we covered a Michigan couple suing their local government for sending a drone over their property to prove a zoning violation, we asked if there are any legal limits to aerial surveillance of your backyard.
In this case before the Michigan Supreme Court, Maxon v. Long Lake Township, counsel for the local government said that the right to inspect our homes goes all the way to space. He described the imaging capability of Google Earth satellites, asking: “If you want to know whether it’s 50 feet from this house to this barn, or 100 feet from this house to this barn, you do that right on the Google satellite imagery. And so given the reality of the world we live in, how can there be a reasonable expectation of privacy in aerial observations of property?” One justice reacted to the implicit concession that if Google Earth could map a backyard as closely and intimately as a drone, that too would be a search. “Technology is rapidly changing,” the justice responded. “I don’t think it is hard to predict that eventually Google Earth will have that capacity.” Now William J. Broad of The New York Times reports that we’re well beyond Google Earth’s imaging of barns and houses. Try dinner plates and forks. Albedo Space of Denver is making a fleet of 24 small, low-orbit satellites that will use imagery to guide responders in disasters, such as wildfires and other public emergencies. It will improve the current commercial standard of satellite imaging from a resolution of about a foot to about four inches. A former CIA official with decades of satellite experience told Broad that it will be a “big deal” when people realize that anything they are trying to hide in their backyards will be visible. Skinny-dipping in the pool will only be for the supremely confident. To his credit, Albedo chief Topher Haddad said, “we’re acutely aware of the privacy implications,” promising that management will be selective in their choice of clients on a case-by-case basis. It is good to know that Albedo likely won’t be using its technology to catch zoning violators or backyard sunbathers. We’ve seen, however, that what is cutting-edge technology today will be standard tomorrow.
This is just one more way in which the velocity of technology is outpacing our ability to adjust. There is, of course, one effective response. We can reject the argument of the Michigan town’s counsel, who said, essentially, that privacy is dead and we should just get over it. Courts and Congress should define orbital and aerial surveillance as searches requiring a probable cause warrant, as defined by the Fourth Amendment of the U.S. Constitution, before our homes and backyards can be invaded by eyes from above. The greatest danger to privacy is not that Albedo will allow government snoops to watch us in real time. The real threat is a satellite company’s ability to collect private images by the tens of millions. Such a database could then be sold to the government just as so much commercial digital information is now being sold to the government by data brokers. This is all the more reason for Congress to import the privacy-protecting warrant provisions of the Fourth Amendment Is Not For Sale Act into the reauthorization of FISA Section 702.

In the last century, the surveillance state was held back by the fact that there could never be enough people to watch everybody. Whether with Orwell’s fictional telescreens or the East German Stasi’s apparatus of civilian informants, there could simply never be enough watchers to follow every dissident, let alone enough analysts to put all the watchers’ information together (although the Stasi’s elaborate filing system came as close as humanly possible to omniscience).
Now, of course, AI can do the donkey work. It can decide when a face, or a voice, a word, or a movement, is significant and flag it for a human intelligence officer. AI can weave data from a thousand sources – cell-site simulators, drones, CCTV, purchased digital data, and more – and thereby transform data into information, and information into actionable intelligence. The human and institutional groundwork is already in place to feed AI with intelligence from local, national, and global sources in more than 80 “fusion centers” around the country. These are sites where the National Counterterrorism Center coordinates intelligence from the 17 federal intelligence agencies with local and state law enforcement. FBI, NSA, and Department of Homeland Security intelligence networks get mixed in with intelligence from the locals. If you’ve ever reported something suspicious in response to the “if you see something, say something” ads, a fusion center is where your report goes. With terrorists and foreign threats ever present, it makes sense to share intelligence between agencies, both national and local. But absent clear laws and constitutional limits, we are also building the basics of a full-fledged surveillance state. With no warrant requirements currently in place for federal agencies to inspect Americans’ purchased digital data, there is nothing to stop the fusion of global, national, and local intelligence from a thousand sources into one ever-watchful eye. Step by step, day by day, new technologies, commercial entities, and government agencies add new sources and capabilities to this ever-present surveillance. The latest thread in this weave comes from Axon, the maker of Tasers and body cameras for police.
Axon has just acquired Fusus, which grants more than 250 police “real-time crime centers” access to the camera networks of shopping centers, hospitals, residential communities, houses of worship, schools, and urban environments. Weave that data with fusion centers, and voilà, you are living in a Panopticon – a realm where you are always seen and always heard. To make surveillance even more thorough, Axon’s body cameras are being sold to healthcare and retail facilities to be worn by employees. Be nice to your nurse. Such daily progress in the surveillance state provides all the more reason for the U.S. House in its debate over the reauthorization of FISA Section 702 to include a warrant requirement before the government can freely swim in this ocean of data – our personal information – without restraint.

David Pierce has an insightful piece in The Verge demonstrating the latest example of why every improvement in online technology leads to yet another privacy disaster.
He writes about an experiment by OpenAI to make ChatGPT “feel a little more personal and a little smarter.” The company is now allowing some users to add memory to personalize this AI chatbot. Result? Pierce writes that “the idea of ChatGPT ‘knowing’ users is both cool and creepy.” OpenAI says it will allow users to remain in control of ChatGPT’s memory and be able to tell it to remove something it knows about you. It won’t remember sensitive topics like your health issues. And it has a temporary chat mode without memory. Credit goes to OpenAI for anticipating the privacy implications of a new technology, rather than blundering ahead like so many other technologists to see what breaks. OpenAI’s personal memory experiment is just another sign of how intimate technology is becoming. The ultimate example of online AI intimacy is, of course, the so-called “AI girlfriend or boyfriend” – the artificial romantic partner. Jen Caltrider of Mozilla’s Privacy Not Included team told Wired that romantic chatbots, some owned by companies that can’t be located, “push you toward role-playing, a lot of sex, a lot of intimacy, a lot of sharing.” When researchers tested one such app, they found it “sent out 24,354 ad trackers within one minute of use.” We would add that data from these ads could be sold to the FBI, the IRS, or perhaps a foreign government. The first wave of people whose lives will be ruined by AI chatbots will be the lonely and the vulnerable. It is only a matter of time before sophisticated chatbots become ubiquitous sidekicks, as portrayed in so much near-term science fiction. It will soon become all too easy to trust a friendly and helpful voice, without realizing the many eyes and ears behind it.

Just in time for the Section 702 debate, Emile Ayoub and Elizabeth Goitein of the Brennan Center for Justice have written a concise and easy-to-understand primer on what the data broker loophole is about, why it is so important, and what Congress can do about it.
These authors note that in this age of “surveillance capitalism” – with a $250 billion market for commercial online data – brokers are compiling “exhaustive dossiers” that “reveal the most intimate details of our lives, our movements, habits, associations, health conditions, and ideologies.” This happens because data brokers “pay app developers to install code that siphons users’ data, including location information. They use cookies or other web trackers to capture online activity. They scrape information from public-facing sites, including social media platforms, often in violation of those platforms’ terms of service. They also collect information from public records and purchase data from a wide range of companies that collect and maintain personal information, including app developers, internet service providers, car manufacturers, advertisers, utility companies, supermarkets, and other data brokers.” Armed with all this information, data brokers can easily “reidentify” individuals from supposedly “anonymized” data. This information is then sold to the FBI, IRS, the Drug Enforcement Administration, the Department of Defense, the Department of Homeland Security, and state and local law enforcement. Ayoub and Goitein examine how government lawyers employ legal sophistry to evade a U.S. Supreme Court ruling against the collection of location data, as well as the plain meaning of the U.S. Constitution, to access Americans’ most personal and sensitive information without a warrant. They describe the merits of the Fourth Amendment Is Not For Sale Act, and how it would shut down the flow of “illegitimately obtained information” from companies that scrape photos and data from social media platforms. The latter point is most important. Reformers in the House are working hard to amend FISA Section 702 with provisions from the Fourth Amendment Is Not For Sale Act, to require the government to obtain warrants before inspecting our commercially acquired data.
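The “reidentification” the authors describe is often trivial. Research has long shown that a pseudonymous location trail can be matched to a named person from just its two most-visited places – the inferred home and workplace. A minimal sketch, with every device ID, coordinate, and name invented for illustration:

```python
# Sketch of reidentifying "anonymized" location data.
# All IDs, coordinates, and names below are invented; the technique
# (matching an inferred home/work pair against reference records)
# is well documented in the privacy research literature.

# Broker data: pseudonymous devices with inferred (home, work) locations,
# derived from where a device rests at night and during the workday.
broker_records = {
    "device-8f3a": ("39.7392,-104.9903", "39.7420,-104.9880"),
    "device-c21b": ("40.0150,-105.2705", "40.0176,-105.2797"),
}

# Reference data linking the same (home, work) pairs to named people,
# e.g. from voter files, property records, or employer listings.
reference = {
    ("39.7392,-104.9903", "39.7420,-104.9880"): "Jane Doe",
    ("40.0150,-105.2705", "40.0176,-105.2797"): "John Roe",
}

def reidentify(records, ref):
    """Match each pseudonymous device to a person by its home/work pair."""
    return {device: ref.get(pair, "unknown") for device, pair in records.items()}

identified = reidentify(broker_records, reference)
```

A simple dictionary join is all it takes; no names ever needed to be in the broker’s dataset for the data to identify you.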
While the push is on to require warrants for Americans’ data picked up along with international surveillance, the job will be decidedly incomplete if the government can get around the warrant requirement by simply buying our data. Ayoub and Goitein conclude that Congress must “prohibit government agencies from sidestepping the Fourth Amendment.” Read this paper and go here to call your House Member and let them know that you demand warrants before the government can access our sensitive, personal information.

The word from Capitol Hill is that Speaker Mike Johnson is scheduling a likely House vote on the reauthorization of FISA’s Section 702 this week. We are told that proponents and opponents of surveillance reform will each have an opportunity to vote on amendments to this statute.
It is hard to overstate how important this upcoming vote is for our privacy and the protection of a free society under the law. The outcome may embed warrant requirements in this authority, or it may greatly expand the surveillance powers of the government over the American people. Section 702 enables the U.S. intelligence community to continue to keep a watchful eye on spies, terrorists, and other foreign threats to the American homeland. Every reasonable person wants that, which is why Congress enacted this authority to allow the government to surveil foreign threats in foreign lands. Section 702 authority was never intended to become what it has become: a way to conduct massive domestic surveillance of the American people. Government agencies – with the FBI in the lead – have used this powerful, invasive authority to exploit a backdoor search loophole for millions of warrantless searches of Americans’ data in recent years. In 2021, the secret Foreign Intelligence Surveillance Court revealed that such backdoor searches are used by the FBI to pursue purely domestic crimes. Since then, declassified court opinions and compliance reports reveal that the FBI used Section 702 to examine the data of a House Member, a U.S. Senator, a state judge, journalists, political commentators, 19,000 donors to a political campaign, and to conduct baseless searches of protesters on both the left and the right. NSA agents have used it to investigate prospective romantic partners on dating apps. Any reauthorization of Section 702 must include warrants – with reasonable exceptions for emergency circumstances – before the data of Americans collected under Section 702 or any other search can be queried, as required by the U.S. Constitution. This warrant requirement must include the searching of commercially acquired information, as well as data from Americans’ communications incidentally caught up in the global communications net of Section 702.
The FBI, IRS, Department of Homeland Security, the Pentagon, and other agencies routinely buy Americans’ most personal, sensitive information, scraped from our apps and sold to the government by data brokers. This practice is not authorized by any statute, or subject to any judicial review. Including a warrant requirement for commercially acquired information as well as Section 702 data is critical, otherwise the closing of the backdoor search loophole will merely be replaced by the data broker loophole. If the House declines to impose warrants for domestic surveillance, expect many politically targeted groups to have their privacy and constitutional rights compromised. We cannot miss the best chance we’ll have in a generation to protect the Constitution and what remains of Americans’ privacy. Copy and paste the message below and click here to find your U.S. Representative and deliver it: “Please stand up for my privacy and the Fourth Amendment to the U.S. Constitution: Vote to reform FISA’s Section 702 with warrant requirements, both for Section 702 data and for our sensitive, personal information sold to government agencies by data brokers.”

Government Agencies Pose as Ad Bidders

We’ve long reported on the government’s purchase of Americans’ sensitive and personal information scraped from our apps and sold to federal agencies by third-party data brokers. Closure of this data broker loophole is included in the House Judiciary Committee bill – the Protect Liberty and End Warrantless Surveillance Act – legislation that requires probable cause warrants before the federal government can inspect Americans’ data caught up in foreign intelligence under Section 702 of the Foreign Intelligence Surveillance Act. Of no less importance, the bipartisan Protect Liberty Act also requires warrants for inspection of the huge mass of Americans’ data sold to the government.
Thanks to Ben Lovejoy of 9to5Mac, we now know the magnitude of the need for a legislative solution to this privacy vulnerability. Apple’s 2020 move to require app makers to notify you that you’re being tracked on your iPhone has been thoroughly undermined by a workaround through the technology of device fingerprinting. Add to that Patternz, a commercial spyware that extracts personal information from ads and push notifications so it can be sold. Patternz tracks up to 5 billion users a day, utterly defeating phone-makers’ attempts to protect consumer privacy. How does it work? 404 Media demonstrated that Patternz has deals with myriad small ad agencies to extract information from around 600,000 apps. In a now-deleted video, an affiliate of the company boasted that with this capability, it could track consumers’ locations and movements in real time. After this article was posted, Google acted against one such market participant, while Apple promises a response. But given the robustness of these tools, it is hard to believe that new corporate policies will be effective. That is because technology allows government agencies to pose as ad buyers to turn adware into a global tracking tool that federal agencies – and presumably the intelligence services of other governments – can access at will. Patternz can even install malware for deeper penetration of customers’ phones and their sensitive information. It is almost as insidious as the zero-day malware Pegasus, transforming phones into 24/7 spy devices.

Enter Patrick Eddington, senior fellow of the Cato Institute.
He writes: “If you’re a prospective or current gun owner and you use your smartphone to go to OpticsPlanet to look for a new red dot sight, then go to Magpul for rail and sling adapters for the modern sporting rifle you’re thinking of buying, then mosey on over to LWRC to look at their latest gas piston AR-15 offerings, and finally end up at Ammunition Depot to check out their latest sale on 5.56mm NATO standard rounds, unless those retailers expressly offer you the option ‘Do not sell my personal data’ … all of your online browsing and ordering activity could end up being for sale to a federal law enforcement agency. “Or maybe even the National Security Agency.” The government’s commercial acquisition of Americans’ personal information from data sales contains troubling implications for both left and right – from abortion-rights activists concerned about women being tracked to clinics, to conservatives who care about the implications of this practice for the Second Amendment or free religious expression, to Americans of all stripes who don’t want our personal and political activities monitored in minute detail by the government. In January, the NSA admitted that it buys our personal information without a warrant. The investigative work performed by 404 Media and 9to5Mac should give Members of Congress all the more reason to support the Protect Liberty Act.

We recently celebrated the decision by Amazon to require police to present a warrant before going through the “Request for Assistance” tool to seek video footage from the neighborhood networks of Ring camera owners. We touted this as a significant victory for privacy.
And it is – it effectively neutralized more than 2,300 agreements Amazon had with local police and fire departments to help them obtain private security footage – but it wasn’t quite as big a deal as we first thought. Thanks to reporting from Baylee Bates of KCEN News in Temple, Texas, Amazon’s change is prompting a big yawn from police. Why? A spokeswoman for the Temple Police Department told Bates that officers had found greater success in requests for video footage by making door-to-door contacts. “We have found throughout the years of gathering the security footage that going to residents, business owners, that face-to-face interaction with people has been way more successful for us,” said Megan Price of the Temple PD. A spokesman for the Bell County Sheriff’s Department told Bates that Amazon’s policy change “doesn’t stop us from going individual-to-individual and talking, the way we prefer to do it anyway.” Since the introduction of Ring, customers have for the most part complied with police requests for videos. If someone set fire to a car in our neighborhood, or burgled a house across the street, we would do the same. But the eagerness of most people to grant police access to surveillance video is concerning, as more neighborhoods become “ringed” with surveillance. Three years ago, a Washington Post story quoted a mother in California telling her seven-year-old son, “Every time you ride your bike down this block, there are probably 50 cameras that watch you going past. If you make a bad choice, those cameras will catch you.” We wrote at the time that George Orwell never imagined millions of Ring cameras – and millions of users willing to hand over video when asked. The threat to privacy from neighborhood surveillance is as much about audio recordings as it is video. Sen. 
Edward Markey (D-MA) assessed the risk of a surveillance network in every neighborhood in a letter to Amazon in 2022: “This surveillance system threatens the public in ways that go far beyond abstract privacy invasion: individuals may use Ring devices’ audio recordings to facilitate blackmail, stalking, and other damaging practices. As Ring products capture significant amounts of audio on private and public property adjacent to dwellings with Ring doorbells – including recordings of conversations that people reasonably expect to be private – the public’s right to assemble, move, and converse without being tracked is at risk.” At least you can always step inside, away from the microphones and the cameras, settle into your chair, and let Alexa take over your surveillance. Cato Institute Senior Fellow Patrick Eddington filed a Freedom of Information Act request against the Department of Defense this week asking two questions.
First, what was the scope and duration of Pentagon aerial surveillance deployed over domestic protesters in the last year of the Trump Administration? Second, why is the Biden Administration shielding those records from the public? Among civil liberties critics of the American surveillance state, Patrick Eddington has the pedigree of a highly credentialed practitioner. From 1988 to 1996, Eddington was a military imagery analyst at the CIA’s National Photographic Interpretation Center, where he was officially recognized many times for his work. He’s the real deal. So is former Rep. Adam Kinzinger (R-IL), who knows what he’s talking about when it comes to surveillance aircraft. In Iraq, Kinzinger flew the RC-26 surveillance aircraft, which carries a complement of sensors that record people and objects on video, in both visible and infrared frequencies. Eddington reports that from 2020 to 2021, the Pentagon used RC-26B turboprop planes to surveil American protesters. Many of these aircraft have been deployed widely across the country, from Alabama to Washington State. Kinzinger has written that the RC-26 “could fly fast and low, capturing the signals from thousands of cellphones … With the right coordination, the target could be reached in minutes, not hours.” The Air Force Inspector General reports that the digital data packages Kinzinger referred to were removed from the RC-26s deployed for domestic use. Even so, the sensor package of this fleet still represents an astonishing surveillance capability of Americans on the ground. Writing in The Orange County Register, Eddington asks: Does the Biden Administration’s stonewalling mean it reserves the right to use the fleet against pro-Trump protesters? Or for a new Trump Administration to track anti-Trump protesters? Or for some future president to use against any political enemy? 
Behind the intense rancor in American partisan politics, at least one aspiration unites administrations of both parties – an intent to surveil. PPSA will report on further developments. The first deepfake of this long-anticipated “AI election” happened when a synthetic Joe Biden made robocalls to New Hampshire Democrats urging them not to vote in that presidential primary. “It’s important that you save your vote for the November election,” fake Biden told Democrats. Whoever crafted this trick expected voters to believe that a primary vote would somehow deplete a storehouse of general-election votes.
Around the same time, someone posted AI-generated fake sexual images of pop icon Taylor Swift, prompting calls for laws to curb and punish the use of this technology for harassment. Other artists are calling for protections not of their visage, but of their intellectual property, with paintings and photographs being expropriated as grist for AI’s mill. Members of Congress and state legislators are racing to pass laws to make such tricks and appropriations a crime. It certainly makes sense to criminalize the cheating of voters by making candidates appear to say and do things they would never say or do. But sweeping legislation also poses dangers to the First Amendment rights of Americans, including crackdowns on what is clearly satire – such as an obvious joke image of a politician in the inset behind the “Weekend Update” anchors of Saturday Night Live. Such caution is needed as pressure for legislative action grows with the proliferation of deepfakes. Even among non-celebrities, this technology is used to create sexually abusive material, commit fraud, and harass individuals. According to Control AI, a group concerned about the current trajectory of artificial intelligence, such technology is now widely available. All someone needs to create a compelling deepfake is a photo of you or a short recording of your voice, which most of us have already very helpfully posted online. Control AI claims that an overwhelming 96 percent of deepfake videos are sexually abusive. And they are becoming more common – 13 times as many deepfakes were created in 2023 as in 2022. Meanwhile, only 42 percent of Americans even know what a deepfake is. The day is fast approaching when anyone can create a convincing fake sex tape of a political candidate, or a presidential candidate announcing the suspension of his campaign on the eve of an election, or a fake video of a military general declaring martial law. 
A few weeks ago, a convincing fake video of the Louvre museum in Paris on fire went viral, alarming people around the world. With two billion people poised to vote in major elections around the globe this year, deepfake technology is positioned to brew distrust and wreak some havoc. While the Biden campaign has the resources to quickly refute the endless stream of fake photos and videos, the average American does not. A fake sex tape of a work colleague could burn through the internet before she has a chance to refute it. An AI-generated voice recording could be used to commit fraud, while even a fake photo could do immense damage. And if you thought forcing AI to include a watermark in whatever it produces would solve the problem, think again. Control AI points out that it is simply impossible to create watermarks that cannot be removed easily by AI. Many strategies to stop deepfakes are about as effective as trying to keep kids off their parents’ computer. It is unrealistic to believe we can slow down the evolution of artificial intelligence, as Control AI proposes to do. Certainly America’s enemies can be counted on to use AI to their advantage. Putting AI behind a government lock and key stifles the massive innovation that AI promises to bring, hands a technological edge to Russia and China, and gives sole use of the technology to the federal government. That, too, poses serious problems for surveillance and oversight. Given the First and Fourth Amendment implications, Congress should not act in haste. Congress should start the long and difficult conversation about how best to contain AI’s excesses, while benefiting from its promise in human health and wealth creation. Congress should continue to hold hearings and investigate solutions. Meanwhile, the best guard against AI is a public that is already deeply skeptical of conventional information encountered online. As more Americans learn what a deepfake is, the less impact these images will have. 
We reported last week that the Biden Administration leaked the news that it is drafting an executive order to restrict “countries of concern” from acquiring Americans’ most sensitive and personal digital and DNA information.
At the top of the Administration’s concerns is the likely acquisition by the People’s Republic of China of a vast databank of Americans’ DNA from Chinese-owned companies that perform genetic testing for U.S. healthcare. Should we care? A glimpse of the dangers of such tracking can be seen in how China uses mass DNA mapping of whole populations to track and persecute religious minorities. At a recent conference in Washington, D.C., on surveillance of religious minorities in China, we heard evidence – well documented by many journalists – that China is using facial recognition (with racial filters), car sensors, cell-site simulators, and location tracking to systematically surveil that country’s Uighur Muslim and Tibetan Buddhist minorities. A recent Human Rights Watch white paper details how China’s authorities are systematically collecting DNA in Tibet. The cover story for one such effort, covering people aged 12 to 65, is that the government is performing a health-check program called Physicals For All – though patients are not allowed to learn the results of any of their tests. DNA testing in Tibet is, to paraphrase the Godfather, an offer that cannot be refused. With this data, the government can track people by ethnicity, and map their families by their genes and presumed beliefs. The most pernicious aspect of this program is the collection of children’s DNA, a unique identifier that will never change. Such genetic surveillance also necessarily connects a whole bloodline to one person suspected of religious dissidence – what Chinese police call “one household, one file.” Such files can be used to track people who lead worship services or advocate religious or secular views not approved by the government. 
This program of police-community relations is called “spreading information tentacles.” DNA is also used to identify (and presumably, from samples located at a given site or shrine, to track) clerics and lamas, village elders, and others who might be engaged in meetings or conducting unofficial mediation of local disputes. By combining this with electronic surveillance, authorities can detect forbidden material accessed by phones and other devices, and then turn to DNA mapping to break up social and religious organizations, keeping civil society atomized before the state. Thus, China’s DNA database is amplified by many forms of electronic surveillance, with artificial intelligence putting together patterns of association and blood relations for police. Researcher Adrian Zenz told the Bulletin of the Atomic Scientists that in 2017 alone, China spent almost $350 billion on internal security outlays. According to the Biden Administration, China is also spending money to purchase Americans’ data. This could include medical, financial, occupational, familial, and romantic profiles: genetic surveillance provides another tile in the mosaic of the American population for China. Thus, an American child sequenced for a medical test today could have his or her genetic health profile and identity known to the Chinese state for life. As the Office of the Director of National Intelligence warns: “The loss of your DNA not only affects you, but your relatives and, potentially, generations to come.” Wired reports that police in northern California asked Parabon NanoLabs to run a DNA sample from a cold case murder scene to identify the culprit. Police have often run DNA against the vast database of genealogical tests, cracking cold cases like the Golden State Killer, who murdered at least 13 people.
But what Parabon NanoLabs did for the police in this case was something entirely different. The company produced a 3D rendering of a “predicted face” based on the genetic instructions encoded in the sample’s DNA. The police then ran it against facial recognition software to look for a match. Scientists are skeptical that this is an effective tool given that Parabon’s methods have not been peer-reviewed. Even the company’s director of bioinformatics, Ellen Greytak, told Wired that such face predictions are closer in accuracy to a witness description than to an exact replica of a face. With the DNA being merely suggestive – Greytak jokes that “my phenotyping can tell you if your suspect has blue eyes, but my genealogist can tell you the guy’s address” – the potential for false positives is enormous. Police multiply that risk when they run a predicted face through vast databases with facial recognition technology (FRT) algorithms – technology that is itself far from perfect. Despite cautionary language from technology producers and instructions from police departments, many detectives persist in mistakenly believing that FRT returns definitive matches. In fact, it produces possible candidates arranged in order of a “similarity score.” FRT is also better with some types of faces than others. It is up to 100 times more likely to misidentify Asian and Black people than white men. The American Civil Liberties Union, in a thorough 35-page comment to the federal government on FRT, biometric technologies, and predictive algorithms, noted that defects in FRT are likely to multiply when police take a low-quality image and try to brighten it, or reduce pixelation, or otherwise enhance the image. We can only imagine the Frankenstein effect of mating a predicted face with FRT. As PPSA previously reported, rights are violated when police take a facial match not as a clue, but as evidence. 
This is what happened when Porcha Woodruff, a 32-year-old Black woman and nursing student in Detroit, was arrested on her doorstep while her children cried. Eight months pregnant, she was told by police that she had committed recent carjackings and robberies – even though the woman committing the crimes in the images was not visibly pregnant. Woodruff went into contractions while still in jail. In another case, local police executed a warrant by arresting a Georgia man at his home for a crime committed in Louisiana, even though the arrestee had never set foot in Louisiana. The only explanation for such arrests is sheer laziness, stupidity, or both on the part of the police. As the ACLU documents, facial recognition forms warn detectives that a match “should only be considered an investigative lead. Further investigation is needed to confirm a match through other investigative corroborated information and/or evidence. INVESTIGATIVE LEAD, NOT PROBABLE CAUSE TO MAKE AN ARREST.” In the arrests made in Detroit and Georgia, police had not performed any of the rudimentary investigative steps that would have immediately revealed that the person they were investigating was innocent. Carjacking and violent robberies are not typically undertaken by women on the verge of giving birth. The potential for replicating error in the courtroom would be multiplied by showing a predicted face to an eyewitness, which could easily influence the witness’s memory when presented with a line-up. We understand that an investigation might benefit from knowing that DNA reveals that a perp has blue eyes, allowing investigators to rule out all brown- and green-eyed suspects. But a predicted face should not be enough to search through a database of innocent people. In fact, any searches of facial recognition databases should require a warrant. 
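To see why a “similarity score” is only a lead and never a match, here is a minimal sketch of how a recognition system ranks candidates. Everything in it is hypothetical: the three-dimensional vectors stand in for the high-dimensional face embeddings a real neural network would produce, and the candidate names are invented. The point is that the output is an ordered list of possibilities with scores, not an identification.

```python
import math

def rank_candidates(probe, gallery, top_k=3):
    """Rank gallery faces by cosine similarity to a probe embedding.

    Returns an ordered list of (name, score) pairs -- possible
    candidates, i.e., investigative leads, never a confirmed match.
    """
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.hypot(*u) * math.hypot(*v))

    scored = [(name, round(cosine(probe, vec), 3)) for name, vec in gallery.items()]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)[:top_k]

# Toy, made-up embeddings; every name here is hypothetical.
gallery = {
    "candidate_a": (0.9, 0.1, 0.2),
    "candidate_b": (0.2, 0.8, 0.1),
    "candidate_c": (0.5, 0.5, 0.5),
}
print(rank_candidates((0.8, 0.2, 0.3), gallery))
```

Note that even the top-ranked candidate is just the gallery entry least unlike the probe. When the probe itself is a DNA-predicted face rather than a photograph, every score in the list inherits that extra layer of guesswork.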
As technology continues to push the boundaries, states need to develop clear procedural guidelines and warrant requirements that protect constituents’ constitutional rights. While Congress is locked in spirited debate over the limits of surveillance in America, large technology companies are responding to growing consumer concerns about privacy by reducing government’s warrantless access to data.
For years, police had a free hand in requesting from Google the location histories of groups of people in a given vicinity recorded on Google Maps. Last month, Google altered the Location History feature on Google Maps. For users who enable this feature to track where they’ve been, their location histories will now be saved on their smartphone or other devices, not on Google servers. As a result of this change, Google will be unable to respond to geofence warrants. “Your location information is personal,” Google announced. “We’re committed to keeping it safe, private and in your control.” This week, Amazon followed Google’s lead by disabling its Request for Assistance tool, a feature that facilitated requests from law enforcement to ask Ring camera owners to give up video of goings-on in the neighborhood. We reported three years ago that Amazon had cooperative agreements with more than 2,000 police and fire departments to solicit Ring videos for neighborhood surveillance from customers. By clicking off Request for Assistance, Amazon is now closing the channel for law enforcement to ask Ring customers to volunteer footage about their neighbors. PPSA commends Google and Amazon for taking these steps. But they wouldn’t have made these changes if consumers weren’t clamoring for a restoration of the expectation of privacy. These changes are a sure sign that the mounting complaints of civil liberties advocates are moving the needle of public opinion. Corporations are exquisitely attuned to consumer attitudes, and so they are listening and acting. In the wake of Thursday’s revelation that the National Security Agency is buying Americans’ location data, we urge Congress to show similar sensitivity. With polls showing that nearly four out of five Americans support strong surveillance reform, Congress should respond to public opinion by passing the Protect Liberty Act, which imposes a warrant requirement on all personal information purchased by government agencies. 
Late last year, Sen. Ron Wyden (D-OR) put a hold on the appointment of Lt. Gen. Timothy Haugh to replace outgoing National Security Agency director Gen. Paul Nakasone. Late Thursday, Sen. Wyden’s pressure campaign yielded a stark result – a frank admission from Gen. Nakasone that, as long suspected, the NSA purchases Americans’ sensitive, personal online activities from commercial data brokers.
The NSA admitted it buys netflow data, which records connections between computers and servers. Even without the revelation of messages’ contents, such tracking can be extremely personal. A Stanford University study of telephone metadata showed that a person’s calls and texts can reveal connections to sensitive life issues, from Alcoholics Anonymous to abortion clinics, gun stores, mental health and sexually transmitted disease clinics, and faith organizations. Gen. Nakasone’s letter to Sen. Wyden states that NSA works to minimize the collection of such information. He writes that NSA does not buy location information from phones inside the United States, or purchase the voluminous information collected by our increasingly data-hungry automobiles. It would be a mistake, however, to interpret NSA’s internal restrictions too broadly. While NSA is generally the source for signals intelligence for the other agencies, the FBI, IRS, and the Department of Homeland Security are known to make their own data purchases. In 2020, PPSA reported on the Pentagon purchasing data from Muslim dating and prayer apps. In 2021, Sen. Wyden revealed that the Defense Intelligence Agency was purchasing Americans’ location data from our smartphones without a warrant. How much data, and what kinds of data, are purchased by the FBI is not clear. Sen. Wyden did succeed in a hearing last March in prompting FBI Director Christopher Wray to admit that the FBI had, in some period in the recent past, purchased location data from Americans’ smartphones without a warrant. Despite a U.S. Supreme Court opinion, Carpenter v. United States (2018), which held that the U.S. Constitution requires a warrant for the government to compel telecom companies to turn over Americans’ location data, federal agencies maintain that the Carpenter standard does not curb their ability to purchase commercially available digital information. In a press statement, Sen. 
Wyden hammers home the point that a recent Federal Trade Commission order bans X-Mode Social, a data broker, and its successor company, from selling Americans’ location data to government contractors. Another data broker, InMarket Media, must notify customers before it can sell their precise location data to the government. We now have to ask: was Wednesday’s revelation that the Biden Administration is drafting rules to prevent the sale of Americans’ data to hostile foreign governments an attempt by the administration to partly get ahead of a breaking story? For Americans concerned about privacy, the stakes are high. “Geolocation data can reveal not just where a person lives and whom they spend time with but also, for example, which medical treatments they seek and where they worship,” FTC Chair Lina Khan said in a statement. “The FTC’s action against X-Mode makes clear that businesses do not have free license to market and sell Americans’ sensitive location data. By securing a first-ever ban on the use and sale of sensitive location data, the FTC is continuing its critical work to protect Americans from intrusive data brokers and unchecked corporate surveillance.” As Sen. Wyden’s persistent digging reveals more details about government data purchases, Members of Congress are finding all the more reason to pass the Protect Liberty Act, which enforces the Constitution’s Fourth Amendment warrant requirement when the government inspects Americans’ purchased data. This should also put Members of the Senate and House Intelligence Committees on the spot. They should explain to their colleagues and constituents why they’ve done nothing about government purchases of Americans’ data – and why their bills include exactly nothing to protect Americans’ privacy under the Fourth Amendment. More to come … Well, better late than never. Bloomberg reports that the Biden Administration is preparing new rules to direct the U.S. 
Attorney General and the Department of Homeland Security to restrict data transactions that sell our personal information – and even our DNA – to “countries of concern.”
Consider that much of the U.S. healthcare system relies on Chinese companies to sequence patients’ genomes. Under Chinese law, such companies are required to share their data with the government. The Office of the Director of National Intelligence warns that “Losing your DNA is not like losing a credit card. You can order a new credit card, but you cannot replace your DNA. The loss of your DNA not only affects you, but your relatives and, potentially, generations to come.” The order is also expected to crack down on data broker sales that could facilitate espionage or blackmail of key individuals serving in the federal government, that could be used to panic or distract key personnel in the event of a crisis, or that could deepen the impact of influence campaigns by collecting data on politicians, journalists, academics, and activists across the country. PPSA welcomes the development of this Biden rule. We note, however, that just like China, our own government routinely purchases Americans’ most sensitive and personal information from data brokers. These two issues – foreign access to commercially acquired data, and the access to this same information by the FBI, IRS, Department of Homeland Security, and other agencies – are related but need to be addressed separately, the latter in the legislative process. The administration’s position on data purchases is contradictory: it opposes closing the data-broker loophole in the United States. In the Section 702 debate, Biden officials say we would be at a disadvantage against China and other hostile countries that could still purchase Americans’ data. This new Biden Administration effort undercuts that argument. We should not emulate China’s surveillance practices any more than we practice their crackdowns against freedom of speech, religion, and other liberties. 
Still, this proposed rule against foreign data purchases is a step in the right direction, in itself and for highlighting the dire need for legislation to restrict the U.S. government’s purchase of its own citizens’ data. The Protect Liberty Act, which passed the House Judiciary Committee by an overwhelming 35-2 vote and would reauthorize Section 702, closes this loophole at home just as the Biden Administration seeks to close it abroad. So when the new Biden rule is promulgated, it should serve as a reminder to Congress that we have a problem with privacy at home as well. No sooner did the Protect Liberty and End Warrantless Surveillance Act pass the House Judiciary Committee with overwhelming bipartisan support than the intelligence community began to circulate what Winston Churchill in 1906 politely called “terminological inexactitudes.”
The Protect Liberty Act is a balanced bill that respects the needs of national security while adding a warrant requirement whenever a federal agency inspects the data or communications of an American, as required by the Fourth Amendment. This did not stop defenders of the intelligence community from claiming late last year that Section 702 reforms would harm the ability of the U.S. government to fight fentanyl. This is remarkable, given that the government hasn’t cited a single instance in which warrantless searches of Americans’ communications proved useful in combating the fentanyl trade. Nothing in the bill would stop surveillance of factories in China or cartels in Mexico. If an American does become a suspect in this trafficking, the government can and should seek a probable cause warrant, as is routinely done in domestic law enforcement cases. No sooner did we bat that one away than we heard about fresh terminological inexactitudes. Here are two of the latest bits of disinformation being circulated on Capitol Hill about the Protect Liberty Act. Intelligence Community Myth: Members of Congress are being told that under the Protect Liberty Act, the FBI would be forced to seek warrants from district court judges, who might or might not have security clearances, in order to perform U.S. person queries. Fact: The Protect Liberty Act allows the FBI to conduct U.S. person queries if it has either a warrant from a regular federal court or a probable cause order from the FISA Court, where judges have high-level security clearances. The FBI will determine which type of court order is appropriate in each case. Intelligence Community Myth: Members are being told that under the Protect Liberty Act, terrorists can insulate themselves from surveillance by including a U.S. person in a conversation or email thread. Fact: Under the Protect Liberty Act, the FBI can collect any and all communications of a foreign target, including their communications with U.S. persons. 
Nothing in the bill prevents an FBI agent from reviewing U.S. person information the agent encounters in the course of reviewing the foreign target’s communications. In other words, if an FBI agent is reading a foreign target’s emails and comes across an email to or from a U.S. person, the FBI agent does not need a warrant to read that email. The bill’s warrant requirement applies in one circumstance only: when an FBI agent runs a query designed to retrieve a U.S. person’s communications or other Fourth Amendment-protected information. That is as it should be under the U.S. Constitution. As we face the renewed debate over Section 702 – which must be reauthorized in the next few months – expect the parade of untruths to continue. As they do, PPSA will be here to call them out. National Rifle Association v. Vullo In this age of “corporate social responsibility,” can a government regulator mount a pressure campaign to persuade businesses to blacklist unpopular speakers and organizations? Would such pressure campaigns force banks, cloud storage companies, and other third parties that hold targeted organizations’ data to compromise their clients’ Fourth as well as their First Amendment rights?
These are just some of the questions PPSA is asking the U.S. Supreme Court to weigh in National Rifle Association v. Vullo. Here's the background on this case: Maria Vullo, then-superintendent of the New York Department of Financial Services, used her regulatory clout over banks and insurance companies in New York to strongarm them into denying financial services to the National Rifle Association. This campaign was waged under an earnest-sounding directive to consider the “reputational risk” of doing business with the NRA and firearms manufacturers. Vullo imposed consent orders on three insurers that they never again provide policies to the NRA. She issued guidance that encouraged financial services firms to “sever ties” with the NRA and to “continue evaluating and managing their risks, including reputational risks” that could arise from their dealings with the NRA or similar gun promotion organizations. “When a regulator known to slap multi-million-dollar fines on companies issues ‘guidance,’ it is not taken as a suggestion,” said Gene Schaerr, PPSA general counsel. “It sounds more like, ‘Nice store you’ve got here, it’d be a shame if anything happened to it.’” The U.S. Court of Appeals for the Second Circuit reversed a lower court’s decision that found that Vullo used threats to force the companies she regulates to cut ties with the NRA. The Second Circuit reasoned that: “The general backlash against gun promotion groups and businesses … could (and likely does) directly affect the New York financial markets; as research shows, a business's response to social issues can directly affect its financial stability in this age of enhanced corporate social responsibility.” You don’t have to be an enthusiast of the National Rifle Association to see the problems with the Second Circuit’s reasoning. Aren’t executives of New York’s financial services firms better qualified to determine what does and doesn’t “directly affect financial stability” than a regulator in Albany? 
How aggressive will government become in using its almost unlimited access to buy or subpoena data of a target organization to get its way? We told the Court: “Even the stability of a single company is not enough; the government cannot override the Bill of Rights to slightly reduce the rate of corporate bankruptcies.” In our brief, PPSA informs the U.S. Supreme Court about the dangers of a nebulous, government-imposed “corporate social responsibility standard.” We write: “Using CSR – a controversial theory positing that taking popular or ‘socially responsible’ stances may increase corporate profits – to justify infringement of First Amendment rights poses a grave threat to all Constitutionally-protected individual rights.” PPSA is reminding the Court that the right to free speech and the right to be protected from government surveillance are intertwined. “Once again, the House has passed the Protect Reporters from Exploitative State Spying (PRESS) Act with unanimous, bipartisan support. Forty-nine states have press shield laws protecting journalists and their sources from the prying eyes of prosecutors. The federal government does not. From Fox News to The New York Times, government has surveilled journalists in order to catch their sources. Journalists have been held in contempt and even jailed for bravely safeguarding the trust of their sources.
“The PRESS Act corrects this by granting a privilege to protect confidential news sources in federal legal proceedings, while offering reasonable exceptions for extreme situations. Such laws work well for the states and would safeguard Americans’ right to evaluate claims of secret wrongdoing for themselves. “Great credit goes to Rep. Kevin Kiley and Rep. Jamie Raskin for lining up bipartisan support for this reaffirmation of the First Amendment. As in 2022, the last time the House passed this act, the duty now shifts to the U.S. Senate to respond to this display of unanimous, bipartisan support. I am optimistic. At a time of gridlock, enacting this bill into law would be a positive message that would reflect well on every Senator.”
CVS, Kroger, and Rite Aid Hand Over Americans’ Prescription Records to Police Upon Request
1/17/2024
Three of the largest pharmacy chains – CVS Health, Kroger, and Rite Aid – routinely hand over the prescription and medical records of Americans to police and government agencies upon request, no warrant required.
“Americans’ prescription records are among the most private information the government can obtain about a person,” Sen. Ron Wyden (D-OR) and Reps. Pramila Jayapal (D-WA) and Sara Jacobs (D-CA) wrote in a letter to HHS Secretary Xavier Becerra revealing the results of a congressional investigation into this practice. “They can reveal extremely personal and sensitive details about a person’s life, including prescriptions for birth control, depression or anxiety medications, or other private medical conditions.”

The Washington Post reports that because the chains often share records across all locations, a pharmacy in one state can access a person’s medical history from states with more restrictive laws. Five pharmacies – Amazon, Cigna, Optum Rx, Walmart, and Walgreens Boots Alliance – require demands for pharmacy records by law enforcement to be reviewed by legal professionals. One of them, Amazon, informs consumers of the request unless hit with a gag order. All the major pharmacies will release customer records, however, if they are merely given a subpoena issued by a government agency rather than a warrant issued by a judge. This could be changed by corporate policy. Sen. Wyden and Reps. Jayapal and Jacobs urge pharmacies to insist on a warrant rather than comply with a mere request or subpoena.

Most Americans are familiar with the strict privacy provisions of the Health Insurance Portability and Accountability Act (HIPAA) from filling out forms in the doctor’s office. Most will surely be surprised to learn that HIPAA, as strict as it is for physicians and hospitals, leaves these records wide open to warrantless inspection by the government. This privacy vulnerability is just one more example of the generous access government agencies have to almost all of our information. Intelligence and law enforcement agencies can know just about everything about us through purchases of our most sensitive and personal information, reaped by our apps and sold to the government by data brokers.
As privacy champions in Congress press HHS to revise its HIPAA regulations to protect Americans’ medical data from warrantless inspection, Congress should also close all the loopholes by passing the Protect Liberty and End Warrantless Surveillance Act.

The Federal Reserve Board is publicly weighing whether to ask Congress to allow it to establish a Central Bank Digital Currency (CBDC), replacing paper dollars with government-issued electrons.
Given the growth of computing, a digital national currency may seem inevitable. But it would be a risky proposition from the standpoint of cybersecurity, national security, and unintended consequences for the economy. A CBDC would certainly pose a significant threat to Americans’ privacy.

A factsheet on the Federal Reserve website says, “Any CBDC would need to strike an appropriate balance between safeguarding the privacy rights of consumers and affording the transparency necessary to deter criminal activity.” The Fed imagines that such a scheme would rely on private-sector intermediaries to create digital wallets and protect consumers’ privacy. Given the hunger that officialdom in Washington, D.C., has shown for pulling in all our financial information – including a serious proposal to record transactions from bank accounts, digital wallets, and apps – the Fed’s balancing of our privacy against surveillance of the currency is troubling.

With digital money, the government would have in its hands the ability to surveil all transactions, tracing every dollar from recipient to spender. Armed with such power, the government could debank any number of disfavored groups or individuals. If this sounds implausible, consider that debanking was exactly the strategy the Canadian government used against the trucker protestors two years ago.

Enter H.R. 1122 – the CBDC Anti-Surveillance State Act – which sets down requirements for a digital currency. This bill would prohibit the Federal Reserve from using a CBDC to implement monetary policy. It would require the Fed to report the results of any study or pilot program to Congress on a quarterly basis and to consult with the brain trust of the Fed’s regional banks. Though this bill prevents the Fed from issuing CBDC accounts to individuals directly, there is a potential loophole: the Fed might still maintain CBDC accounts for corporations (the “intermediaries” the Fed refers to).
The sponsors may want to close that loophole. That’s a quibble, however. This bill, sponsored by Rep. Tom Emmer (R-MN), Majority Whip of the House, with almost 80 co-sponsors, is a needed warning to the Fed and to surveillance hawks that a financial surveillance state is unacceptable.

The American Civil Liberties Union, its Northern California chapter, and the Brennan Center are calling on the Federal Trade Commission to investigate whether Meta and X have broken commitments they made to protect customers from data brokers and government surveillance.
This concern goes back to 2016, when it came to light that Facebook and Twitter had helped police target Black Lives Matter activists. As a result of protests by the ACLU of Northern California and other advocacy groups, both companies promised to strengthen their anti-surveillance policies and cut off access to social media surveillance companies. Their privacy promises even became points of pride in these companies’ advertising.

Now the ACLU and the Brennan Center say they have uncovered commercial documents from data brokers that seem to contradict these promises. They point to a host of data companies that publicly claim to have access to data from Meta and/or X, selling customers’ information to police and other government agencies. The ACLU writes: “These materials suggest that law enforcement agencies are getting deep access to social media companies’ stores of data about people as they go about their daily lives.”

While this case emerged from left-leaning organizations and concerns, organizations and people on the right have just as much reason for concern. The posts we make, what we say, and who our friends are can be very sensitive and personal information. “Something’s not right,” the ACLU writes. “If these companies can really do all that they advertise, the FTC needs to figure out how.”

At this point, we simply don’t know with certainty which, if any, social media platforms are permitting data brokers to obtain personal information from their platforms – information that can then be sold to the government. Regardless of the answer to that question, PPSA suggests that a thorough way to short-circuit any extraction of Americans’ most sensitive and personal information from data sales (at least at the federal level) would be to pass the strongly bipartisan Protect Liberty and End Warrantless Surveillance Act. This measure would force federal government agencies to obtain a warrant – as they should anyway under the Fourth Amendment – to access the data of an American citizen.