Columbia’s Knight Institute Goes to Court to Find Out

As we’ve noted, a veritable gaggle of organizations (including a service called Gaggle) are helping schools monitor student activity on district-issued devices – tracking every website and every keystroke (and potentially snapping pictures of students’ private lives). These arrangements lack transparency. Parents are only told it’s necessary to ensure “public safety” or some version of “safeguarding student mental health.” In the meantime, school districts and taxpayers are shelling out millions to the ed tech industry. And all that collected data? Surveillance companies like GoGuardian and Gaggle have signed a Student Privacy Pledge promising not to sell students’ personally identifying information. But despite pledges from school districts and tech companies, more clarity is needed about who can access students’ information and why.

This inscrutable practice of student monitoring is about to get a little more attention – in the form of a lawsuit aimed at unearthing the facts. Attorney Jennifer Jones of the Knight First Amendment Institute describes the student surveillance industry in detail and makes the legal case against it in a piece for Teen Vogue. The Knight Institute’s lawsuit isn’t the first of its kind, but its timing amid the cultural chaos of artificial intelligence suggests it could be a tipping point for transparency. This lawsuit is also not about specific privacy violations alleged by individuals, so it won’t be settled for damages as some previous cases have been.

On paper, student surveillance systems sound great: The monitoring is designed to prevent self-harm, cyberbullying, and violence. And yet, as Jones points out, the standard list of keywords and websites the software flags can be customized – making it capable of going far beyond universal safety concerns to serve the political or cultural agenda du jour. What happens if a student tries to access a banned book, for example? Should that be reported? This is all just one search word away from a dystopian episode of the Twilight Zone.

As has been reported from multiple quarters, there is scant and merely anecdotal evidence that any of these systems accomplish what they purport to – but plenty of evidence of misfires. Moreover, the law on which this burgeoning surveillance apparatus is based, the Children’s Internet Protection Act of 2000, requires no measures beyond basic obscenity filters. The ed tech industry has pulled a bait and switch on well-intentioned school administrators who are desperate to solve some of the most heartbreaking problems of our time. It would be nice if AI-powered surveillance were the quick fix, but it’s not. It is a blunt-force instrument with chilling implications up and down the Bill of Rights. We don’t need to normalize an educational-corporate-juridical surveillance state.

The answers to the problems of school violence and self-harm are not easy, and they won’t be found in technology alone. They must be sought through connection and relationships: Talking, not stalking. So it’s time for a reckoning, and a conversation that brings all of us to the table. We hope the Knight First Amendment Institute’s lawsuit makes that candid and open conversation happen. Here’s some suggested further reading:

Superman Isn’t the Only One with X-Ray Vision: Apparently, Your Wi-Fi Can See Through Walls Too
4/11/2025
We are reminded of a Vice story in 2023 that should have received much more attention than it did. Then again, it can be challenging to keep up when new threats to privacy seem to emerge daily. That story, with a very recent twist, is this: It turns out that Wi-Fi is capable of sensing human presence – potentially even pinpointing location, determining posture and position, and tracking movement. Unsurprisingly, the underlying technology comes courtesy of Facebook’s AI team, which originally relied on optical methods like cameras. Now, Carnegie Mellon researchers have realized that Wi-Fi is the perfect vehicle for solving the “limitations” of the original optical approach – limitations such as not being able to see people in the dark or behind furniture.

And that’s not creepy at all, is it? Call us old-fashioned, but we just have the feeling that terrible things could come from being able to spy on people in the dark. And we unequivocally declare that what Carnegie Mellon has “accomplished” won’t remain rudimentary for long. Believe us when we say technologists will figure this out, and in very short order. Because, wouldn’t you know it, Carnegie Mellon’s iteration is just the latest in a long line of “Wi-Fi sensing” advancements. According to MIT, it’s a broad field with the potential to “usher in new forms of monitoring” – in effect, they predict, the future of motion detection technology.

Only that future is now. Verizon, Origin Wireless, Cognitive Systems Corporation, AXIS, and Infineon all have services on offer that use some form of Wi-Fi sensing. The goals – “health metrics,” “elder safety,” “home security” – sound commendable, as is always the case with privacy incursions and surveillance overreach. The commercial and social justifications are also appealing, even compelling. But what happens when someone with less-than-wholesome intentions gains access? To say that we need robust guardrails around such technology is the epitome of understatement. And the time to build those guardrails for private and public use of this technology? Probably 2023.

Congratulations to Director of National Intelligence Tulsi Gabbard for launching a serious effort at intelligence community (IC) reform. On Tuesday, Director Gabbard announced a “Task Force to Restore Trust in the Intelligence Community and End Weaponization of Government Against Americans.” Rather than saddle Washington with an unwieldy new acronym, TFRTICEWGAA, this task force will be known as the Director’s Initiatives Group (DIG). “I established the Director’s Initiative Group to bring about transparency and accountability across the IC,” Director Gabbard said in a statement. She lists many DIG priorities that are familiar hobby horses of this administration, though they are admittedly responses to deep and serious abuses – from official and secret government censorship during the Biden administration to weaponization of government for political purposes.

What we find most intriguing about DIG is its charge to engage in mass declassification. We’ve long called out the absurd lengths the federal government goes to stamp “classified” on even the most innocuous documents, often in conflict with executive orders to declassify. In this new effort we see enormous potential for DIG to inform Congress and the American people of key facts regarding oversight of intelligence community programs. A few are:
For years, PPSA has used FOIA and legal action to try to force the government to reveal how often it has “unmasked” – or internally revealed the identity of – Members of Congress whose communications get picked up in surveillance. We also want to know whether the agencies are using these surveillance authorities, whether Section 702 or purchased data, to surveil Members of Congress on the House and Senate Judiciary and Intelligence Committees – those with specific oversight of the intelligence community. Director Gabbard has undertaken a strong and necessary corrective within the intelligence community – and one from the top, no less. Despite her position, she will no doubt encounter resistance and obfuscation along the way. But if she presses forward, Director Gabbard can reinforce the power of Congress to create guardrails and constitutional protections for programs that operate in near darkness.

On a summer day in 1915, a commercial attaché for the German embassy fell asleep on a train, only to awaken with a jolt to realize he was at his stop. In his haste to depart, the diplomat left behind his briefcase – stuffed with the details of Germany’s clandestine spy ring against the United States and its plans to cross America’s northern border to wage mayhem against British Canada. An American agent tailing the German made the correct decision to grab the case rather than continue to follow his target.

This is just one of the engrossing stories in The Triumph of Fear, a new book by Patrick Eddington, senior fellow in homeland security and civil liberties at the Cato Institute. Eddington traces the trajectory of rising government surveillance from the Spanish-American War under William McKinley to the Cold War under Dwight D. Eisenhower. Along the way, Eddington details how the interception of telegrams and the passage of the 1917 Espionage Act set the legal and institutional basis for today’s surveillance state and the government’s digital spying on the American people.

The contents of the diplomat’s briefcase proved the Woodrow Wilson administration was right to be paranoid about Germany’s intentions. But almost all of Germany’s covert actions were conducted by German agents and nationals, not by sympathetic Americans. This did not keep President Wilson from tarring as traitors Americans who objected to U.S. entry into the war or who opposed the draft by peaceful, political means. President Wilson said: “There are citizens of the United States, I blush to admit, born under other flags but welcomed under our generous naturalization laws to the full freedom and opportunity of America, who have poured the poison of disloyalty into the very arteries of our national life … to debase our politics to the uses of foreign intrigue.”

In service of this all-out war on dissent, Wilson secured passage of the Espionage Act, which continues to give the government broad powers to prosecute Americans perceived as helping hostile powers. One official warned “postal employees to be on the lookout for material that might ‘embarrass or hamper the government.’” Before long, the government had created an informal, national network of snitches – a milder version of the later East German Stasi apparatus. The precursor of the FBI, the Bureau of Investigation, had been reading telegrams from Western Union and major communications providers since the Spanish-American War. Few Americans of stature objected. One of them, Sen. 
Robert La Follette, the progressive Republican from Wisconsin, warned Americans that “private residences are being invaded, loyal citizens of undoubted integrity and probity arrested, cross-examined, and the most sacred constitutional rights guaranteed to every American citizen are being violated.” The courts were no bulwark against this trashing of the Constitution. The U.S. Supreme Court upheld the conviction of Charles Schenck for mailing leaflets to draft-age men asking them to take political action to oppose the draft. Before long, the government felt free to deport the anarchist Emma Goldman and to put Eugene Debs, the socialist candidate for president, in prison for opposing the war. Eddington writes: “The American national security state, created in peace and vastly expanded during war, would now become a permanent feature of national life, complete with enduring, draconian national security laws.”

If you want to know how we got here, The Triumph of Fear is an entertaining read and an essential one. It also holds up a mirror to the current state of surveillance and speech. Today, as in the Wilson era, we are challenged to separate explicit calls for violence from controversial speech. Today, as then, the government warrantlessly inspects Americans’ movements, associations, and statements – but with infinitely more precision and more data than could be reaped just by reading telegrams.

“This is about as far from the Founders’ vision of the Fourth Amendment as one can imagine”

House Members asked leading civil liberties experts to testify this morning on the “continued pattern of government surveillance of American citizens.” Gene Schaerr, PPSA general counsel, testified before the Subcommittee on Crime and Federal Government Surveillance, setting out the dimensions of the federal government’s spying on Americans. He also expressed optimism that Congress can rein in these practices. Here’s an excerpt from his written statement:

“We have seen under administrations of both parties the expansion of myriad forms of privacy-destroying technologies and practices – elements of an emerging American surveillance state being knitted together before our eyes.

“Like the proverbial frog unaware that it is slowly being boiled alive, Americans are being progressively trapped in a system of national surveillance. This is not happening because federal agencies are run by tyrants. The men and women in the intelligence community are passionate about their mission to protect the American people and our homeland. But in their zeal to execute their important mission, they are rapidly creating the elements of a pervasive American surveillance state. And astonishingly fast changes in technology are helping build this surveillance state before our laws can catch up to keep it within the constraints of our Constitution.

“At airports, at malls, on the streets, we are identified and tracked by our faces. Cell-site simulators in geofenced areas ping our phones to follow our movements. Our automobiles keep a record of every place we drive. Our digital devices at international terminals are subject to having all their contents downloaded and inspected without a warrant. Moreover, thanks to purchases of Americans’ digital information from data brokers, federal agencies ranging from the FBI to the IRS, the Department of Homeland Security, and the Department of Defense routinely access, without a warrant, digital information far more personal than what can be gathered by hand or found in a diary. 
To top it off, we also face the routine collection of Americans’ communications ‘incidentally’ caught up in the global data trawl of programs authorized by Section 702, and in the past few years alone the FBI has conducted hundreds of thousands of warrantless searches of the Section 702 database specifically looking for Americans’ communications.

“The end result is that the government is now able to collect and search through vast amounts of Americans’ communications and other personal data with ineffective statutory limits and limited congressional oversight. The personal data thus obtained reveals much about our health, mental health, and personal relations. Worse, all this data generated from myriad sources can then be woven together by the instant power of artificial intelligence to comprehensively track where we go, who we meet with, what we say or share in private, and what we believe. As a result, federal agencies are capable of generating comprehensive political, religious, romantic, health, and personal dossiers on every American from information gathered without a warrant.

“This is about as far from the Founders’ vision of the Fourth Amendment as one can imagine. Revulsion at government surveillance runs deep in our DNA as a nation; indeed, it was one of the main factors that led to our revolt against British rule and, later, to our Bill of Rights. Agents of the Crown could break into a warehouse or a home to inspect bills of lading or a secret political document, but they couldn’t access anything close to the wealth of private information contained in our digital lives today.

“Month by month, it is harder to square this emerging surveillance state with the ‘consent of the governed’ concept articulated in the Declaration of Independence and embodied in Article I of the Constitution. The Founders believed that American citizens should not be subject to surveillance by their own government without their consent – in the form of a statute duly enacted by their representatives in Congress. They should not be subject to surveillance at the whim of any executive official, none of whom has authority to consent to surveillance on their behalf …

“In the face of a surveillance state growing at breakneck speed, this Committee has shown leadership and a sense of urgency that matches the moment. We don’t have to supinely accept the erosion of all privacy. We don’t have to trust that government agents and future administrations will always use these awesome powers solely for national security. These technologies simply offer too much power to trust that future guardians will not be tempted to misuse them, as they have done in the past.

“In short, you have shown that you can protect both the constitutional rights of your constituents and also keep them safe from foreign and domestic threats. I urge you to uphold the Constitution by once again advancing – and persuading your fellow Members to adopt – a warrant requirement for both government-purchased data and data collected under Section 702.”

You can read Gene Schaerr’s full testimony here, and watch the full hearing here.

PPSA General Counsel Set to Testify Before Congress on Alarming Government Surveillance Practices
4/7/2025
Our General Counsel, Gene Schaerr, will testify before the House tomorrow, where he will address the problems with domestic surveillance and present the workable solutions that we at PPSA, along with our allies, are fighting for.

WASHINGTON, D.C. – The House Judiciary Subcommittee on Crime and Federal Government Surveillance will hold a hearing on Tuesday, April 8, 2025, at 10:00 a.m. ET. The hearing, “A Continued Pattern of Government Surveillance of U.S. Citizens,” will examine the government’s abuse of its surveillance authorities, including the Foreign Intelligence Surveillance Act, the government’s purchasing of data, and new and emerging technologies like facial recognition. The hearing will also discuss past legislative efforts to protect Americans’ civil liberties and constitutional right to privacy under the Fourth Amendment and identify additional potential legislative solutions. WITNESSES:
Jeremy Bentham, the Enlightenment-era philosopher of utilitarianism, sketched out the concept of a Panopticon – a prison designed to keep inmates under constant inspection by guards. What are the psychological consequences of knowing that one is being watched constantly? Last year we noted a SciTechDaily report on an Australian study revealing that people who know they are being surveilled become hyperaware of faces, recognizing others faster than a control group. They become a little jumpy, always on the lookout to categorize someone as benign or a potential threat. And those results came from merely knowing that one is being surveilled by a camera.

What happens to the mental health and social life of people who are being watched not only by gear and gadgets, but also by government agents tailing them everywhere? Imagine putting out the garbage, going to a store, or picking up the kids from school only to see a familiar stranger across the street watching you. This is the fate of “defector families” in North Korea. When someone defects from North Korea, the government punishes the defector’s relatives by subjecting them to persistent, relentless surveillance. NK News profiles one such family, who went through elaborate procedures to obtain internal travel documents to attend a family wedding. They turned back when they realized that their wedding party would also include a full complement of government agents tailing them and recording their every utterance and move. “They went home to avoid making their relatives uncomfortable or causing problems on such an important day,” NK News reported.

A source told an NK News reporter: “These people live in an invisible prison, constantly anxious because everything they do is being watched. This surveillance and pressure cause severe psychological pain. One defector’s family described their difficulties, saying they must live their entire lives feeling like criminals from the moment they’re branded as having a defector relative. They gradually began avoiding people because having every breath, meal, and word monitored and reported became unbearable.”

The United States is not North Korea. But we should not kid ourselves that the mounting surveillance of Americans – by facial recognition, by the tracking of our phones and cars, by the purchasing of our personal data – is free of psychological cost.

Is That What the Supposed Terror-Watch Program Is Really Being Used For?

If this were a political thriller, “Quiet Skies” might be Russia’s clandestine government surveillance program being used to eliminate enemies of the state by poisoning their tea with polonium every time they take a flight. In reality, “Quiet Skies” is the Transportation Security Administration’s secret spying program for the Air Marshal Service. First outed by the Boston Globe in 2018, Quiet Skies singles out potentially dangerous flyers for close attention and inspection (“enhanced observation”). Enhanced observation is a 45-minute process that squeezes every inch of clothing, inspects the lining of suitcases, and requires a live review of every electronic device (meaning take it out, turn it on, and hand it over). Two bomb-sniffing canine teams and a plainclothes TSA supervisor may also be involved and, in the sky, up to three Air Marshals are tasked with watching these suspected passengers’ every move. “SSSS” is TSA’s boarding-pass designation for this treatment, which suggests that no focus groups or historians were consulted beforehand. 
Such inspections are in many cases undoubtedly necessary to track bad actors intent on doing harm to the United States. As people who fly often with our family members, we are glad the government is on the lookout for the next potential shoe-bomber. Whistleblowers have indicated, however, that the program is also being abused as a means of targeting political opponents rather than serving as a $400 million anti-terrorist safety net. Just ask Tulsi Gabbard, who was targeted in 2024 after returning from Rome with her husband. By then, of course, the Iraq War veteran and former Democratic representative had become the Biden Administration’s persona non grata du jour after she endorsed and campaigned for Donald Trump. With Gabbard now the Director of National Intelligence, we hope that Rep. Tim Burchett’s (R-TN) request for answers as to why she was targeted will finally see the light of day. Was she simply unlucky in being randomly chosen for this treatment – something that has happened to one of us? If politics is involved in any way, that would be a very serious misuse of security policy. You don’t have to be a fan of Director Gabbard to see how such an authority could be misused by any administration in any direction. Employing such tools to surveil political opponents is how republics fall.

An administrative subpoena is a contradiction in terms – a compulsory government demand for records issued without a judge. It is a tool that bypasses the judiciary, sidestepping the Fourth Amendment’s core protection of neutral oversight. PPSA filed a Freedom of Information Act request with the Department of Justice’s Office of the Inspector General (OIG) seeking clarity on how administrative subpoenas are used – specifically, whether they require probable cause and whether any have ever been denied for lacking it. On those points, the OIG said it had no records. But it did release one document – a 15-page internal manual that shows how investigators issue subpoenas, often without court involvement and sometimes without notifying the target.

The manual makes one thing clear: subpoena power isn’t just held by top DOJ officials. It’s been pushed down the chain to the very investigators working the cases. FBI Special Agents in Charge (SACs) in field offices can issue subpoenas on their own authority – no judge, no internal check, no outside approval. Unless the target is someone “sensitive,” such as a journalist, judge, or senior government official, nobody else has to sign off. These SAC-issued subpoenas can grab a lot, including names, addresses, phone logs, session times, and payment details from phone and internet providers. They can also pull records from hotels, rental car agencies, utility companies, and more. If financial records are involved, agents can delay telling the customer for up to 90 days. But in many cases, the manual doesn’t require telling the person at all. The government often collects this data quietly, without the target ever knowing.

And the courts? They only show up if someone refuses to comply. At that point, the OIG might ask a judge to enforce the subpoena. But that’s the exception. Most subpoenas never see a courtroom. The OIG has no records showing that it applies any standard, like probable cause, before issuing them. And its manual doesn’t lay out a clear evidentiary threshold. That means there’s no neutral party reviewing the request, and no formal limit on how broad or invasive it can be. This might be legal under current statutes, but it doesn’t square with the U.S. 
Constitution. The Fourth Amendment is meant to protect us from unreasonable government demands for our private information. That protection means more than just saying “no” to searches. It means requiring the government to justify its snooping before it happens. When agents can issue their own subpoenas without a judge’s okay, and collect sensitive personal data without notice, those safeguards vanish. And when the data involved reveals what people believe, where they go, and who they talk to, it’s not just a privacy issue. It’s a First Amendment problem, too.

No government investigator should have the power to demand private records without meaningful guardrails. If the government wants access to your private records, it should meet clear standards and operate under real oversight. When free speech, a free press, or freedom of association are on the line, the protections should be even stronger. PPSA urges Congress to put limits on administrative subpoenas before they quietly erode the rights our nation’s founders set out to protect.

Would You Like a Side of Malware with Your PDF Conversion?

The threat landscape is growing again. This time, reports Forbes contributor Zak Doffman, the FBI is warning Americans about online utility sites, especially those offering free document converters or tools for downloading audio and video files (MP3, MP4, etc.). Basically, if it’s “online” and “free” and purports to do something you really need done – just say no. It’s not worth it. Yes, these sites “work,” in that you may well get your converted file or the downloading program you need. But you’re also likely to get your sensitive information stolen and malware or ransomware installed on your device. And while there are legitimate utility sites out there, the scam sites will try to mimic their URLs. So unless you know the site from prior experience and can trust it, or unless it has been vetted by your tech team or the cyber gurus in your life, don’t engage. Better yet, don’t enter “free online document converter” in your search bar in the first place. It’s worth investing in official tools for all such tasks, because not having your information stolen or your computer invaded is a bargain at any price.

As facial recognition and biometric scanning systems expand to 400 U.S. airports, Sen. Jeff Merkley (D-OR) is asking if this could be the beginning of a U.S. surveillance state. In a video interview with Philip Wegman of RealClearPolitics, Sen. Merkley said: “I’m concerned about the way facial recognition is used to encroach upon freedom and privacy around the world. We see China enslaving a million Uyghurs, and a tool they use is facial recognition software. It’s so inexpensive and pervasive; if you put that power in the hands of a government, you can’t know where it’s going to go.

“This is not the kind of tool you want to give to the government in a free country. You would never know you have the ability to opt out at any airport where they’re doing this program.”

The Corporate Transparency Act (CTA) Gets Reined In

The Corporate Transparency Act’s plan to surveil 32 million American small businesses has been stopped cold. On March 26, the Treasury Department published an interim final rule that removes the onerous beneficial ownership reporting requirement. From now on, only foreign entities are required to report or update the personal information of anyone who owns 25 percent or more of a given business. 
There are good ways to track the money networks of terrorists, drug dealers, and other criminals. But asking hard-working American small business owners to spend hours and money reporting information that reveals nothing of the kind was an idea whose time will deservedly never come. We still look forward to the day when the “Repealing Big Brother Overreach Act” can be signed into law and the Corporate Transparency Act is dismantled in toto. No one expects “foreign reporting companies” to be transparent about which criminals might happen to own their businesses anyway. In the meantime, Treasury’s Financial Crimes Enforcement Network needs to find more realistic ways to safeguard the financial system from illicit activity – or at least be honest about its intent to extend surveillance over Americans’ financial transactions under the guise of flawed legislation like the CTA.

FBI PSA: The Safe Bet Is to Assume It’s Fake

Remember when the only person you worried might fall prey to scammers was your favorite aunt, who had only her Welsh Corgi at home with her during the day? “Now, Trixie,” you’d say, “don’t agree to anything and always call me first.” Those days are over. Forget your late aunt Trixie. Worry about yourself. Imagine receiving a phone call from a close friend, family member, even your spouse that was actually an utterly convincing AI-generated version of that person’s voice – urgently begging you to provide a credit card number to spring her out of a filthy jail in Veracruz or to pay an emergency room hospital bill.

The age of AI augurs many things, we are told. But while we’re waiting for flying taxis and the end of mundane tasks, get ready to question the veracity of every form of media you encounter, be it text, image, audio, or video. In what is sure to be the first of many such public service announcements, the FBI is warning that the era of AI-powered fraud hasn’t just dawned; it is fully upon us. The theme of the FBI’s announcement is “believability.” It used to be that scams were easy to spot – the writing was laughably bad, or the video and audio were noticeably “off” or even a little creepy, a phenomenon known as the Uncanny Valley effect. The newfound power of generative AI to produce realistic versions of traditional media has put an end to such reliable tells. Anyone who thinks they’re immune to such trickery misunderstands the nature of generative AI. Consider:
Whenever a friend or family member sends a video that clearly shows them in need of help (stranded on vacation or having had their wallet stolen at a nightclub, perhaps), don’t automatically assume it’s real, no matter how convincing it looks. And thanks to generative AI’s “vocal cloning” ability, a straight-up phone call is even easier to fake. So, what can we do? The FBI advises: Agree to a secret password, phrase, or story that only you and your family members know. Do the same with your friend groups. Then stick to your guns. No matter how close your heartstrings come to breaking, if the caller doesn’t know the secret answer, it’s a scam-in-waiting. The FBI also recommends limiting “online content of your image or voice” and making social media accounts private. Fraudsters scrape the online world for these artifacts to produce their deepfake masterpieces. All generative AI needs to create a convincing representation of you is a few seconds of audio or video and a handful of images.

Rest in peace, Aunt Trixie. We miss her and the good old days, when all we had to do was warn her not to give her personal information to a caller who said he was from the Corgi Rescue Fund. Today, an AI scamster could have Aunt Trixie call you from the grave – needing money, of course.

Your genetic blueprint is your most unique identifier, packed with deeply personal information. How might it be used? Your DNA could be subpoenaed by law enforcement to connect you to an investigation. It could be used to predict your predisposition to a disease, prompting an insurance company to raise your premiums. It can also compromise the privacy of your children and other relatives up, down, and across your family tree. Seven million 23andMe customers learned this the hard way in 2023, when hackers gained access to their family trees, birth years, and geographic locations. If you’ve ever sent in a saliva sample for a 23andMe genetic profile, you should seriously consider having it and your data destroyed NOW. This is because 23andMe is going into voluntary Chapter 11 restructuring and could be sold – and with it, all of the supremely private information the company holds about you. Here are instructions from California Attorney General Rob Bonta on how to destroy your sample and delete your genetic data with 23andMe. Other DNA home-testing sites also offer delete functions in their account settings.

One day in 2010, Blake Robbins, 15, a high school sophomore, was relaxing in his bedroom popping Mike and Ike candy – “fruity, chewy candy … bursting with five fun flavors.” He was soon called to the principal’s office at Harriton High School, in a community west of Philadelphia. Blake was accused of selling drugs. Blake, along with 2,000 other students, had received a laptop computer from the school district that he was allowed to take home with him. What parents were not told was that the laptops’ cameras would activate and transmit an image every 15 minutes – capturing teenagers in their bedrooms, along with any family members who happened to cross the path of the ever-watchful eye. Keron Williams, an African-American honors student, says images were used to profile him and to promote a false accusation that he had been stealing. In all, it is alleged that 56,000 webcam images of students and their families were captured through the school-issued laptops. Keep an eye out for more on this story in Spy High, a documentary produced by Mark Wahlberg that will stream on Amazon April 8. (Check out the Spy High trailer on People.com.) 
You might dismiss this as an old story – one that was well reported in the local media and adjudicated in the courts. The Robbins family received a $610,000 settlement from the school district. But this story remains startlingly relevant, in two ways.

First, the incidents behind Spy High were not outliers but omens of things to come. As we reported last year, Gaggle safety software is reviewing student messages and flagging issues of concern. In one Kansas high school, students in an art class were called in to defend the contents of their portfolios. Software had flagged digital files of their art for “nudity.” A report compiled by the Center for Democracy & Technology found that over 88 percent of schools use some form of student device monitoring, 33 percent use facial recognition, and 38 percent share student data with law enforcement.

Second, this story is relevant because it warns us that there are wide swaths of American officialdom that are either dismissive or blithely unaware of the Fourth Amendment and its warrant requirement. To be fair, there are plenty of dysfunctions and dangers in the modern American high school that administrators need to anticipate and counter. But placing spyware over all student messages and content is overkill. The price we pay is that the next generation of Americans is learning to accept life in a total surveillance state.

Imagine a law enforcement agent – an FBI agent, or a detective in a large police department – who wants to track people passing out leaflets. Current technology might use facial recognition to search for specific people who are known activists, prone to such activity. Or the agent could try not to fall asleep while watching hours of surveillance video to pick out leaflet-passers. Or, with enough time and money, the agent could task an AI system with analyzing endless hours of crowd footage, eventually training it to recognize the act of leaflet-passing, probably with mixed results.

A new technology, Vision Language Models (VLMs), is a game-changer for AI surveillance – what a modern fighter jet is to a biplane. In our thought experiment, all the agent would have to do is instruct a VLM system, “target people passing out leaflets,” and she could go get a cup of coffee while it compiled the results. Jay Stanley, ACLU Senior Policy Analyst, in a must-read piece, says that a VLM – even if it had never been trained to spot a zebra – could leverage its “world knowledge (that a zebra is like a horse with stripes).” As this technology becomes cheaper and commercialized, Stanley writes, you could simply tell it to look out for kids stepping on your lawn, or to “text me if the dog jumps on the couch.” “VLMs are able to recognize an enormous variety of objects, events, and contexts without being specifically trained on each of them,” Stanley writes. “VLMs also appear to be much better at contextual and holistic understandings of scenes.” They are not perfect. Like facial recognition technology, VLMs can produce false results. Does anyone doubt, however, that this new technology will only become more accurate and precise with time? The technical flaw in Orwell’s 1984 is that each of those surveillance cameras watching a target required another human to watch that person eat, floss, and sleep – and to try not to fall asleep themselves. VLMs remove that bottleneck, letting the ever-watching cameras flag whatever they are told to look for. 
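Stanley’s point about open-vocabulary recognition is easy to demonstrate with hobbyist tools. The sketch below is purely illustrative, not anyone’s actual surveillance stack: it assumes the Hugging Face transformers and Pillow packages are installed, and the frame filename and query wording are placeholder assumptions of our own. What it shows is that the “describe what you want in a sentence” pattern already exists off the shelf.

    # Illustrative sketch only: plain-English queries against a single image.
    # Assumes the `transformers` and `Pillow` packages are installed and that a
    # local frame named "crowd_frame.jpg" exists (a placeholder file name).
    from transformers import pipeline

    # Zero-shot classification: CLIP scores the frame against categories it was
    # never specifically trained to detect, each described in ordinary language.
    classifier = pipeline("zero-shot-image-classification",
                          model="openai/clip-vit-base-patch32")
    scores = classifier(
        "crowd_frame.jpg",
        candidate_labels=[
            "a person handing out leaflets",
            "people walking past each other",
            "an empty sidewalk",
        ],
    )
    print(scores)  # ranked labels with confidence scores

    # Visual question answering: pose the instruction as a question instead.
    vqa = pipeline("visual-question-answering")  # small default open model
    print(vqa(image="crowd_frame.jpg", question="Is anyone handing out flyers?"))

These toy models are far weaker than the frontier VLMs Stanley describes, which is precisely the worry: if a free download can already be pointed at a single frame with a plain sentence, far more capable systems wired into live camera networks need nothing more than the instruction.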
In 1984, George Orwell’s Winston Smith ruminated: “It was terribly dangerous to let your thoughts wander when you were in a public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide.” Thanks to AI – and now to VLMs – the day is coming when a government official can instruct a system, “show me anyone who is doing anything suspicious.” Coming soon, to a surveillance state near you …

Can the Government Access “An Entire Haystack Because It May Contain a Needle?”

The drafters of the U.S. Constitution could not have imagined Google, Apple, and cell-site technologies that can vacuum up the recorded movements of thousands of people. Still smarting from the British colonial practice of ransacking rows of homes and warehouses with “general warrants,” the founders wrote the Fourth Amendment to require that warrants “particularly” describe “the place to be searched, and the persons or things to be seized.” Courts are still grappling with this issue of “particularity” in geofence warrants – which use technology that analyzes mass data to winnow out suspects. Now a federal court in Mississippi has come down decisively against non-particular searches of location-and-time-based cell tower data.

To reach this conclusion, Judge Andrew S. Harris had to grapple with a Grand Canyon of circuit splits on this question. His opinion is a concise and clear dissection of divergent precedents from two higher circuit courts. Harris begins with the Fourth Circuit Court of Appeals in Virginia, which held in United States v. Chatrie (2024) that because people know that tech companies collect and store location information, a defendant has no reasonable expectation of privacy in it. The Fourth Circuit reached its decision, in part, because Google users must “opt in to Location History” to enable Google to track their locations.

The Fifth Circuit Court of Appeals in New Orleans took the Fourth Circuit’s reasoning and chopped it up for jambalaya. The Fifth drew heavily on the U.S. Supreme Court’s 2018 Carpenter v. United States opinion – which held that the government’s warrantless acquisition of seven days’ worth of location tracking from a man’s wireless carrier was an unconstitutional search. This data, the Supreme Court reasoned, deserves protection because it provides an intimate window into a person’s life, revealing not only his particular movements, but through them his “familial, political, professional, religious, and sexual associations.” Despite a long string of cases holding that people have no legitimate expectation of privacy when they voluntarily turn over personal information to third parties, the U.S. Supreme Court held that a warrant was needed in this case. The Fifth followed up on Carpenter’s logic with a fine distinction in United States v. Smith (2024): “As anyone with a smartphone can attest, electronic opt-in processes are hardly informed and, in many instances, may not even be voluntary.” That court concluded that the government’s acquisition of Google data must conform to the Fourth Amendment. The Fifth thus declared that geofence warrants are modern-day versions of general warrants and are therefore inherently unconstitutional. That finding surely rattled windows in every FBI, DEA, and local law enforcement agency in the United States. 
Judge Harris worked from these precedents when he was asked to review four search-warrant applications for location information drawn from a cell tower data dump. The purpose of the request was not trivial: an FBI Special Agent wanted to see if he could track members of a violent street gang implicated in a number of violent crimes, including homicide. The government wanted the court to order four cell-service providers to produce data covering 14 hours for every targeted device. Judge Harris wrote that the government “is essentially asking the Court to allow it access to an entire haystack because it may contain a needle. But the Government lacks probable cause both as to the needle’s identifying characteristics and as to the many other flakes of hay in the stack … the haystack here could involve the location data of thousands of cell phone users in various urban and suburban areas.” So Judge Harris denied the warrant applications.

Another court in another circuit might well have come to the opposite conclusion. Such a deep split on a core constitutional issue is going to keep delivering contradictory rulings until it is resolved by the U.S. Supreme Court. In the meantime, Judge Harris – a graduate of the University of Mississippi Law School – brings to mind the words of another Mississippian, William Faulkner: “We must be free not because we claim freedom, but because we practice it.”

Withdraw $200 from an ATM and You Might Just Be a Target of Federal Financial Surveillance
3/18/2025
If you are walking the streets of Laredo, Texas, and you withdraw $200 from your account at an ATM, under a new rule your personal identifying information will soon be dispatched to the Financial Crimes Enforcement Network (FinCEN) of the U.S. Treasury Department. The same would happen if you withdrew $200 in 30 zip codes in El Paso, or in Cameron, Hidalgo, Maverick, or Webb counties in Texas, or in San Diego and Imperial counties in California. In all, this new regulation announced by the U.S. Treasury Department will require banks to report Americans for the supremely suspicious act of withdrawing $200. These consumers will then become the subjects of Currency Transaction Reports along the U.S.-Mexican border.

The impetus, says the agency, is “deep concern with the significant risk to the U.S. financial system of the cartels, drug traffickers, and other criminal actors along the Southwest border.” But $200 sounds like a measly threshold for coyotes who charge illegal immigrants thousands to cross the border, and for drug cartels that often make deals with barrels of cash. A $200 withdrawal certainly doesn’t sound like a risk to the U.S. financial system – or a likely indication of criminal activity. But it is no surprise that the bureaucracy is taking advantage of President Trump’s reasonable designation of international drug cartels as terrorist organizations.

FinCEN has long been at the center of efforts to make financial surveillance of Americans comprehensive. This is the same agency that worked with the FBI to encourage financial institutions across the country to scour their data and file Suspicious Activity Reports without any clear criminal nexus. Suspicious activities that could have made an American a surveillance target under that now-discontinued program included merely shopping at certain stores, like Dick’s Sporting Goods or a Bass Pro Shop. Perhaps the feds also included laughing at Jeff Foxworthy jokes as a basis for surveillance – on the theory that if you are buying Dick’s camo shorts, you just might be a redneck.

But this is not a joke. More than one million Americans will soon be unable to withdraw a very modest sum of money without being subjected to the same reporting requirements and surveillance risk under the Bank Secrecy Act as those who make $10,000 cash withdrawals in the rest of the country. The larger issue is why any American should be subjected to warrantless surveillance for withdrawing a dime of his or her own hard-earned money. The basic concept is hard to square with the Fourth Amendment. This is a dispiriting sign that the financial surveillance of the American people continues, and even increases, unabated. Nicholas Anthony of the Cato Institute, who broke this story, noted that Americans were upset when the previous administration lobbied Congress for the authority to surveil bank accounts with just $600 in activity. While that proposal never passed, Treasury’s new rule now subjects one million Americans living in a wide swath of the country to surveillance at just a third of that amount. Perhaps the best withdrawal would be a revocation of this new rule.

Is It a Felony to Ask for Pictures of Your License Plate?

Here’s a philosophical question for you: If no one searches for the information stored in a database, does that mean the information doesn’t exist? It may be right there – where Column 32 meets Row 743 – but if no one has executed a search, has it been “found” or “seen” yet? Does it even exist? Now hang on to that curious idea for a moment and we’ll circle back. 
Recall that we recently commended the nonprofit periodical Cardinal News for publishing an investigative series on the growing use of surveillance technology by local police in Southwestern and South Central Virginia. As part of their investigation, Cardinal News reporters drove through nearly 20 cities, towns, and counties, then used Virginia’s Freedom of Information Act (FOIA) to request the video surveillance data collected on their vehicle. And what was the result of these FOIA requests?
The city of Roanoke and the Botetourt County Sheriff want the City Circuit Court to rule whether they “really have to” provide the data Cardinal News requested. In their complaint, Roanoke and the Botetourt Sheriff make three less-than-compelling arguments:
A final note: As Cardinal News points out, Virginia law says computers can’t be used to gather identifying information – i.e., account numbers, credit card numbers, biometric data, fingerprints, passwords, or other truly private information. “That’s what the statute is protecting,” the newspaper argues. In other words, the law is not meant to protect you from your own license plate number. Where does such chutzpah come from? This FOIA response perhaps shows that local government is learning from the mental gymnastics and rhetorical sleights of hand that federal agencies have mastered in fobbing off lawful requests. We look forward to seeing how these too-clever-by-half arguments fly in front of a Virginia judge. Stay tuned.

It seems that China is excelling of late in the artificial intelligence arena, and we’ll cover two such instances today. The first is the launch of the game-changing large language model DeepSeek, which turned its Western competitors on their ears. Faster, less expensive, and more customizable than the rest, it is also brazenly forthright about its lack of privacy protections. As Zak Doffman of Forbes points out in his cybersecurity analysis of DeepSeek, buried deep within the product’s Privacy Policy are declarations like this: “The personal information we collect from you may be stored on a server located outside of the country where you live. We store the information we collect in secure servers located in the People's Republic of China.” As for what they collect, specifically, Doffman says they are unambiguous: everything. See for yourself in detail. And to think we worried about TikTok. “Just ask what a powerful AI engine in state hands could do with all that personally identifiable information,” Doffman muses. “This is strategic in a way TikTok never was.”

The second instance of this “you can’t spell CHINA without an ‘A’ and an ‘I’” moment is an update on a phenomenon about which Kay Firth-Butterfield, CEO of Good Tech Advisory, recently reminded us: China is building the AI that powers your children’s toys. From robotic pets to interactive storytelling dolls to remote-control vehicles, AI toys are a market segment on target to grow to $40 billion over the next seven years. Laurent Belsie of The Christian Science Monitor found himself casting a wary eye on the whole scene as Christmas approached last year. Some of the growth will be obvious – last year it was Poe the AI Story Bear – but Belsie reports that within two years many makers will have stealthily added AI capabilities to their existing toys. What does all of this have to do with China? Upwards of 80 percent of the world’s toys and their components are currently manufactured there. So when AI comes for (er, to) your children’s toys, it’s likely to be of Chinese design as well. And all the data generated by interactive, conversational – even potentially camera-equipped – AI toys has to be stored somewhere, as experts like Firth-Butterfield remind us. Where, exactly, is increasingly coming into focus. It’s one thing if adults are profligate with their own data (downloading DeepSeek so quickly that it became the top free app on Apple’s App Store within a week of its release, for example). It is another when it comes to the privacy of children.

EFF Touts New Rayhunter Detector

We’ve long followed the reliance of federal, state, and local law enforcement on stingrays. 
These are devices that simulate cell phone towers to fool nearby devices into connecting and giving up everything – texts, calls, emails, and more, along with the location of the cellphone and information about the user. Law enforcement uses stingrays to target specific criminals, but the problem is – as is so often the case with surveillance technologies – that the data of everyone in the vicinity gets swept up, including that of peaceful protesters. These sweeps pose a direct threat to the most precious rights Americans have – the First Amendment rights to free speech and to petition the government for a redress of grievances. Protests are not some Sixties-style fad that never went away. The right to protest is as home-grown as the Boston Tea Party, the Million Mom March, and the March for Life. Yet there are numerous reports of stingrays and similar technologies being used by authorities to clandestinely spy on large-scale public protests. Most disturbing is the FBI’s insistence on keeping any use of a stingray in specific cases a state secret. Based on documents obtained through PPSA Freedom of Information Act requests, we know that the FBI has used nondisclosure agreements to force local jurisdictions to hide the fact that stingrays were used, even in open court.

Now, thankfully, the Electronic Frontier Foundation has gone beyond protesting and filing court briefs to work with technologists willing to roll up their sleeves and get out the soldering iron. EFF is presenting an open-source tool to help detect stingray use. The aptly named Rayhunter will set you back only about $30, which is the cost of the hardware – the Orbic RC400L hotspot you’ll need (check Amazon, eBay, or any of your geeky uncles). Once it’s in hand, simply follow the instructions on EFF’s open-source Rayhunter website. As Rayhunter gets out into the market, protesters of all stripes will be able to know if their First Amendment-protected activities are being surveilled – and to livestream the results.

Other steps should follow. FBI Director Kash Patel or Congress should mandate full disclosure about the origin of all evidence collected by a stingray and presented in court against a criminal defendant. Every American has the right to face his or her accuser and to be confronted with the evidence against them, even when that evidence is digital and the result of proprietary technology. For now, let us applaud the Electronic Frontier Foundation for giving Americans the all-too-rare chance to answer the question, “Am I being surveilled?” At the very least, Americans engaging in their First Amendment-protected right to protest can know if the government is turning their own phones against them.

Rep. Davidson, Sen. Tuberville Reintroduce Bill to Free Small Businesses from Invasive Overreach

As we’ve reported, the Corporate Transparency Act (CTA) requires owners of America’s 33 million small businesses to report detailed personal data on anyone with at least a 25 percent stake in their company. This law represents that most dangerous of all mixtures – overreach and nonsense. The stated purpose of this law is to catch crooks. So the ownership disclosure requirement in effect says: “Dear Terrorist (or Cartel Member or Money Launderer), would you kindly tell us who owns at least 25 percent of your company? Having this information would make building a case against you so much easier. 
So please check this box if you’re a criminal. – Sincerely, the Feds.” Such unassailable logic reminds us of the old standup routine that advises people to check their closets before bedtime for a possible axe murderer while he’s still hiding. Do that and you will be safe... somehow.

Fortunately, the CTA’s days may be numbered. Rep. Warren Davidson (R-OH) has reintroduced what he calls the “Repealing Big Brother Overreach Act.” (A better name might be the “Repealing the ‘Do You Think Criminals Are That Stupid?’ Act.”) Not only does the Corporate Transparency Act fail to accomplish what it sets out to do (catch criminals), it also targets a completely irrelevant group in the process – the average American small business owner, forcing him or her to register with a massive federal database that can be accessed without a warrant. Your local barbershop, accountant’s service, and gym are the targets. Big businesses, financial entities, and more are exempt from the CTA’s provisions, which threaten only small business owners with large fines and two years in prison if they don’t comply. It doesn’t make sense that you can stop terrorists, drug dealers, and money launderers by going after honest small businesses. If this “beneficial ownership” provision ever went into effect, it is highly likely that the first fines and prosecutions would be brought against honest business owners who missed the filing deadline rather than against a terrorist or money launderer. PPSA believes that the government’s insatiable hunger to track ordinary Americans is the real intent behind this law. This is all in keeping with the recent extension of surveillance over Americans’ financial transactions.

In the meantime, and thanks to a flurry of back-and-forth court rulings (see our filing before the 11th Circuit Court of Appeals) as well as new guidance from the Treasury Department, reporting beneficial ownership information is currently voluntary. As of today, no penalties will be associated with failing to report. Treasury is also recommending a rule revision that limits the reporting requirements to foreign entities only. The stars seem to be aligning in favor of Rep. Davidson’s bill, with Alabama Republican Tommy Tuberville sponsoring it in the Senate. If this bill makes it to the Resolute Desk, President Trump is all but certain to sign it. But now is the time to keep the pressure on. Let your representatives in the House and Senate know that you support the “Repealing Big Brother Overreach Act.”

Americans value privacy in the marketplace when we vote with our dollars no less than when we go behind the curtains of a polling booth. Now imagine if every dollar in our possession came with an RFID chip, like those used for highway toll tags or employee identification, telling the government who had that dollar in their hands, how that consumer spent it, and who acquired it next. That would be the practical consequence of a policy proposal now being promoted in Washington, D.C., to enact a Central Bank Digital Currency (CBDC). Some have recently asked Congress to attach such a currency to the Bank Secrecy Act, to enable surveillance of every transaction in America. Such a measure would end all financial privacy, whether a donation to a cause or money sent to a friend. “If not designed to be open, permissionless, and private – resembling cash – a government-issued CBDC is nothing more than an Orwellian surveillance tool that would be used to erode the American way of life,” said Rep. Tom Emmer (R-MN). 
This would happen because a CBDC is a digital currency issued on a digital ledger under government control. It would give the government the ability to surveil Americans' transactions and, in the words of Rep. Emmer, "choke out politically unpopular activity."

The good news is that President Trump is alert to the dangers posed by a CBDC. One of his first acts in his second term was to issue an executive order forbidding federal agencies from exploring a CBDC. But the Washington, D.C., bureaucracy's hunger for close surveillance of Americans' daily business is near constant, and there is no telling what future administrations might do. Rep. Emmer has reintroduced his Anti-Surveillance State Act to prevent the Fed from issuing a CBDC, either directly or indirectly through an intermediary. His bill would also prevent the Federal Reserve Board from using any form of CBDC as a tool to implement monetary policy, and it ensures that the Treasury Department cannot direct the Federal Reserve Bank to design, build, develop, or issue a CBDC.

Prospects for this bill are good. It passed the House in the previous Congress, and it doesn't hurt that Rep. Emmer is the House Majority Whip or that the measure neatly fits President Trump's agenda. So there is plenty of reason to be hopeful that Americans will be permanently protected from a surveillance currency. But well-crafted legislation alone won't prevent the federal bureaucracy from expanding financial surveillance, as it has done on many fronts. PPSA urges civil liberties groups and Hill champions of surveillance reform, of all political stripes and both parties, to unite behind this bill.

We're not sure which is most disconcerting: that Meta has a division named Global Threat Disruption, that its idea of global threats includes deepfake celebrity endorsements, or that this has become its excuse to reactivate the controversial facial recognition software it shelved just three years earlier (so much for the "Delete" key). Meta has relaunched DeepFace to defend against celebrity deepfakes in South Korea, Britain, and even the European Union. "Celeb-baiting," as it's known, is a scheme in which scammers populate their social media posts with images or AI-generated video of public figures. Convinced that they're real – that Whoopi Goldberg really is endorsing a revolutionary weight loss system, for example – unwitting victims fork over their data and money with just a few clicks. All of which, according to Meta, "is bad for people that use our products."

Celeb-baiting is a legitimate problem, to be sure. We're no fans of social media scammers. What's more, we know full well that "buyer beware" is meaningless in a world where it is increasingly difficult to spot digital fakes. But in reviving its facial recognition software, Meta may be rolling out a cannon to kill a mosquito. The potential for collateral damage inherent in this move is, in a word, staggering. Just ask the Uighurs in Xi's China.

Meta began tracking the faces of one billion users in 2015. And initially, it didn't bother to tell people the technology was active, so users couldn't opt out. Citing Meta's sleight of hand, as well as its own strict privacy laws, the EU cried foul and barred DeepFace from being implemented. But that was years ago … and how times have changed. The privacy-minded Europeans are now letting Meta test DeepFace to help public figures guard against their likenesses being misused. But can regular users be far behind?
Meta could rebuild its billion-face database in no time. For its part, the U.K. is courting artificial intelligence like never before, declaring that it will help unleash a "decade of national renewal." Even for a country that never met a facial recognition system it didn't love, this feels like a bridge too far. We have written about the dangers, both real and looming, of a world in which facial recognition technology has become ubiquitous. When DeepFace was shelved in 2021, it represented an almost unheard-of reversal, in effect putting the genie (Mark Z, not Jafar) back in the bottle. That incredibly lucky bit of history is unlikely to repeat itself. Genies never go back in their bottles a second time.

"We are open for business," declared Beth Williams, the only board member currently serving on the five-seat Privacy and Civil Liberties Oversight Board (PCLOB). "Our work conducting important oversight of the intelligence community has not ended just because we are currently sub-quorum." A more accurate description for the board would be "solum unum." One of the first acts of the Trump Administration was to fire the Democratic PCLOB members, leaving Republican Williams by herself. Perhaps anticipating this, PCLOB's members adopted new rules shortly before the election that allow any remaining board members – aided by the body's professional staff of lawyers, policy analysts, and technologists – to continue publishing the board's recommendations to the intelligence community and to share them with Congress and the public.

In a recent speech, Williams spelled out commendable goals for her PCLOB of one.

Censorship: "Tying disfavored speech to counter-terrorism paves the way for censorship under the guise of national security," Williams said. She complained that the Department of Homeland Security under Secretary Alejandro Mayorkas had been slow in responding to her requests for detailed information about the activities of the department's Orwellian-sounding "Disinformation Governance Board." Williams added: "I am hopeful that our renewed efforts with the current Administration will yield more transparency."

Facial Recognition in Airports: Williams promises to weigh the operational benefits of this technology against privacy and civil liberties concerns.

Debanking: As with censorship, Williams says she is concerned about the government conflating "disfavored persons" with terrorism, leading to the "debanking" of people and organizations.

The Consolidated Audit Trail: Without any statutory basis, the Securities and Exchange Commission under former Chairman Gary Gensler assembled a database that monitors the identity, transactions, and investment portfolios of everyone who invests in the stock market. "Government surveillance of Americans' financial activities – especially in the name of counter-terrorism – is ripe for oversight," Williams said.

Section 702: PPSA has long worked to ensure that the Fourth Amendment's warrant requirement applies to Americans whose communications are incidentally caught up in Section 702 of the Foreign Intelligence Surveillance Act. But Williams and her former colleague Richard DiZinno dissented in 2023 from the PCLOB Democratic majority's support for a warrant requirement. Williams has previously called for "structural and cultural reforms" to the way the FBI accesses Americans' information.
The FBI has since tightened its Section 702 querying procedures, and Congress has enacted reforms increasing the FBI's reporting obligations to lawmakers. Williams appears content that these changes are enough to put Section 702 concerns to rest. We disagree. The FBI reviewed Americans' communications 3.4 million times a few years ago, and more than 200,000 times in the most recent report. The bureau has accessed the personal information of Members of Congress, political donors, and journalists without a warrant. "Is 200,000 warrantless queries better than 3.4 million warrantless queries?" Elizabeth Goitein of the Brennan Center for Justice's Liberty and National Security Program told The Washington Post in 2023. "When you ask the question, you get a sense of how warped the universe we're in is – that somehow 200,000 warrantless searches a year are an acceptable number." At the very least, we hope Williams will see that this is a valid perspective.

PPSA hopes that Beth Williams – lacking colleagues as sounding boards – will reach out to the civil liberties community to hear the perspectives and questions that would have come from her departed peers. Board Member Williams, can we meet?