The House Intelligence Committee recently held an open hearing on commercial cyber surveillance, also known as “mercenary spyware.”
The hearing focused on new threats posed by privately made, foreign-developed spyware that is bringing capabilities long associated with top-tier nation-states to smaller countries and the private sector. PPSA has previously reported on one such foreign spyware: the spreading “zero-click,” Israeli-developed Pegasus.
Pegasus can insert itself seamlessly into a smartphone without a single click or action from the victim. From there, it can watch you through your camera, listen to you through your microphone, copy your messages, record your calls, extract all your images, and follow your movements. In just a few years, Pegasus has been acquired by dozens of countries and entities, from Saudi Arabia to Mexican cartels, and has already been used to deadly effect against dissidents and journalists. It represents the most sophisticated and widely available form of spyware yet developed.
Among the hearing’s witnesses was John Scott-Railton, a senior researcher at The Citizen Lab of the University of Toronto's Munk School of Global Affairs & Public Policy. His testimony provided a stark picture to Congress:
Scott-Railton testified (see the 18:50 mark), “Your phone can be on your bedside table at two in the morning. One minute, your phone is clean. The next minute, the data is silently streaming to an adversary a continent away. You see nothing.” These are, he added, “capabilities available only to a handful of nation-states … It is too late,” he said, “to put the tech back into the bottle, and so we must take strong action now…”
Another witness was Carine Kanimba, an American citizen born in Rwanda. Her testimony (29:05) details the story of her stepfather, Paul Rusesabagina, portrayed by Don Cheadle in Hotel Rwanda. Rusesabagina was the manager of the Hôtel des Mille Collines in Kigali during the Rwandan genocide. He used the hotel to save more than a thousand refugees. Later, he and his family fled to the United States. Rusesabagina became a public speaker and was critical of the human rights violations of the Rwandan government and of the Rwandan President Paul Kagame. In August 2020, Kanimba’s stepfather was surveilled in the United States by the Rwandan government and lured from the family home in Texas. Rusesabagina was kidnapped in Dubai, transferred to Kigali, tortured, tried, and sentenced to 25 years in prison. Kanimba became a vocal and effective activist about the abduction of her stepfather.
In February 2021, Carine Kanimba was notified (33:11) by forensics experts that her smartphone had been infected by Pegasus.
“I was mortified, and I am terrified,” she said. The forensics report showed “the spyware was triggered as I walked in with my mom into a meeting with the Belgian Minister of Foreign Affairs. It was active during the calls with the U.S. Presidential Envoy for Hostage Affairs team and the U.S. State department, as well as U.S. human rights groups.”
Not only was Kanimba’s phone infected, but so was the phone of her cousin with whom she lives.
“I am frightened by what the Rwandan government will do to me and my family next,” she said. “It keeps me awake that they knew everything I was doing. Where I was, who I was speaking with, my private thoughts and actions, at any moment they wanted. Unless there are consequences for countries and their enablers which abuse this technology, none of us are safe.”
The threat posed by mercenary spyware companies and their malware is too serious to ignore.
“It has taken us too long to have this conversation,” Scott-Railton concluded. His testimony included several suggestions for Congress (22:15).
Being called out by the People’s Republic of China for illicit surveillance is a bit like being accused of swindling by Charles Ponzi.
Chinese state media seized on a recent report, based on an exhaustive two-year study by the Center on Privacy and Technology at Georgetown Law, revealing that U.S. Immigration and Customs Enforcement (ICE) is the latest federal agency to buy vast quantities of Americans’ personal data from utilities and state motor vehicle departments.
As PPSA has previously reported, the Center on Privacy and Technology found that ICE has used facial recognition technology to search the driver’s license photographs of 1 in 3 adults in the United States. ICE has access to the driver’s license data of 3 in 4 American adults and tracks the movements of cars in cities that are home to nearly 3 in 4 adults. And when adults in our country connect to gas, electricity, phone or internet service, ICE will automatically pick up the new addresses of 3 out of 4 Americans.
“The U.S. is the No. 1 empire in hacking, eavesdropping and stealing secrets,” said Zhao Lijian, spokesman for China’s Ministry of Foreign Affairs, on Monday. “This is an irrefutable fact and a brilliant satire of the U.S. boasting about human rights, the rule of law and rules.”
That is rich. China has installed a pervasive national system that uses artificial intelligence to weave together cameras in public and private spaces, facial recognition, sound recorders with voice recognition, and Orwellian “social credit scores” to create what scholars call the Chinese Panopticon.
It is galling to be attacked for abuses by a regime that keeps its citizens under such pervasive surveillance. But the hypocrisy of China’s bee sting does not quite pull out the stinger.
In the United States, at least 16 U.S. federal agencies and 75 local and state agencies employ “stingray” devices that mimic cell towers to compromise the information in cellphones within wide areas. As many as 3,000 local and state agencies rely on facial recognition technology. Federal agencies routinely sidestep the Fourth Amendment requirement to obtain a probable cause warrant to scan our personal information by purchasing it from shadowy, private data brokers.
And when all else fails, U.S. intelligence agencies claim to be able to perform any surveillance they deem necessary for national security not under any law, but under a presidential directive, Executive Order 12333.
Much of this information is used by the government to catch illegal aliens, predatory criminals, terrorists, and spies (most of them, by the way, from China). None of it will be used to put ethnic minorities in concentration camps, imprison men and women of conscience for challenging the regime’s lack of democracy, or grade us on our willingness to scroll through the Dear Leader’s turgid thoughts.
But we should take stock – the state of surveillance in the United States is a lot more like China’s than we’d like to admit. Absent reasonable legal reforms and guidelines, we could well be on our way to a Chinese Panopticon-lite.
Tenth Circuit on Right-to-Record in Irizarry v. Yehia
The Fourth Amendment grants us protection against intrusive surveillance. The First Amendment, in turn, grants us the right to observe public actions by public authorities. The emergence of the cellphone demonstrates the integral nature of these two sets of rights. Courts are increasingly interpreting the First and Fourth Amendments regarding cellphones to the advantage of citizens over government, a victory for civil liberties in law if not always in practice.
The U.S. Supreme Court in Riley v. California (2014) held that the police violate the Fourth Amendment when they try to gain warrantless access to the voluminous personal information inside our cellphones. On the other hand, the First, Third, Fifth, Seventh, Ninth, and Eleventh Circuit Courts of Appeal have upheld the right to record police officers going about their public duty, a right recognized as critical to the protections of the First Amendment.
Last summer, PPSA reported on the continued holdout stance by the U.S. Tenth Circuit Court of Appeals against the right to film police officers. Despite the weight of six other Courts of Appeal, the Tenth Circuit continued to insist that there was no “clearly established” right. In a recent ruling, however, the Tenth Circuit came close to fully joining its judicial peers by dropping its draconian opposition to the right to record in the case of a self-identified journalist and blogger. On July 11, the court ruled in Irizarry v. Yehia in favor of a right to record.
The incident in question occurred early in the morning of May 26, 2019, when blogger Abade Irizarry began filming a DUI traffic stop in Colorado. According to the ruling of the court, “Officer Ahmed Yehia arrived on the scene and stood in front of Mr. Irizarry, obstructing his filming of the stop. When Mr. Irizarry and a fellow journalist objected, Officer Yehia shined a flashlight into Mr. Irizarry's camera and then drove his police cruiser at the two journalists.”
PPSA welcomes the court’s adjustment on the right to record police activity, fundamental to the First Amendment and to Americans’ ability to protect themselves in court against potential police misconduct. The Tenth Circuit specifically cited the rulings of other Courts of Appeal, indicating that the right to record may be gaining traction, especially amid the public backlash against police misconduct in the wake of the killing of George Floyd.
PPSA urges courts to interpret the First and Fourth Amendments in ways that reinforce these rights. They are not in competition. There is – and should be – a lopsidedness in the law. Citizens are free to film the police on official duty. But the police must obtain a warrant to search our cellphones.
In a free society that holds authority accountable, that is as it should be.
The U.S. Supreme Court held in Riley v. California in 2014 that cellphones are not like other objects. The texts, emails, instant messages, online searches, and apps inside a phone can reveal just about everything about us, what the Court called “the privacies of life.”
The Court ruled that the police need to obtain a probable cause warrant to investigate a suspect’s cellphone. But what are the rules if the cellphone is abandoned or thrown away? Courts are currently applying the law governing ordinary abandoned objects to cellphones.
This question arises from the case of a Virginia man, Antonio Daren Futrell, who realized that he had left his cellphone inside a restaurant. He tried to retrieve it, but it was past closing time, and the employees wouldn’t let him back in. There was an altercation and, long story short, Futrell was later convicted of firing a gun at a security guard before fleeing the scene. When the police found Futrell’s phone inside the restaurant – the phone now considered abandoned because Futrell had fled – they were able to access it because Futrell had not protected it with a passcode.
Now lawyers for Futrell have filed a petition asking the U.S. Supreme Court to clarify the question of whether a police officer who finds a discarded phone has free access to anything inside it.
“If you throw your phone away or discard it or trade it in, police can do whatever they want,” Brandon Boxler, one of Futrell’s attorneys, told The Daily Press of Newport News. “They can access your emails, your bank records, your phone calls, text messages, photos – everything is fair game that’s on the phone.”
Futrell’s petition challenges Hester v. United States, a 1924 case in which the Supreme Court allowed the warrantless search of a moonshine bottle a suspect threw away. The Court later applied that doctrine to objects as disparate as a pencil and drug paraphernalia thrown in the trash.
“Cellphones are different,” Boxler wrote in the Daily Press in 2021. “They have massive storage capabilities. A search of a cellphone involves a much deeper invasion of privacy. The depth and breadth of personal and private information they contain was unimaginable in 1924.
“We use cellphones as cameras, personal assistants, navigation devices, web browsers, and everything in between,” Boxler wrote. “And with advances in cloud computing, cellphones can access years – if not decades – of bank records, medical records, emails, location data, and other sensitive information. Can anyone really ‘abandon’ this information, even if they discard a cellphone?”
While the chances the Supreme Court will take up this petition are remote, Futrell’s attorneys were heartened last Thursday when the Court asked the Virginia Attorney General’s office to respond to the petition.
Last week, the media was astir with reports that videos from Amazon’s Ring doorbell cameras were shared with police without their owners’ permission. The company insists that it did so in eleven extreme cases this year, in response to situations in which life and limb were endangered.
This may fly in the face of company policy stating that police can’t view recordings unless the footage is posted publicly or intentionally shared. But the low number of such incidents, revealed in a letter by an Amazon VP of public policy to Sen. Edward Markey (D-MA), suggests the company is being upfront. To be fair, the media would be ablaze if Amazon had stood by and allowed someone to be beaten to death.
The biggest issue with Amazon Ring is not that it ignores the need to seek the permission of its customers to share videos with police. The bigger problem is that this network of more than three million online cameras across the United States encourages its customers to voluntarily provide for the surveillance of entire neighborhoods. One message from the company to its customers reads: “If you would like to take direct action to make your neighborhood safer, this is a great opportunity.”
The company has agreements with 2,161 law enforcement agencies to access an app called Neighbors, a social media platform in which owners can post Ring camera footage and leave comments. The transformation of home security into a venue for social media encourages users to post videos online – all of it available to law enforcement “partners.”
Even more worrying, Amazon’s agreements with law enforcement allow officers to solicit Ring doorbell footage from customers for entire neighborhoods. Such video and audio surveillance may be fine for the customer, but what about passersby? And while the number of incidents in which footage was shared without permission currently remains low, what about the capacity for future abuse by Amazon and law enforcement?
It is concerning that all it would take for Ring cameras to become a form of constant mass surveillance would be a change of one company’s policy.
ACLU FOIA Lawsuit: Department of Homeland Security Collects 15 Billion Cellphone Locations Every Day
The American Civil Liberties Union performed an invaluable service for the American people today by releasing records from Department of Homeland Security agencies that demonstrate the sweep of the government’s routine violation of the Fourth Amendment by purchasing Americans’ personal data from data brokers.
The ACLU’s Freedom of Information Act lawsuit targets DHS agencies including Customs and Border Protection, Immigration and Customs Enforcement, the U.S. Secret Service, and the U.S. Coast Guard. The lawsuit is ongoing, but these first disclosures are eye-popping.
The ACLU lawsuit reveals, among other findings, that DHS agencies have purchased access to as many as 15 billion cellphone location points every day.
“ACLU’s findings should concern every American with a cellphone,” said Bob Goodlatte, former Chair of the House Judiciary Committee and now Senior Policy Advisor to PPSA. “ACLU’s determined effort to expose the scale of government intrusion into our privacy is a monumental public service. With the House and Senate now holding hearings into these practices, Congress has every reason to require warrants to intrude into our digital lives by passing the Fourth Amendment Is Not for Sale Act.”
Bob Goodlatte will testify on the government’s practice of buying Americans’ personal data tomorrow before the House Judiciary Committee.
The New York Times today reports that a hacker who calls himself ChinaDan is offering to sell the personal data of more than 1 billion Chinese citizens collected by the government for 10 Bitcoin, or about $200,000.
We’ve often commented on the Chinese Panopticon, in which the government is integrating facial recognition software and location tracking with pervasive surveillance of social media posts and social connections to assemble complete digital dossiers on China’s people. This does not mean, however, that the Chinese state is a monolith of competence. “Although the country has been at the forefront of collecting masses of information on its citizens, it has been less successful in securing and safeguarding that data,” The Times reports. It added that the Chinese government is deeply concerned about its “leaky data industry.”
This is a solid story in which Times reporters diligently tested a large sample of the data, including making phone calls to households targeted by the leak to verify that the sample is accurate. But there is one major perspective missing from this story.
ChinaDan broke Chinese law by hacking a Shanghai police database to get this data. In China’s oppressive regime, he is likely risking his life for a payout. But if he were an American citizen, ChinaDan could own a private data brokerage company that could legally buy this data from major apps and social media companies and then sell our personal information to any private entity, or to any number of agencies of the United States government. He might also be able to legally sell Americans’ personal data in the other direction, to China.
Sensitive data points sold on digital markets include our locations, browsing histories, and demographic details, all captured to update a precise digital portrait of our interests, beliefs, actions, and movements. This information is then shared with hundreds of bidders in a digital auction.
Companies use this “bidstream” data to create a digital dossier that can predict our behaviors, map our past actions, and reveal our personal relationships.
In the United States, no hacking into a police database is necessary to obtain this data on the open market. By opening the federal wallet, the Defense Intelligence Agency, the Department of Homeland Security, the IRS, and other agencies enjoy warrantless access to our most personal information. The government asserts that the Fourth Amendment’s requirement for a probable cause warrant need not be respected if the government simply buys our data.
While The Times worries about the security of Chinese citizens, we might take a moment to realize that in this one respect things at home are even worse.
Letter Follows Leak About “Backdoor to Access User Data”
TikTok’s growth in the American market is explosive. In the first quarter of 2022 alone, the short-video app was downloaded 19 million times. In all, TikTok has about 80 million U.S. users – about one-fourth of the U.S. population.
TikTok is owned by a Chinese company, ByteDance. This is a matter of worry in Washington, since Chinese law explicitly requires Chinese companies to share information with the intelligence agencies of the Chinese Communist Party. Given the massive torrents of data TikTok collects on its users, the Trump Administration proposed forcing ByteDance to sell off TikTok. The company allayed these concerns by promising to seal off any sensitive data from China. With TikTok storing U.S. user data in the United States, with backups in Singapore, the issue seemed resolved.
Then on June 17, BuzzFeed News reported that leaked audio from dozens of TikTok internal meetings revealed that employees expressed concern that U.S. user data was being accessed by China.
“I feel like these tools, there’s some backdoor to access user data in almost all of them,” one external auditor said.
“Everything is seen in China,” said a member of TikTok’s “trust and safety” department.
Another insider referred to a Beijing-based engineer as a “Master Admin” who has “access to everything.”
Now Brendan Carr, Republican member of the Federal Communications Commission, has written to CEOs Tim Cook and Sundar Pichai to ask that Apple and Google remove TikTok from their app stores.
Carr wrote that TikTok’s image as “an app for sharing funny videos or memes” is just “the sheep’s clothing.” He added:
“TikTok collects everything from search and browsing histories to keystroke patterns and biometric identifiers, including faceprints – which researchers have said might be used in unrelated facial recognition technology – and voiceprints. It collects location data as well as draft messages and metadata, plus it has collected the text, images, and videos that are stored on a device’s clipboard.”
TikTok has already paid $92 million to settle lawsuits alleging that it clandestinely transferred vast amounts of Americans’ user data to China. India has banned TikTok, as have the U.S. military and many federal agencies. With so much criticism, TikTok is now working furiously on a project, codenamed “Project Texas,” to work out a deal with Oracle and the Committee on Foreign Investment in the United States to exclusively store sensitive user information on U.S. servers.
Commissioner Carr remains skeptical. “TikTok has long claimed that its U.S. user data has been stored on servers in the U.S., and yet those representations provided no protection against the data being accessed from Beijing.”
Some might find Carr’s demands to private businesses to be heavy handed. Whether or not Apple and Google should remove TikTok from their app stores, at a minimum Americans need to be aware that their personal data might be at risk.
When it comes to digital privacy, Americans feel like a well-dressed person caught in the rain without an umbrella. At first, you try to wait it out under an eave. Then you accept getting a little bit wet. Finally, when your clothes are thoroughly soaked, you just give up.
When it comes to digital privacy, Americans have long accepted we couldn’t get any wetter. The social media services and apps we use track and sell our location history, our contacts, our communications, our purchases and (most revealing) our web searches. These data points, like the dots in a pointillistic painting, create a portrait of users with great detail. These portraits are then sold by data brokers to government agencies and commercial entities.
A recent Apple commercial portrayed this process by putting a young woman’s virtual self on an auction block. In the ad, the heroine Ellie turns on Apple’s privacy devices, vaporizing her would-be auctioneers. But such controls on a smartphone only involve a small portion of the torrents of information that are collected about us and sold wholesale.
So just when many are ready to declare the death of privacy, a bicameral, bipartisan group of legislators has put forward a discussion draft of the American Data Privacy and Protection Act (ADPPA). In a House hearing on Tuesday morning, this bill drew robust discussion from civil rights groups, digital reformers, and industry-allied organizations. This legislation is the first attempt at a comprehensive, national approach to, in the words of House Energy and Commerce Committee Chairman Rep. Frank Pallone, putting “consumers back in control of their data and protecting their privacy.”
Under ADPPA, companies would have to obtain consumers’ consent to collect, process, or transfer sensitive personal information. Affirmative consent would be required before the data of children between ages 13 and 17 could be transferred. The Federal Trade Commission (FTC) would form a Youth Privacy and Marketing Division to police the use of children’s data.
Best of all, the shadowy world of data brokers would be exposed to sunlight, with a public online registry created by the FTC and third-party audits of how these brokers share information with others.
ADPPA would preempt some state privacy laws, while granting exemptions for the Illinois Biometric Information Privacy Act (recently used to extract a sweeping settlement over the privacy practices of facial recognition provider Clearview AI) and California’s Privacy Rights Act. Other states with recent privacy laws are preempted, which Govtech.com writes “reeks of backroom dealing.”
The current draft includes a limited private right of action, which would allow individuals to bring suits for privacy violations after giving industry four years to adjust. Federal Trade Commission enforcement would be strengthened, and state attorneys general would be empowered to act against data holders who violate ADPPA. Companies would be given a limited right to cure a problem, which would give them standing to seek injunctive relief.
The discussion that took place in the House Subcommittee on Consumer Protection and Commerce reveals serious legislation with major issues to resolve. Here are a few of them.
How far should preemption of state privacy laws go?
Colorado, Texas, Virginia, Utah, and Connecticut have passed their own privacy laws. Will they eventually be excluded from preemption along with those of California and Illinois? If they are, do we run the risk of balkanizing the internet?
“American consumers and businesses deserve the clarity and certainty of a single federal standard for privacy,” said former FTC Commissioner Maureen Ohlhausen.
Can we protect personal data by degrees of sensitivity without degrading the ability of digital commerce to function?
One goal of the bill is data minimization, which tasks companies with collecting and using only the data needed for a given transaction. But can a law define the limits of what is needed?
John Miller of the Information Technology Industry Council noted that one provision, covering “information identifying an individual’s online activities over time or across third party websites or online services,” could create restrictions for routine browsing. Or, as Ohlhausen put it, the bill “creates uncertainty for routine operational uses of information that are necessary to serve customers and operate a business.”
How broad should the private right of action be for individuals?
“The current proposal inserts several procedural hurdles that will not reduce litigation costs but will block injured individuals from having their day in court,” said David Brody, managing attorney of the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights Under Law. “The private right of action in the Act is weak and difficult to enforce.”
John Miller countered, “while it is true neither punitive nor statutory damages are permitted” under the bill’s private right of action, “the availability of attorney’s fees could encourage the filing of borderline meritorious cases by specialized attorneys charging exorbitant hourly rates.”
Should government purchases of Americans’ personal data be included in the bill?
One issue that was not addressed on Tuesday is the frequent sale of Americans’ personal data to the government, a problem addressed by the proposed Fourth Amendment Is Not For Sale Act. Any privacy solution should look beyond the private uses of data by businesses to those of law enforcement and intelligence agencies. After all, only the government can use your information to bang down your door at dawn and arrest you.
There were further debates about how the bill might impact the ability of companies to handle cybersecurity threats, and whether small businesses would get tagged with onerous provisions aimed at tech giants. The legislative process in the House and Senate will have to untangle these and many other knotty issues to make this law workable. Yet the hearing room echoed with statements of determination by leaders in both parties to make a national privacy law a reality.
In the early Cold War era, anti-Communist crusaders were often accused of being hysterical, seeing Communists under their beds. Now a report from Christopher Balding and Joe Wu, researchers at New Kite Data Labs, sees the Chinese Communist Party inside coffee makers in American homes. And they are not crazy.
This alarming report is a consequence of the Internet of Things (IoT), in which ordinary appliances are given smart applications to interact with each other, as well as to report on performance and consumer behavior. According to Balding, interviewed by The Washington Times, Chinese-made coffee makers gather and report information about customers’ names, their locations, usage patterns, and more. In a hotel, a coffee maker could report payment types and routing information back to China.
Similar issues have been found with vacuum cleaners that respond to voice commands, baby monitors and video doorbells.
The Chinese government has famously built a “panopticon,” a ubiquitous surveillance network that seamlessly integrates facial recognition, social media activities, payments, and other data to potentially track every citizen of that country. IoT, partly by design but mostly by technological evolution, is rapidly scaling the capacity to bring universal surveillance into the homes of the world.