Suspect: “We Have to Follow the Law. Why Don’t They?”

Facial recognition software is useful but fallible. It often leads to wrongful arrests, especially given the software’s tendency to produce false positives for people of color.
We reported in 2023 on the case of Randall Reid, a Black man in Georgia who was arrested and held for a week for allegedly stealing $10,000 worth of Chanel and Louis Vuitton handbags in Louisiana. Reid was driving to a Thanksgiving dinner near Atlanta with his mother when he was arrested, three states and seven hours from the scene of the crime, in a state he had never set foot in. Then there is the case of Porcha Woodruff, a 32-year-old Black woman arrested in her driveway for a recent carjacking and robbery. Eight months pregnant at the time, she was far from the profile of the carjacker. She suffered great emotional distress, along with spasms and contractions, while in jail.

Some jurisdictions have responded to the spotty nature of facial recognition by requiring every purported “match” to be evaluated by a large team to reduce human bias. Others, from Boston to Austin and San Francisco, have banned the use of the technology altogether. The Washington Post’s Douglas MacMillan reports that officers of the Austin Police Department have developed a neat workaround for the ban: since Austin enacted it, Austin police have asked law enforcement in the nearby town of Leander to conduct face searches for them at least 13 times. Tyrell Johnson, a 20-year-old man who became a suspect in a robbery case through one such workaround, told MacMillan, “We have to follow the law. Why don’t they?” Other agencies are accused of circumventing bans by posting “be on the lookout” flyers in neighboring jurisdictions, which critics say are designed to be picked up and run through facial recognition systems by other police departments or law enforcement agencies.

MacMillan’s interviews with defense lawyers, prosecutors, and judges revealed the core problem with the use of this technology: facial recognition is employed to generate leads, not evidence.
They told him that, in most jurisdictions, prosecutors are not required to inform criminal defendants that they were identified using an algorithm. This highlights the larger problem with high-tech surveillance in all its forms: improperly accessed data, reviewed without a warrant, can allow investigators to work backwards to incriminate a suspect. Many criminal defendants never discover the original “evidence” that led to their prosecution, and thus can never challenge the basis for their case. This “backdoor search loophole” is the greater risk, whether one is dealing with databases of mass internet communications or with facial recognition. Thanks to this loophole, Americans can be accused of crimes yet left in the dark about how the cases against them began.