A new investigation from the Washington Post reveals that police routinely use facial recognition software to identify and arrest suspects, yet fail to disclose that use to the defendants themselves. This, despite the fact that the still-new technology has led to numerous documented false arrests.

The Post contacted 100 police departments across 15 states, although only 30 of them provided records from cases in which facial recognition was used. In fact, the investigation found that police often overtly masked their use of the software, recording in reports, for example, that suspects were identified “through investigative means.” There’s reason for that: facial recognition software is notoriously fallible. The article references at least seven cases of wrongful arrests stemming from use of the technology. Six of those seven involved Black Americans. The Post reports, “[f]ederal testing of top facial recognition software has found the programs are more likely to misidentify people of color, women and the elderly because their faces tend to appear less frequently in data used to train the algorithm….”

Last year, we wrote about the case of Randall Reid, a Black man from Georgia arrested for allegedly stealing handbags in Louisiana. The only problem: Reid had never even been to Louisiana. He was a victim of misidentification, and that was all the police needed to hold him in jail for close to a week.

Generally speaking, in the criminal context facial recognition software works by comparing surveillance footage with publicly available photos online. Companies like Clearview AI contract with law enforcement agencies, providing access to billions of photos scraped from Facebook, X, Instagram and other social media platforms. And despite access to so much online material, the results are often faulty, which is all the more reason that such evidence needs to be disclosed in an investigative context.
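The comparison step described above can be thought of as a similarity search: each face (from surveillance footage or a scraped photo) is reduced to a numeric vector, an "embedding," and the system ranks database photos by how close their vectors are to the probe image's vector. Below is a minimal, purely illustrative sketch of that idea. The toy 4-dimensional vectors, the names, and the 0.6 threshold are all assumptions for demonstration; real systems use neural-network embeddings with hundreds of dimensions, and this is not any vendor's actual pipeline.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_matches(probe, gallery, threshold=0.6):
    """Rank gallery faces by similarity to the probe embedding.

    Everything above the threshold is returned as a *candidate*,
    not a confirmed identity; a high score can still be a false match.
    """
    scored = [(name, cosine_similarity(probe, emb))
              for name, emb in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [(name, score) for name, score in scored if score >= threshold]

# Hypothetical toy embeddings standing in for a scraped-photo database.
gallery = {
    "person_a": [0.9, 0.1, 0.3, 0.2],
    "person_b": [0.1, 0.8, 0.2, 0.7],
}
# Embedding extracted from a surveillance frame (also hypothetical).
probe = [0.85, 0.15, 0.25, 0.3]

print(top_matches(probe, gallery))
```

The key point the sketch makes concrete: the output is a ranked list of lookalike candidates above an arbitrary cutoff, not an identification. When the threshold is loose or the database is full of people who merely resemble the suspect, the "top match" can be the wrong person entirely.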
Per the Post, “Clearview search results produced as evidence in one Cuyahoga County, Ohio, assault case included a photo of basketball legend Michael Jordan and a cartoon of a Black man.” Spoiler: neither image depicted the culprit.

The real culprit here is a legal system decidedly behind the times in reacting and responding to technological shifts. Some jurisdictions are catching up: in 2022, the ACLU won a legal victory against Clearview requiring the company to adhere to the Illinois Biometric Information Privacy Act (BIPA). That law requires companies that collect, capture, or obtain a biometric identifier of an Illinois resident to first notify that person and obtain his or her written consent.

But we have a long way to go in establishing vigorous protections against the misuse and masking of “iffy” new technologies like facial recognition. Due process requires that we do better.